Posted by bookofjoe 15 hours ago

Google AI Overviews cite YouTube more than any medical site for health queries(www.theguardian.com)
368 points | 197 comments
mikkupikku 14 hours ago|
Don't all real/respectable medical websites basically just say "Go talk to a real doctor, dummy."?

...and then there's WebMD, "oh you've had a cough since yesterday? It's probably terminal lung cancer."

gowld 13 hours ago|
WebMD is a real doctor, I guess. It's got an MD right in the name!
laborcontract 14 hours ago||
Google AI overviews are often bad, yes, but why is YouTube as a source necessarily a bad thing? Are these researchers doctors? A close relative is a practicing surgeon and a professor in his field. He watches YouTube videos of surgeries practically every day. Doctors in every field understand that YT is a great way to share their work and discuss it with others.

Before we get too worked up about the results, just look at the source. It's a SERP ranking aggregator (not linking to them to give them free marketing) that's analyzing only the domains, not the credibility of the content itself.

This report is a nothingburger.

ceejayoz 14 hours ago||
> A close relative is a practicing surgeon and a professor in his field. He watches youtube videos of surgeries practically every day.

A professor in the field can probably go "ok this video is bullshit" a couple minutes in if it's wrong. They can identify a bad surgeon, a dangerous technique, or an edge case that may not be covered.

You and I cannot. Basically, it's the same problem the general public has with phishing, but with even more devastating potential consequences.

raincole 13 hours ago|||
The same can be said for average "medical sites" the Google search gives you anyway.
ceejayoz 13 hours ago|||
It's a lot easier for me to assess the Mayo Clinic's website being legitimate than an individual YouTuber's channel.
fc417fc802 13 hours ago|||
I don't think anyone is talking about "medical sites" but rather medical sites. Indeed "medical sites" are no better than unvetted youtube videos created by "experts".

That said, if (hypothetically) gemini were citing only videos posted by professional physicians or perhaps videos uploaded to the channel of a medical school that would be fine. The present situation is similar to an LLM generating lots of citations to vixra.

laborcontract 13 hours ago|||
Your comment doesn't address my point. The same criticism applies to any medium.
ceejayoz 13 hours ago||
The point is you can't say "an expert finds x useful in their field y" and expect it to always mean "any random idiot will find x useful in field y".
RobotToaster 8 hours ago||
Imagine going onto youtube and finding a video of yourself being operated on lol
citizenpaul 6 hours ago||
Unrelated to this, but I was able to get some very accurate health predictions for a cancer victim in my family using Gemini and lab test results. I would actually say that, other than one doctor, Gemini was more straightforward and honest about how and, more importantly, WHEN things would progress. Nearly to the day on every point over 6 months.

Pretty much every doctor would only say vague things like "everyone is different, all cases are different."

I did find this surprising considering I am critical of AI in general. However, I think it's less that the AI is good and more that doctors simply don't like giving hopeless information. An entirely different problem. Either way, the AI was incredibly useful to me for a literal life/death subject I have almost no knowledge about.

qq66 8 hours ago||
I've seen so many outright falsehoods in Google AI overviews that I've stopped reading them. They're not willing to incur either the cost or the latency it would take to make them useful.
josefritzishere 12 hours ago||
Google AI cannot be trusted for medical advice. It has killed before and it will kill again.
PlatoIsADisease 11 hours ago|
Maybe Google, but GPT3 diagnosed a patient who had been misdiagnosed by 6 doctors over 2 years. To be fair, 1 out of those 6 doctors should have figured it out. The other 5 were out of their element. Doctor number 7 was married to me and got the top 10 most likely diagnoses from GPT3.
Pxtl 13 hours ago||
What's surprising is how poor Google Search's access to YouTube video transcripts is. Like, I'll Google search for statements that I know I heard on YouTube, but they just don't appear as results even though the video has automated transcription on it.

I'd assumed they simply didn't feed the transcripts properly to Google Search... but they did for Gemini? Maybe the transcripts are just heavily downranked in Search or something.

RobotToaster 8 hours ago||
Probably because the majority of medical sites are paywalled.
bjourne 12 hours ago||
The basic problem with Google's AI is that it never says "you can't" or "I don't know". So many times it comes up with plausible-sounding but incorrect BS in response to "how to" questions. E.g., "in a Facebook group, how do you whitelist posts from certain users?" The answer is "you can't", but the AI won't tell you that.
ChrisArchitect 13 hours ago||
Related:

Google AI Overviews put people at risk of harm with misleading health advice

https://news.ycombinator.com/item?id=46471527

heliumtera 13 hours ago|
Ohhh, I would make one wild guess: in the upcoming LLM world, the highest bidder will have a higher chance of appearing as a citation or suggestion! Welcome to gas town, so much productivity ahead!! For you and the high-bidding players interested in taking advantage of you.
gdulli 11 hours ago|
Exactly. This is the holy grail of advertising. Seamless and undisclosed. That, and replacing vast amounts of labor, are some of the only uses that justify the level of investment in LLM AI.