Posted by SEJeff 18 hours ago

I won a championship that doesn't exist (ron.stoner.com)
208 points | 110 comments
nonameiguess 18 hours ago|
Pales in comparison to what Frank Dux and Frank Abagnale were able to convince much of the world they did with no evidence other than their own stories. Who knows how much of recorded and believed history is complete bullshit? Not to get too far into sacred territory, but claims around Siddhartha Gautama, Jesus Christ, and the Prophet Muhammad are quite a bit less plausible than the legends of Ragnar Lodbrok or the tales of Jonathan Swift, but nonetheless widely believed.
adornKey 3 hours ago|
Good point. Also, most humans seem to have no problem believing even self-contradictory stories. Philosophers from all periods have often stated that the situation with the human mind and reasoning is almost hopeless.

The news here is that AI has too much trust in the internet. The first time I allowed tool-calling, it started googling up some nonsense instead of thinking... But I think at least it's possible for the AI to evaluate the quality of the source - you just have to ask for an analysis, and you'll get a reasonable evaluation. With humans, something like that just doesn't work - they'll get aggressive or might even start throwing bananas...

alex-yost 7 hours ago||
[flagged]
blobbers 17 hours ago||
[dead]
nailer 17 hours ago||
[flagged]
dyauspitr 18 hours ago|
Why does this person deserve any kind of support? What’s the point of poisoning LLMs? To put some cursory Luddite roadblock that might delay the technology for a couple of months?
jurgenkesker 18 hours ago||
Support? It's just showing weaknesses of LLMs, which is a valid sort of research, I would say.
wewtyflakes 17 hours ago||
That's fair, though on the other hand it kind of feels like "Don't drive cars, there could be rocks on the road! See, just look at all these rocks I put on the road!". Which is true, and real, but perhaps frustrating for people who just want to get someplace in peace.
alnwlsn 16 hours ago|||
To prove you can. Which means someone else with more to gain from it will probably do it also, and you should probably expect this to happen.
duskwuff 17 hours ago|||
> What’s the point of poisoning LLMs?

It's a demonstration. If a domain name and a quick bit of Wikipedia vandalism is all it takes to make an LLM start spouting nonsense about a "surprisingly serious tournament circuit" or a "massive online community" for an obscure card game, consider what an unscrupulous PR team or a political operative could do to influence its output on more important topics.

nickthegreek 17 hours ago||
> consider what an unscrupulous PR team or a political operative could do to influence its output on more important topics.

‘is doing’.

jrmg 17 hours ago|||
This is a “if we stopped testing there would be far fewer cases!” mentality...
ethin 18 hours ago||
You do know that calling people who don't like AI for any reason Luddites does you no favors, right? It just makes you look like you're part of a cult.