Posted by tartoran 3 hours ago
I would completely agree that if you are already 1x delusional then AI will supercharge that into being 10x delusional real fast.
Granted you could argue access to the internet was already something like a 5x multiplier from baseline anyway with the prevalence of echo chamber communities. But now you can just create your own community with chatbots.
Can one typically determine a user’s timezone in JavaScript without getting permissions? I feel like probably yes?
(I’m not imagining something that would strictly cut the user off, just something that would end messages with a suggestion to go to bed, and saying that it will be there in the morning.)
I think you're totally right that that's a risk for some people, I just hadn't considered it because I view them in exactly the opposite light.
Looking at my history recently, Claude's most recent response is literally just "Exactly the right move honestly — that's the whole point."
So someone who likes to talk about themselves will get a conversation all about them. Someone talking about an ex is gonna get a whole pile of discussion about their ex.
... and someone depressed or suicidal, who keeps telling the system their own self-opinion, is going to end up with a conversation that reflects that self-opinion back on them as if it's coming from another mind in a conversation. Which is the opposite of what you want to provide for therapy for those conditions.
I know about the Milgram obedience-to-authority experiments, but a computer is not really an authority figure.
I’m not a heavy user of LLMs and I’m not sure how delusional I could be, but I wonder if a lot of these things could be prevented if people could only send like one or two follow up messages per conversation, and if the LLM’s memory was turned off. But then I suppose this would be really bad for the AI companies’ metrics. Not sure how it would impact healthy users’ productivity either. Any thoughts?
But they should probably come with a big warning label that says something to the effect of "IF YOU TALK ABOUT YOURSELF, THE NATURE OF THE MACHINE IS THAT IT WILL COME TO AGREE WITH WHAT YOU SAY."
Father doesn't imply that. What sort of implication is that?
Father implies only that the person who had the delusional spiral was his son; that son could be an adult. The title is absolutely correct.
Biologically and relationally, he in fact remains his father's child.
I also took no such implication from the title. It might be your interpretation; it was not mine.
It seems like the law firm that's filing this bills itself as copyright trolls for AI, https://edelson.com/inside-the-firm/artificial-intelligence/
I am deeply saddened by the passing of Jonathan Gavalas and offer condolences to his family.
We can't safeguard things to the point of uselessness. I'm not even sure there is a safeguard you can put in place for a situation like this other than recommending the crisis line (which Gemini did), and then terminating the conversation (which it did not do). But, in critical mental health situations, sometimes just terminating the conversation can also have negative effects.
Maybe LLMs need sort of a surgeon general's warning "Do not use if you have mental health conditions or are suicidal"?
This is exactly the safeguard.
Terminating the conversation is the only way to go. These things don't have a world model, they don't know what they are doing, and there's no way to correctly assess the situation at the model level. No more conversation, that's the only way, even if a motivated adversary might find jailbreaks to circumvent it.
I got into quite a lot of rabbit holes with AI. Most of them were "productive", some of them were not.
80% of the time it will talk you out of delusions or obviously dumb ideas. 20% of the time it will reinforce them.
It's like being a wood worker whose only projects are workshop benches and organizational cabinets for the tools you use to build workshop cabinets and benches.
Like, on some level it's a fine hobby, but at some point you want to remember what you actually wanted to build and work on that.