Posted by cainxinth 9/3/2025
Some of the points, like LLM users not remembering what they wrote and feeling disconnected from it, are kind of, well, duh. Obviously that applies to anything written by someone or something else. If that's the level of argument, I very much doubt it supports the "LLMs lead to cognitive decline" hypothesis.
I mean, you won't learn as much having an LLM write an essay as you would writing it yourself, but you can use LLMs and still write essays or whatever. I doubt LLMs are any worse for your head than daytime TV or the like.
How is that not utter garbage? You're comparing text that is barely more than a forum comment, and noticing that people who spend the short time thinking and writing are engaged in a different activity from people who spend the time using research tools, and a different activity again from people who spend the time asking an AI (and waiting for it) to generate content.
And it is something we need to talk about loudly, but I guess it wouldn't crank up the follower counts or valuations of AI grifters.
This makes complete sense, though. We're simply trying to automate the human thinking process, just as we use technology to automate or hand off everything else.
Like everything else in our lives, cognition is "use it or lose it". Outsourcing your decision making and critical thinking to a fancy autocomplete with sycophantic tendencies and no capacity for reasoning sure is fun, but as the study found, it has its downsides.
Over the last three years or so, I have seen more and more posts where the position just doesn't make sense. I mean, ten years ago, there were posts on HN that I disagreed with but upvoted anyway, because they made me think. That has become much rarer. An increasing number of posts now are just... weird (I don't know a better word for it). Not thoughtful, not interesting (even if wrong), just weird.
I can't prove that any of them are AI-generated. But I suspect that at least some of them are.