
Posted by cainxinth 9/3/2025

MIT Study Finds AI Use Reprograms the Brain, Leading to Cognitive Decline (publichealthpolicyjournal.com)
615 points | 566 comments | page 2
gandalfgeek 9/3/2025|
The coverage of this has been so bad that the authors have had to put up an FAQ[1] on their website, where the first question is the following:

Is it safe to say that LLMs are, in essence, making us "dumber"? No! Please do not use words like "stupid", "dumb", "brain rot", "harm", "damage", "brain damage", "passivity", "trimming", "collapse" and so on. It does a huge disservice to this work, as we did not use this vocabulary in the paper, especially if you are a journalist reporting on it.

[1]: https://www.media.mit.edu/projects/your-brain-on-chatgpt/ove...

marcofloriano 9/3/2025|
It's apparently so "safe" to say that a study this small can clearly establish the fact. But of course, as it is a very sensitive topic, 'the language' and 'the narrative' must be carefully chosen, or you can be 'banned'. Of course we won't see new studies like that anytime soon.
misswaterfairy 9/3/2025||
I can't say I'm surprised by this. The brain is, figuratively speaking, a muscle. Learning through successes and (especially) failures is hard work, though not without benefit: the trials and exercises your brain works through exercise that 'muscle', making it stronger.

Using LLMs to replace the effort we would've otherwise expended to complete a task short-circuits that exercising function, and I would suggest it is potentially addictive because it's a near-instant reward for little work.

It would be interesting to see a longitudinal study on the effect of LLMs, collective attention spans, and academic scores where testing is conducted with pen and paper.

onlyrealcuzzo 9/3/2025|
Sounds bullish for AI.

It's like a drug. You start using it, and think you have super powers, and then you've forgotten how to think, and you need AI just to maybe be as smart as you were before.

Every company will need enterprise AI solutions just to maybe get the same amount of productivity as they got before without it.

jugg1es 9/3/2025|||
This is sad but true.
kjkjadksj 9/3/2025||||
And the pipeline is cooked now, with some universities allowing AI use. It's like what CliffsNotes did to reading comprehension, but across all aspects of life and all domains. What a coming tsunami.
infecto 9/3/2025||
Everyone is different. I don't have a good grasp on the distribution of HN readers these days, but I know for myself, as a heavy user of LLMs, I am not sold on this. I am asking more questions than ever. I use it for proofreading and editing. But I can see the risk as a software engineer. I really appreciate tools like Cursor: I give it bite-size chunks and review. With tools like Claude Code, though, it becomes a black box and I no longer feel at the helm of the ship. I could see that if you outsourced all thinking to an LLM there could be consequences. That said, I am not sold on the paper and suspect it's mostly hyperbole.
Taek 9/3/2025||
Cognitive decline is a broad term, and a research paper could claim "decline" if even a single cognitive metric loses strength.

When writing was invented, societies started depending on long form memorization less, which is a cognitive "decline". When calculators were invented, societies started depending on mental math less, which is a cognitive "decline".

I'm sure LLMs are doing the same thing. People aren't getting dumber, they are just outsourcing tasks more, so that their brains spend more time on the tasks that can't be outsourced.

yuehhangalt 9/3/2025|||
My concern is more attributed to the tasks that can't or won't be outsourced.

People who maintain a high level of curiosity or have a drive to create things will most assuredly benefit from using AI to outsource work that doesn't support those drives. It has the potential to free up more time for creative endeavors or those that require deeper thinking. Few would argue the benefit there.

Unfortunately, anti-intellectualism is rampant, media literacy is in decline, and a lot of people are content to consume content and not think unless they absolutely have to. Dopamine is a helluva drug.

If LLMs reduce the cognitive effort at work, and the people go home to doom scroll on social media or veg out in front of their streaming media of choice, it seems that we're heading down the path of creating a society of mindless automatons. Idiocracy is cited so often today that I hate to do so myself, but it seems increasingly prescient.

Edit: I also don't think that AI will enable a greater work-life harmony. The pandemic showed that a large number of jobs could effectively be done remotely. However, after the pandemic, there was significant "Return to Office" movement that almost seemed like retribution for believing we could achieve a better balance. Corporations won't pass on the time savings to their employees and enable things like 4-day work weeks. They'll simply expect more productivity from the employees they have.

IAmBroom 9/3/2025||||
Absolutely true.

Also, domesticated dogs show indications of lower intelligence and memory than wolves. They don't have to plan complex strategies to find and kill food, anymore.

Taek 9/3/2025||
The difference between us and dogs is that we DO still need to make a salary. Dogs live in the lap of luxury where their needs are guaranteed to be handled.

But humans need jobs, and jobs need to capture value from society. So we do actually still have to stay sharp, whatever form "sharp" takes.

pessimizer 9/3/2025||
You and dogs have the same job, which is to please the boss. The boss then takes care of you like a child, either with a paycheck (with which you can pay servants to supply your earthly needs), or directly if you're a dog and lack both thumbs and pockets to hold a wallet or a phone. A domestic dog would die left alone in a forest, about two or three weeks after you would.

If you're an entrepreneur, your job is to please the customer and to squeeze your vendors and employees. You still take little to no part in directly taking care of yourself, except as a hobby. Unless you want to be congratulated for wiping your own ass or lifting a fork to your mouth.

infecto 9/3/2025|||
This is super interesting and I had not thought about it like that!
ceejayoz 9/3/2025|||
> I am asking more questions than ever.

Wouldn't that be the expected result here? Less knowledge, more questions?

infecto 9/3/2025|||
That’s one interpretation, but I think there’s a distinction between “asking more questions because I’ve forgotten things” and “asking more questions because I’m exploring further.”

When I use LLMs, it’s less about patching holes in my memory and more about taking an idea a few steps further than I otherwise might. For me it’s expanding the surface area of inquiry, not shrinking it. If the study’s thesis were true in my case, I’d expect to be less curious, not more.

Now that said I also have a healthy dose of skepticism for all output but I find for the general case I can at least explore my thoughts further than what I may have done in the past.

rwnspace 9/3/2025|||
In my personal experience new knowledge tends to beget questions.
xnorswap 9/3/2025||
> I am asking more questions than ever.

I don't have a dog in this fight, but "asking more questions" could be evidence of cognitive decline if you're having to ask more questions than ever!

It's easy to twist evidence to fit biases, which is why I'd hold judgement until better evidence comes through.

IAmBroom 9/3/2025|||
Well, that's certainly a take.

But if I'm teaching a class, and one student keeps asking questions that they feel the material raised, I don't tend to think "brain damage". I think "engaged and interested student".

charlie-83 9/3/2025||||
Not OP, but there's a difference between needing to ask more questions and asking more questions because it's easier now.

Personally, I find myself often asking AI about things I wouldn't have been bothered to find out about before.

For example, I've always seen these funny little grates on the outside of houses near me and wondered what they are. Googling "little grates outside houses" doesn't help at all. Give AI a vague-ish description and it instantly tells you they are old boot scrapers.

infecto 9/3/2025||
Haha you nailed it. Walking around and experiencing the world I can now ask a vague question and usually find an answer.

Maybe there is a movie in the back of my head or a song. Typical search engine queries would never find it. I can give super vague references to a LLM and with search enabled get an answer that’s correct often enough.

danenania 9/3/2025||
The ability to keep following the thread and interrogating the answers is also very valuable. You never have to accept an answer you only half understand.
infecto 9/3/2025|||
Fair point, though I think there’s a difference between “questions out of confusion” and “questions out of curiosity.”

If I’m constantly asking “what does this mean again?” that would signal decline. But if I’m asking “what if I combine this with X?” or “what are the tradeoffs of Y?” that feels like the opposite: more engagement, not less.

That's why I'm skeptical of blanket claims from one study; the lived experience doesn't map so cleanly.

jennyholzer 9/3/2025||
> In post-task interviews:

> 83.3% of LLM users were unable to quote even one sentence from the essay they had just written.

> In contrast, 88.9% of Search and Brain-only users could quote accurately.

> 0% of LLM users could produce a correct quote, while most Brain-only and Search users could.

Reminds me of my coworkers who have literally no idea what ChatGPT put into their PR from last week.

aurareturn 9/3/2025|
Maybe we should question the value of essays in the ChatGPT world?

Could a person, armed with ChatGPT, come up with a better solution in a real world problem than without ChatGPT? Maybe that's what actually matters.

Ekaros 9/3/2025|||
Can they evaluate whether the idea they came up with is better if they don't remember how it was stated? Isn't the point of writing to set down thoughts in a communicable manner, and then possibly have them verified by others?

But how can they discuss any of the content if even the "writer" doesn't remember what they wrote?

kibwen 9/3/2025||||
The point of writing essays is not to produce an essay, it's to demonstrate that you understand something well enough to engage with it critically, in addition to being an exercise for critical thinking itself.
abirch 9/3/2025|||
College was transformed from the apprentice-style institution of the 1500s into the mass-produced thing of the early 2000s (where a professor can "teach" 500 students in a class).

I think we should return to the apprentice style of institution, where people try to create the best real-world solutions possible with LLMs, 3D printers, etc., and use recorded college courses the way our grandparents used books.

NiloCK 9/3/2025||
Every augmentation is also an amputation.

Calculators reduced our capabilities in mental and pencil-paper arithmetic. Graphing calculators later reduced our capacity to sketch curves, and in turn, our intuition in working directly with equations themselves. Power tools and electric mixers reduced our grip strength. Cheap long distance plans and electronic messaging reduced our collective abilities in long-form letter writing. The written word decimated the population of bards who could recite Homer from memory.

It's not that there aren't pitfalls and failure modes to watch out for, but the framing as a "general decline" is tired, moralizing, motivated, clickbait.

add-sub-mul-div 9/3/2025|
> Calculators reduced our capabilities in mental and pencil-paper arithmetic.

And now people make bad decisions in their daily life about money etc. Most people can't do the math in their head but they also aren't using their calculator at the grocery store to avoid being taken advantage of. The math doesn't get done.

The lesson isn't that we survived calculators, it's that they did dull us, and our general thinking and creativity are about to get likewise dulled.

colincooke 9/3/2025||
It is worth noting that this study was tbh pretty poorly performed from a psychology/neuroscience perspective and the neuro community was kind of roasting their results as uninterpretable.

Their trial design and interpretation of results are flawed (i.e., they make unfair comparisons of LLM users to non-LLM users), so they can't really make the kind of claims they are making.

This would not stand up to peer review in its current form.

I'm also saying this as someone who generally does believe these declines exist, but this is not the evidence it claims to be.

Shank 9/3/2025|
> It is worth noting that this study was tbh pretty poorly performed from a psychology/neuroscience perspective and the neuro community was kind of roasting their results as uninterpretable.

Do you have links or citations to people saying these claims?

colincooke 9/3/2025||
It was a bluesky thread I've since lost (lame I know). This article summarizes the issues well: https://residualinsights.com/all-hype-no-bite-your-brain-on-...

Comes down to:
- Self-selection bias
- Trial design
- Dubious interpretations of neural connectivity

ticulatedspline 9/3/2025||
Cognitive offload is nothing new, if you've been around for even a little while you've likely personally experienced it.

Just like a muscle will atrophy from disuse, skills and cognitive assets, once offloaded, will similarly atrophy. People don't memorize phone numbers, GPS gets you where you want to go, your IDE helps you along so seamlessly you could never code in a bare text editor, your TI-89 will do most of your math homework, and as a manager you direct people to do work and no longer do it yourself.

We of course never really lower our absolute cognitive load by much; we just shift it. Each of those tools has its own knowledge base that is needed to use it, but sometimes we lose general skills in favor of esoteric ones.

While I may now possess esoteric skills in operating my GPS (setting waypoints, saving locations, entering coordinates), if I use it a lot I find I need it to get back to the hotel from just a few miles away, even if I've driven the route multiple times. I'm offloading learning the route to the GPS. My father, on the other hand, struggles to use one, so when he's away he pays close attention to where he's going and remembers routes better.

Am I dumber than him? With respect to operating the device, certainly not. But if we both drove separately to a new location and you took the GPS from me once I got there, I'd certainly look a lot dumber getting lost trying to get back without my mental crutch. I didn't have to remember the route, so I didn't; I offloaded that to the machine. And some people offload a LOT — pretty sure nobody ever drove into a lake because a paper map told them to.

Modern AI is only interesting insofar as it subsumes tasks that until now we would consider fundamental. Reading, writing, basic comprehension. If you let it, AI will take over these things and spoon feed you all you want. Your cognitive abilities in those areas will atrophy and you will be less cognizant of task elements where you've offloaded mental workload to the AI.

And we'll absolutely see more of this: people who are a whiz at using AI, know every app, and get things done by reflex, but never learned (or completely forgot) how to do basic stuff, like read a paper, order a salad off a menu in person, or book a flight. It'll be both funny and sad when it happens.

kawfey 9/3/2025||
The "your brain on ChatGPT" framing gives the same feel as DARE's "your brain on drugs" campaign, and we now see how that went. It immediately loses any credibility for me.

It wasn't immediately clear what they actually had the subjects do. It seems like they wrote an essay, which... duh? I would bet the other cohorts' brain activity would be similar, if not identical, to an LLM user's if they were asked to have someone else write their essay for them.

causal 9/3/2025|
Just look at this comment section - one flawed headline is all it takes to get hundreds of people writing essays about how they totally understand how the brain works and knew it all along.
owisd 9/3/2025||
I was reading The Shallows recently, about how the Internet affects your brain. It's from 2009, so a bit out of date re: smartphones and very much so re: LLMs, but it makes the case that the Internet, and hypertext generally, is a tool that's 'bad' for you cognitively because it puts additional load on your working memory while offloading tasks from the parts of your brain that are useful for higher-level tasks and abstract thinking, so those more valuable skills atrophy. It contrasts this with a calculator, which makes you "smarter" because it does the opposite: it frees up your working memory so you have more capacity for high-level thought. Found it striking, because LLMs and smartphones seem most likely to fit in the hypertext category, not the calculator category, yet a calculator is exactly what Sam Altman likes to use as an analogy for LLMs.
variadix 9/3/2025|
Seems obvious. If you don’t use it you lose it. Same thing happened with mental arithmetic, remembering phone numbers, etc. Letting an LLM do your thinking will make you worse at thinking.