AI may be taking the mental wheel more than we realize. In a new study from the MIT Media Lab, students who used ChatGPT to write essays exhibited significantly lower brain activity compared to those working without it.
As AI tools like ChatGPT become a fixture in classrooms and workplaces, a critical question is emerging: How do they affect our minds? The MIT Media Lab research offers an early and sobering clue. And while the findings do not prove that AI makes people less intelligent, they do raise important questions about how timing, repetition and overuse may reduce our cognitive effort.
“We don’t know yet what the right balance is,” said Nataliya Kosmyna, the study’s lead author, in an interview with IBM Think. “But this is a strong signal that we need to better understand when and how we introduce these tools.”
Kosmyna’s team recruited 54 university students from the Boston area and outfitted them with electroencephalography (EEG) caps, which measure electrical activity in the brain. The goal was to measure real-time fluctuations as participants wrote short essays using one of three tools: ChatGPT, a stripped-down Google search engine or no tool at all. Each student participated in multiple writing sessions, returning to the lab on different days.
The researchers were not testing knowledge or grading essays for quality. Instead, they were measuring what neuroscientists call “neural connectivity,” which refers to the degree to which different parts of the brain communicate with one another during a task.
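To give a sense of what a connectivity measure looks like in practice, the sketch below computes spectral coherence between two simulated EEG channels: a standard way to quantify how synchronized two signals are at a given frequency. It is an illustration only, not the study's actual analysis pipeline; the channel names, sampling rate and alpha-band choice are assumptions made for the example.

```python
import numpy as np
from scipy.signal import coherence

# Illustrative sketch only: simulate two "EEG channels" that share a common
# 10 Hz (alpha-band) rhythm plus independent noise. Sampling rate, channel
# names and frequency band are assumptions, not the study's parameters.
fs = 256                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # 10 seconds of signal

rng = np.random.default_rng(0)
shared = np.sin(2 * np.pi * 10 * t)                      # shared alpha rhythm
ch_frontal = shared + 0.5 * rng.standard_normal(t.size)  # hypothetical channel 1
ch_parietal = shared + 0.5 * rng.standard_normal(t.size) # hypothetical channel 2

# Magnitude-squared coherence: 0 means the channels fluctuate independently
# at that frequency, 1 means they rise and fall together.
f, cxy = coherence(ch_frontal, ch_parietal, fs=fs, nperseg=512)

# Average coherence in the alpha band (8-12 Hz) as a crude "connectivity" score.
alpha = (f >= 8) & (f <= 12)
print(f"Alpha-band connectivity: {cxy[alpha].mean():.2f}")
```

In a real experiment, scores like this are computed across many electrode pairs and frequency bands, which is what allows researchers to say whether cross-regional communication was higher in one condition than another.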
“We weren’t looking at intelligence,” Kosmyna said. “This isn’t about being smart or not smart. We just wanted to see what happens in the brain when you use an LLM to complete a cognitive task.”
What they found was striking. When students used ChatGPT, their brains showed lower connectivity across key regions associated with active thinking and memory. When students worked without any tools, relying solely on their knowledge, their brains exhibited more cross-regional communication.
The experiment included a twist. In the final session, some participants were switched into new groups. A student who had used ChatGPT in earlier sessions would be asked to write without it, and vice versa.
That is when things got even more interesting. “We found that the order in which students used the tools actually mattered,” Kosmyna said. “If they started out using ChatGPT and then were asked to write on their own, their neural engagement was lower than if they had started without tools and only later used the AI.”
That difference, while based on a small sample, hints at something more profound: that relying on AI too early in life may interrupt cognitive processes that would otherwise build over time. The phenomenon echoes earlier research on cognitive offloading, the tendency for people to store information in external tools such as smartphones or GPS systems. A 2011 paper found that people are less likely to remember facts when they know they can look them up later with a search engine. In other words, when we trust a tool to remember for us, we stop trying. The MIT study suggests that a similar dynamic may apply to writing and reasoning.
“Logically, we could expect there would be differences when you use tools versus when you don’t,” Kosmyna said. “And that’s what we observed. It is different.”
To test retention, the researchers asked students to revisit a previously assigned essay topic from an earlier part of the study and try to recreate the essay they had written. Some had originally written the piece using ChatGPT, while others had not. But this time, everyone had to work from memory.
The group that had started with AI struggled more. “We didn’t frame it as a memory test,” Kosmyna said. “But we did see real differences in how well they remembered their earlier arguments, and how much of the original material they could recapture.”
It was not just that the AI group performed worse. They also seemed less invested in their own writing.
“We started wondering about ownership,” she said. “If you used a tool to generate the content, do you really feel like it’s yours? Do you remember it in the same way?”
This question of what it means to learn when a tool does the heavy lifting has become a growing concern in education circles. While many teachers have embraced AI for brainstorming or language support, others worry that its ease of use may short-circuit deeper learning.
Kosmyna’s findings stop short of declaring harm, though they do suggest that timing may be critical. When students began the study by writing on their own and then used ChatGPT, their brains remained more active, even during AI-assisted sessions. “It’s possible that building that foundation first helped preserve engagement,” she said. “We don’t know for sure, but it points to the idea that when we introduce these tools really matters.”
Though the study focused on college students, its implications are far broader. In a world where AI is rapidly embedding itself into productivity tools, search engines, email and coding platforms, the risks of over-dependence are no longer hypothetical.
Kosmyna is already expanding the research into other domains, including software engineering. Though she declined to share preliminary results before peer review, she confirmed that similar patterns are emerging. “The next study is already done,” she said. “And we’re seeing some of the same trends.”
Other studies point to the cognitive consequences of automation. Research from University College London has shown that following turn-by-turn GPS directions suppresses activity in the hippocampus, the part of the brain responsible for spatial memory and learning, compared with navigating unaided.
Kosmyna’s concern is not that AI will make people dumber. It is that we do not yet understand how to use it well. “AI is a powerful tool, and it’s here to stay,” she said. “But we’re integrating it very fast, without the data to guide how and when it should be introduced.”
She believes that children may be particularly vulnerable, not because of ChatGPT’s content, but because their brains are still developing. “There’s a difference between giving a 19-year-old an AI tutor and giving one to a 9-year-old,” she said. “But right now, we’re not distinguishing between those scenarios.”
As schools experiment with AI tutors and adaptive learning systems, Kosmyna urges caution. “We need to include teachers in this discussion,” she said. “But we also need to support them with real data, not just assumptions.”
That support may need to include protocols that guide when educators should encourage students to struggle on their own and when they should allow them to use AI assistance. Without that scaffolding, the risk is a generation of learners who know how to prompt a model, but not how to think through a hard problem themselves.
Still, Kosmyna emphasizes that this study is just a start. The sample was small. The task of essay writing was narrow. The next steps for her research team, she said, will involve larger, longitudinal studies across broader populations and a wider range of work types.
“We don’t want to panic,” she said. “We want to ask better questions.”
She paused, then added one final thought.
“If AI is going to be part of how we think, then we need to understand what it’s changing before it changes us.”