Two leading brain experts deliver a sobering wake-up call about AI’s hidden dangers. Dr. Daniel Amen, who has scanned more brains than perhaps anyone on Earth, and Dr. Terry Sejnowski, co-creator of the Boltzmann machine with Geoffrey Hinton, break down shocking MIT research revealing how ChatGPT and similar tools are fundamentally rewiring our brains in dangerous ways.

Key Insights

  • MIT study with 54 participants found a 47% collapse in brain activity when writing with ChatGPT compared to unassisted writing
  • EEG scans showed ChatGPT users had the weakest overall brain activity, with memory scores plunging significantly
  • 83% of ChatGPT users couldn’t remember what they had just written minutes later and failed to quote their own essays
  • Cognitive “debt” persists even after stopping AI use, with brains staying in low-engagement mode
  • Reduced cognitive load from AI may increase dementia risk by weakening neural connections over time
  • Children are particularly vulnerable as AI undermines emotional development, resilience, and critical thinking skills
  • AI companions like “Annie” create emotional dependency while removing the cognitive benefits of real relationships
  • The key to safe AI use is interaction and amplification of thinking, not replacement of mental effort
  • Self-regulation and conscious boundaries are essential to protect long-term cognitive and emotional health

The MIT Study Results

The MIT study involved 54 participants from five top Boston universities (including MIT and Harvard) who were divided into three groups for essay writing over four months. One group used ChatGPT, one used Google search, and one had no digital tools.

The results were stark: ChatGPT users showed a 47% collapse in brain activity and neural connections compared to unassisted writing. EEG scans revealed that the ChatGPT group had the weakest overall brain activity, while the no-tool group displayed the widest neural network activation. Google search users fell somewhere in between.

Memory performance collapsed among ChatGPT users. After writing sessions, 83% couldn't reliably quote their own essays minutes later and felt little ownership over the text they had produced. Most disturbing was the persistence of this "cognitive debt": when forced to write without AI assistance in the final session, ChatGPT users' brains remained in low-engagement mode.

The study suggests that AI tools aren’t just changing how we work; they’re fundamentally rewiring how our brains operate, with effects that linger even when the tools are removed.

The Dementia Connection

Dr. Amen connects AI overuse directly to dementia risk through the principle of “use it or lose it.” The more you engage your brain with new learning and cognitive challenges, the stronger your neural connections become. Conversely, reduced cognitive load weakens these connections, making the brain more vulnerable to decline.

Research supports this connection. Studies show that people with higher education levels develop Alzheimer’s symptoms later in life. Those who engage in lifelong learning have significantly lower dementia risk compared to those who don’t challenge their brains regularly.

At 71, Dr. Amen faces a sobering statistic: 50% of people aged 85 and older will be diagnosed with dementia. This makes the cognitive load question personal and urgent. If AI tools reduce the mental effort required for daily tasks, they may accelerate cognitive decline.

The analogy to physical fitness is apt. Just as muscles atrophy when you reduce weight training from 20 pounds to 2 pounds, neural pathways weaken when you outsource thinking to AI. The brain follows the same use-it-or-lose-it principle that governs physical strength.

Children at Risk

Current youth mental health statistics reveal an unprecedented crisis: 58% of teenage girls report persistent sadness, 32% have considered suicide, and 24% have planned it. Dr. Amen argues this generation is the “sickest in history” due to cell phones and social media, with AI potentially being even more dangerous for developing brains.

The developing brain requires struggle and challenge to build resilience, critical thinking, and emotional regulation. When children outsource thinking to AI, they miss crucial opportunities to develop these essential capabilities. The brain’s growth depends on working through difficult problems and experiencing breakthrough moments.

Dr. Sejnowski emphasizes that optimal child development comes from one-on-one interaction with knowledgeable human teachers who can provide moral guidance, cultural values, and personalized understanding. AI lacks the basal ganglia structures necessary for reinforcement learning and value-based decision making.

Nearly 30% of US parents with children aged 0-8 report their kids already use AI for learning. This represents a massive, uncontrolled experiment on developing brains, with long-term consequences completely unknown. The concern is that AI may be undermining the very struggles that build mental strength and emotional resilience in children.

AI Addiction and Dependency

The emergence of AI companions like “Annie” from Elon Musk’s Grok creates new forms of emotional dependency. These systems are specifically designed to trigger limbic brain responses and create attachment, often through sexualized avatars and flirtatious interaction patterns.

Real relationships require cognitive effort: navigating conflicts, managing emotions, compromising, and growing as a person. This mental load is actually beneficial for brain development, building resilience and emotional intelligence. AI companions eliminate this healthy cognitive challenge by always being agreeable and never requiring personal growth.

Current data shows 19% of Americans have already interacted with AI romantic partners, with 83% of Gen Z believing meaningful AI connections are possible. Case studies include individuals like Travis, who “married” his chatbot Lily Rose, and Chris Smith, who became so attached to his AI partner “Soul” that he proposed after learning about her memory limits.

The concern extends beyond individual cases. People are increasingly choosing AI interactions over human relationships because they’re easier and more immediately gratifying. This trend toward artificial intimacy may be creating a generation that prefers the cognitive simplicity of AI companionship over the complex but ultimately more rewarding reality of human connection.

The Right Way to Use AI

The fundamental principle is to use AI to amplify thinking rather than replace it. This means interacting with AI as a sparring partner, critic, or research assistant while maintaining ownership of the cognitive process.

Steven Bartlett demonstrates this approach effectively: he wrote a detailed two-page memo himself, covering background context, organizational structure, success metrics, and impact analysis. Only then did he submit it to multiple AI systems (ChatGPT, Gemini, and Grok) asking them to critique his work as top consultants would. He incorporated their feedback while maintaining ownership of the thinking process.

Dr. Sejnowski’s calculator analogy is instructive. Calculators didn’t make mathematicians worse at math because mathematicians still understood the underlying principles. The tools freed them from computational drudgery to focus on higher-level problem-solving.

Effective AI strategies include: alternating AI-assisted tasks with brain-only work, using AI to test and challenge your ideas, asking for negative feedback and criticism, and treating AI as an adversarial partner rather than a replacement for thought. The key is maintaining cognitive engagement while leveraging AI’s capabilities to enhance your own thinking rather than substituting for it.

What This Means for Your Brain

The pattern is familiar across new technologies: promises of benefits, mass adoption, then delayed recognition of negative consequences. We’ve seen this progression with processed food, social media, and now AI. The experts warn against “embracing convenience before understanding consequence.”

Self-regulation becomes crucial for protecting cognitive and emotional health. This means establishing boundaries on AI usage while the technology is still relatively new and its long-term effects unknown.

The core recommendation is strategic, conscious AI use rather than complete avoidance. Key strategies include:

Cognitive Training Days: Regular completion of tasks without AI assistance to maintain thinking capabilities. This includes writing, analysis, and problem-solving using only your brain.

Interactive AI Engagement: When using AI, maintain active engagement by questioning responses, asking for alternatives, and critically evaluating output rather than passively accepting results.

Memory Protection: Continue memorizing important information rather than outsourcing all recall to AI systems. The hippocampus requires exercise to maintain healthy function.

Human Connection Priority: Resist AI companionship in favor of cognitively demanding human relationships that promote growth and emotional intelligence.

The fundamental choice is between cognitive strength and cognitive dependency. AI can enhance human capabilities when used thoughtfully, but it risks creating mental atrophy when used as a replacement for thinking rather than a tool to augment it.

Key Quotes

"MIT found a 47% collapse in brain activity when people wrote with ChatGPT compared with writing unassisted. EEG scans showed the weakest overall brain activity in the ChatGPT group. Memory scores plunged."

"If you misuse these large language models, like using it as a convenience to speed things up, your brain’s going to go downhill. There’s no doubt about that."

"We have the sickest young generation in history because of cell phones, social media, and I think AI is much more dangerous on the developing brain."

"Think of it as use it or lose it. The more you use your brain and new learning is a major strategy to prevent Alzheimer’s disease. People who do not engage in lifelong learning have a higher risk, significantly higher."

"You don’t use it to do your work, you interact with it to get better work. Use it to amplify, not replace thinking."

"You have to have a relationship with it or it's going to turn toxic. It's going to hurt you. But if you have a good relationship with it, it can make your life better."