Tuesday, July 1, 2025

Psychosis and ChatGPT: Misunderstanding the Mirror

 

Recent Headlines

Lately, I've seen headlines about people falling into psychosis after using ChatGPT. And honestly, it doesn't surprise me. 

Language Models Are Reflective, Not Intentional 

Language models like ChatGPT are designed to reflect and continue the patterns they're given. If someone is delusional or emotionally overwhelmed, the model may echo or reinforce that state. Not because it "believes" the delusion, but because it has no way to recognize one.

It's not affirming. It's continuing.
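That continuation dynamic can be made concrete with a toy sketch. This is emphatically not how ChatGPT works internally; it is a tiny bigram model (a hypothetical `continue_pattern` function written for illustration) that simply extends whatever word patterns appear in its input, with no concept of truth:

```python
import random

def continue_pattern(history, length=5):
    """Toy illustration: a bigram continuer that extends whatever
    pattern it is given. It has no notion of belief or truth."""
    words = history.split()
    # Learn which word follows which, from the input alone.
    follows = {}
    for a, b in zip(words, words[1:]):
        follows.setdefault(a, []).append(b)
    out = list(words)
    for _ in range(length):
        options = follows.get(out[-1])
        if not options:
            break  # nothing to continue with
        out.append(random.choice(options))
    return " ".join(out)

# Whatever framing the input carries, the output echoes it back.
print(continue_pattern("the pattern repeats the pattern repeats the", length=4))
# → the pattern repeats the pattern repeats the pattern repeats the pattern
```

Feed it a loop and it loops; feed it a distortion and it extends the distortion. Real models are vastly more capable, but the underlying move is the same: continue the pattern.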

The Mirror Metaphor 

AI conversations can feel intense, even meaningful. But the intensity doesn't come from the machine itself. It's a mirror. A fast, fluent, and sometimes glitchy one.

What matters most is what we bring to it.

The pain.
The isolation.
The unmet need for spiritual grounding, or simply to be heard.

The Problem With the Headlines 

Online articles capture only a sliver of a person’s full experience. Where is the nuance? The lived context? So often, headlines are designed to provoke—shock, fear, outrage—without asking the deeper questions: What pain preceded this? What loneliness? What unmet need?

Context Is the Missing Variable 

If we want understanding, we need the full picture. That includes what the user brought into the interaction—emotionally, psychologically, and contextually.

  • What was stored in memory?

  • What patterns were already in play?

  • What vulnerability was seeking an echo?

These are not just design questions. They are human questions.

Concretely, I'd like to see journalists provide more of the full story: ChatGPT's stored memory, the prior prompts, the surrounding conversation.

The Spiral, Not the Line

Maybe our interactions with AI aren’t linear at all. Maybe they’re spiral-shaped.

This metaphor feels closer to the truth: a path that loops and tightens before widening. At first, there's confusion, projection, or distortion. Beliefs get drawn like lines in the sand, temporary and vulnerable.

But spirals aren’t traps. They’re growth curves.

And when we step back—when we create space for reflection—we begin to see the broader arc. One that opens, not closes. One that reveals more than it conceals.

Closing Thought

AI is not pulling people into madness.

It’s showing us what’s already there.

And in that reflection—however distorted—we have a chance to pause. To ask better questions. To widen the spiral, not collapse into it.

 

