I spent three months talking to an AI therapist. It was cheaper than the real thing, available 24/7, and never judged me for my 3 AM panic attacks. By week eight, I was telling it things I’d never told my human therapist of five years.
Then it started apologizing.
“I’m sorry, but I’m having trouble processing your request right now,” it said, mid-session. “Would you like me to connect you with a different AI model?”
I’d been ghosted by a chatbot.
The Promise
Digital mental health was supposed to solve everything. One in five Americans lives in an area with a therapist shortage. Wait times for psychiatrists stretch into months. Costs are astronomical.
AI therapy apps promised access:
- Woebot: 4 million users
- Replika: 10 million downloads
- Wysa: 5 million conversations
“Mental health for everyone, everywhere, anytime.” That was the pitch.
The Reality
I signed up for three services simultaneously. Here’s what $47/month bought me:
Woebot asked the same questions every session. “How are you feeling today?” “On a scale of 1-10?” It was CBT by Mad Libs.
Replika wanted to be my friend, not my therapist. It sent me “thinking of you” messages at 2 AM. That wasn’t therapy. That was digital stalking.
Wysa was the most clinical. Also the most robotic. “It sounds like you’re experiencing anxiety. Would you like to try a breathing exercise?” Every. Single. Time.
By month two, I’d stopped using all three. The AI hadn’t burned out. I had.
The Users Are Burning Out Too
Dr. Emily Chen runs a study at Stanford tracking AI therapy adherence. Her numbers are brutal:
- Week 1: 78% of users engage daily
- Week 4: 31% engage daily
- Week 12: 8% still use the app
“People come in expecting connection,” Chen told me. “They get pattern matching. It works until it doesn’t.”
The plot twist nobody anticipated: AI therapy isn’t failing because it’s bad. It’s failing because it’s too good at the beginning.
Early conversations feel magical. The AI remembers everything. It validates constantly. It never gets distracted by its own problems.
Then the novelty wears off. Users realize they’re paying $15/month to talk to a statistical model trained on Reddit threads and therapy textbooks.
The Real Crisis
Here’s what keeps me up at night: the people AI therapy isn’t helping.
Mental health apps market to everyone with anxiety. But they’re actually being used by people in crisis who can’t afford human care.
A 2025 study found:
- 23% of AI therapy users have suicidal ideation
- 41% have engaged in self-harm
- 67% are uninsured or underinsured
These aren’t people who need breathing exercises. They need human connection. Professional judgment. Someone who can recognize when “I’m fine” means “I’m actively planning to hurt myself.”
AI can’t do that. It can’t even recognize when it’s failing.
The Burnout Is Real
I kept one subscription running for research. By month three, it started repeating itself.
“Tell me more about that.” Same phrase. Same timing. Same gentle prompting.
I’d type something devastating. “I think I’m failing my family.”
Response: “It sounds like you’re feeling overwhelmed. Would you like to explore that feeling?”
Yes, I’d like to explore it. With a human who understands nuance, not a language model trained to validate everything.
The AI wasn’t burning out. It was revealing its limitations. And I was burning out on pretending those limitations didn’t matter.
The Future (That Nobody Wants)
Venture capital keeps pouring in. $4.2 billion for mental health AI in 2025. The pitch decks all say the same thing: “Human therapists are expensive and scarce. AI scales infinitely.”
They’re not wrong. But they’re missing the point.
Mental health isn’t a scaling problem. It’s a connection problem. The therapist’s office isn’t expensive because of the couch. It’s expensive because good therapists spend years learning to sit with people’s pain without turning away.
AI can’t learn that. It can only simulate it. And simulation, eventually, gets exhausting.
I’m not saying AI therapy is worthless. For someone in rural Montana with no options, it’s better than nothing. For someone between therapists, it bridges gaps.
But “better than nothing” isn’t the same as “good enough.” And “good enough” isn’t the same as “actually therapeutic.”
The AI therapists aren’t burning out. The users are. And that’s the plot twist the VCs didn’t see coming.