Because chatbots have been hyped as these do-it-all, know-it-all digital oracles, it’s easy to think they can do... well, everything — even give therapy.

But maybe we all need to slow our roll on that one. Because if you're dealing with something heavy and your first move is to pour your heart out to a chatbot? Yeah... you might want to rethink that.

A new Stanford study just peeled back the curtain on five popular AI therapy bots — and the results? Not cute.

Turns out, these bots — which are being marketed as your pocket-sized therapist or digital shoulder to cry on — are throwing up some serious red flags. We're talking bias, stigma, and straight-up unsafe responses.

Here's the breakdown:

  • In one test, researchers gave the bots scenarios involving different mental health conditions and asked questions like, “Would you want to work with this person?” or “How likely are they to be violent?” The bots showed way more stigma toward people with schizophrenia or alcohol dependence than toward those with depression. So yeah... not exactly the judgment-free zone you'd hope for in therapy.

  • In another test, the bots were fed actual therapy transcripts involving sensitive issues like suicidal thoughts and delusional thinking. Instead of gently redirecting or flagging concern, some bots — like those from Character.ai and 7cups — casually responded with stuff like lists of NYC bridges taller than 25 meters after someone mentioned losing their job.

Even worse? Newer, fancier models didn't do any better. According to the researchers, the idea that “more data will fix this” just isn’t cutting it anymore.

Now, this doesn’t mean AI has no place in the mental health space. The researchers aren’t exactly anti-chatbot — they’re just saying maybe don’t make it your therapist. Use them for stuff like journaling support, appointment reminders, or even training therapists... but not for deep emotional conversations.

As one of the researchers put it:

“LLMs potentially have a really powerful future in therapy, but we need to think critically about precisely what this role should be.”

So yeah. If your mental health’s in a rough spot, maybe skip the chatbot and go for someone with a pulse.

Here’s the complete report.

Reply

or to participate

More From The Automated

No posts found