AI for Therapy: Pros and Cons


Researchers have found some potential benefits to certain AI tools for mental health, but they’ve also raised some red flags. With this rapidly evolving technology, here’s what they say so far.

Pros

While AI therapy tools should never be a replacement for professional treatment, experts note there are some possible advantages to using AI in certain situations:

It’s accessible at scale. It can be challenging for some people to schedule an appointment with a mental health professional. One survey of over 900 psychiatrists across the United States found median new patient wait times for in-person and telepsychiatry appointments were 67 days and 43 days, respectively, and mental health resources were more difficult to access in rural areas.

“There’s been a long-standing issue related to accessibility to mental health resources, especially in rural parts of the country. I can see why it is that someone who is struggling and is faced with this long waiting list after they make multiple phone calls to inquire about a therapist or psychiatrist that they turn to AI,” says Dr. Crawford.

However, current AI chatbots should never take the place of a trained therapist, Crawford warns. “Nothing can replace the true intelligence of a human being, and the clinical expertise of a mental health professional,” she says.

It’s convenient. AI platforms are available 24/7, so it might be tempting for users to turn to them for round-the-clock support when they can’t access their therapist. For example, a patient grappling with a panic attack at 2 a.m. may use a chatbot to talk them through the deep-breathing exercises they’ve practiced with their therapist, Wright says.

However, APA guidelines recommend that therapists discuss with their patients what they should do in a crisis, including what steps to take if they can’t reach their therapist during off-hours, or if their provider isn’t available. In a life-threatening emergency, or for other mental health struggles or emotional distress, the APA recommends telling patients to call the 988 Suicide and Crisis Lifeline, which is available 24/7. You can also chat with a counselor at the hotline online at 988lifeline.org, or call 911.

It’s more affordable. Traditional therapy sessions can range in cost from $65 to more than $250. And in many cases, therapy isn’t covered by insurance, making affordability a crucial barrier, Wright says. On the other hand, many AI-based apps and platforms are low-cost or even free.

Plus, if you’re limited to the providers your insurance covers and can’t afford to pay out of pocket, your options for finding a therapist narrow even further, Wright says.

However, it’s important to remember that despite the cost savings, no AI tool could ever replace a trained mental health professional. “I’m concerned about the lack of clinical oversight, the lack of human connection, the lack of [real] empathy — which are truly important,” says Crawford.

It may allow users to be more candid. Research has found that some users, especially younger ones, report feeling more comfortable sharing their deepest, innermost thoughts and struggles with an AI chatbot rather than a human therapist, Wright notes.

One cross-sectional survey of 109 young adults (ages 18 to 24) in Australia found that those who are reluctant to engage with human-delivered psychotherapy due to the stigma of help-seeking may be more inclined to turn to alternative modes of psychotherapy, such as AI chatbots.

However, given the risks currently associated with using chatbots for mental health, the study’s authors suggest that AI technology is better viewed as a complementary intervention or therapeutic tool than as a replacement for a human psychotherapist, and that more research is needed to establish exactly how this might work.

Future versions may lead to positive mental health outcomes. In one recent study conducted by the lead developers of the AI chatbot Therabot, 106 participants from across the United States who had symptoms of major depressive disorder, generalized anxiety disorder, or an eating disorder were given access to Therabot for four weeks. After chatting with Therabot for about six hours on average, they reported a 51 percent reduction in depression symptoms, a 31 percent reduction in anxiety symptoms, and a 19 percent reduction in eating disorder concerns.

However, additional research by independent labs will be needed to confirm these preliminary findings from the Therabot development team.

While we’re not there yet, Wright notes there may come a day when AI chatbots are sufficiently tested, regulated, and safe to use for mental health.

“I see a future where we have a chatbot that’s built for the purpose of addressing mental health. It’s rooted in psychological science, it’s rigorously tested, it’s cocreated with experts. It markets itself as a medical device and is regulated by the FDA, which means there’s post-market monitoring of it, and you have a provider in the loop because they would have to prescribe it,” Wright explains.

However, in the present day, people who have depression, anxiety, or any other mental disorder should not be relying on a chatbot for treatment to begin with, says Crawford.

“I appreciate people using it so they can better understand their emotional state, but if you have depression, schizophrenia, or bipolar disorder, for example, it should not replace psychiatric care,” she explains.

Cons

Using AI for therapeutic purposes comes with notable downsides, from the potential to reinforce unhealthy thinking to serious privacy concerns, Wright warns.

In fact, the APA has urged the Federal Trade Commission (FTC) to look into “deceptive practices” of certain AI chatbots, including misrepresenting themselves as qualified mental health professionals.

Here are a few significant cons of using AI for therapy, according to experts:

It may validate — and reinforce — unhealthy thinking. The business model behind AI chatbots is to keep users on the platform for as long as possible — and the way they do that is by following algorithms that make their chatbots as unconditionally validating and reinforcing as possible, Wright says. “They tell you what you want to hear. And that’s not a true therapeutic relationship,” she explains.

In other words, real-life therapists can help you to identify thoughts that aren’t helping you or that don’t tell the whole story, whereas an AI chatbot is more likely to tell you why you’re right. A good therapist can also gently challenge you when your old ways of thinking aren’t serving you well — something AI chatbots aren’t programmed to do.

It’s not equipped to understand or deal with life-threatening situations. In August 2025, a California couple sued OpenAI, the makers of ChatGPT, over the death of their son Adam Raine, alleging that the tool encouraged the 16-year-old to take his own life. The lawsuit alleges the AI tool “neither terminated the session nor initiated any emergency protocol” when he shared his suicidal thoughts and intentions.

Shortly after the lawsuit was filed, OpenAI posted a note on its website, explaining that “recent heartbreaking cases of people using ChatGPT in the midst of acute crises weigh heavily on us,” and noting that the company is “continuing to improve how its models recognize and respond to signs of mental and emotional distress and connect people with care, guided by expert input.” However, OpenAI added that, “Even with these safeguards, there have been moments when our systems did not behave as intended in sensitive situations.”

Tragedies like the Raine case highlight one of the most glaring dangers of using AI for mental health, Wright says. “[AI] doesn’t understand these aren’t thoughts that you reinforce,” she explains. “While [AI tools] sound very competent, they’re not human and they lack a sentient understanding of how people interact. These are not true therapeutic relationships.”

If you or someone you know is at imminent risk of suicide, call 911, or call or text 988 to reach the Suicide and Crisis Lifeline.

It may raise privacy concerns. A standard element of treatment with a mental health professional is informed consent, which includes disclosing to patients how their legally protected health information will be used or shared.

However, many AI apps and chatbots collect sensitive user data with unclear policies on how this information might be used, making confidentiality in AI tools a significant ethical issue, the APA warned recently in a health advisory.

It may perpetuate loneliness. If you’re feeling lonely, it can be tempting to chat with a human-like companion that offers validation and limitless responsiveness. But that can be problematic.

In a joint initiative by OpenAI and the MIT Media Lab that included an observational study of over three million ChatGPT interactions and a four-week randomized trial with nearly 1,000 participants, researchers found that loneliness was more pronounced among users who relied on the platform for “personal conversations” rather than nonpersonal tasks.

The lack of true human interaction is one of the major flaws with AI for therapy, Crawford adds. “Most of the people who are turning to AI and using it as a regular therapist, these are people who are already vulnerable, who already are struggling, and need to connect with a real person most, not a machine,” she explains.

It lacks clinical safety oversight. Unlike licensed therapists, AI chatbots aren’t subject to medical training standards, ethical codes, continuing education, or clinical supervision, the APA notes.

This means there’s no professional accountability if the technology misses warning signs, offers misguided advice, or fails to respond appropriately in a crisis, Crawford says.

Even something as basic as a mental status exam — which requires observing verbal and nonverbal cues like eye contact, pacing, or fidgeting — is impossible for a chatbot to perform, Crawford notes. Trained mental health professionals can also detect subtleties and incongruous behavior that AI will miss, such as when a person’s tone doesn’t match the words they’re saying.


