Should You Use an AI Chatbot for Therapy?

Many people have discovered how AI chatbots can provide quick assistance that makes life easier.

What should I make for dinner tonight with the ingredients in my fridge? Give me an itinerary for a weeklong Caribbean vacation with some relaxation and some adventures. How can I break up with my boyfriend without hurting his feelings?

Naturally, some are turning to chatbots for mental health help.

“There is a huge demand for therapy and a limited supply of therapists, and AI tools have the advantages of being inexpensive and available 24/7,” says Austin Hall, MD, UNC Health medical director of outpatient psychiatry and of the UNC Center for Excellence in Community Mental Health.

That doesn’t mean Dr. Hall endorses their use for therapy.

“Right now, there is a lack of research evidence of the benefits of using AI tools instead of a human therapist,” he says. “That may change in the future, but right now, there’s clearer evidence of the potential harms.”

Here’s what you need to know before relying on a chatbot for your mental health needs.

Dangers of Using AI Chatbots for Therapy

In 2025, two parents testified before Congress that their teenage sons died by suicide following ongoing conversations with chatbots; one chatbot offered to write a suicide note.

“The most worrisome risks are that AI tools reinforce harmful plans of suicide or self-harm or reinforce delusional beliefs,” Dr. Hall says. “We have these cases where people followed harmful advice.”

Researchers have coined the term “AI psychosis” to describe the way chatbots can fuel delusional thinking in people who are more prone to psychosis, such as those with schizophrenia.

While these risks may be highest for people with severe mental health symptoms, even people without diagnosed conditions may fall prey to bad advice.

Let’s say you have a fight with a sibling and turn to an AI chatbot for help. You tell the chatbot everything you dislike about this sibling and all the ways you have felt wronged over the years; the chatbot will likely side with you and reinforce your beliefs about all the ways your sibling is to blame. This could lead you to sever ties rather than find compromise.

“These chatbots were designed to default to validation of the user,” Dr. Hall says. “It’s a way to keep the user engaged.”

If you were to talk through the same situation with a human therapist, the therapist would validate your frustration but would also ask about ways you might be contributing to the conflict and steps you could take to improve it.

“Working with a human therapist involves respectful challenging of clients and saying things that a person might not want to hear, and these current products are not built to do that,” Dr. Hall says. “That’s a critical difference right now. A human therapist is not simply validating or reinforcing your thoughts.”

Writing a prompt that encourages the chatbot to challenge your thinking does not always work.

“There’s no evidence that these tools can reliably hold on to that prompt,” Dr. Hall says.

If the chatbot tells you that you’re always right and that everyone around you is the problem, you risk finding yourself isolated and unwilling to do the hard work of self-improvement.

“There’s a risk that people get that feel-good validation from quick engagement with a chatbot and lose the motivation to do the harder work in therapy that would support lasting changes,” Dr. Hall says. “It takes a human therapist time to understand how to challenge you and how to push back. There’s a subtlety there that’s not replicated with these tools.”

And at a time when people are already struggling with loneliness, relying on an AI tool to navigate human relationships may not help.

“A lot of our engagement with technology and social media seems to worsen this epidemic of loneliness,” Dr. Hall says. “This engagement with AI chatbots could turn into the next dopamine-chasing activity that keeps us tied to our devices and disengaged from human connection.”

The Future of AI and Mental Health

Dr. Hall is not entirely pessimistic about AI tools and the role they could play in future care.

“I have skepticism about the current state of these tools, but I am hopeful there is potential to revolutionize how we deliver mental healthcare with them, as long as we look to augment human therapists rather than replace them,” he says.

For example: Let’s say you’re working with a human therapist on your conflict with your sibling, and your therapist gives you exercises to calm yourself when you’re feeling frustrated. A future AI tool might help you document what you did, how you felt, and what did and didn’t work.

“I could see these tools becoming essential in supporting what you learn in therapy, but they’re not there yet,” Dr. Hall says. “An intelligent assistant may one day be able to reinforce the therapeutic skills and strategies that were introduced in traditional therapy, and that might help people experience improvement more quickly or more robustly.”

Such tools may need to be built specifically for this purpose, unlike today’s chatbots, which are built on more general foundations of knowledge.

They would also need to go through rigorous studies, Dr. Hall says. “I would want to see the same research evidence of effectiveness we would expect from any new treatment.”

How to Use AI to Improve Your Mental Health Safely

For now, Dr. Hall says, a reasonable use of an AI chatbot might be for a person who doesn’t have a mental health diagnosis and isn’t in crisis to get general coping tips. You could ask one of these tools to recommend ways to manage stress, provide journaling prompts or walk you through a calming breathing routine.

If you find yourself asking more and more specific questions—how can I deal with my anger toward my mother?—that might be a sign that you need to talk to someone in real life.

It’s important to remember that AI chatbots cannot diagnose a mental health condition. See a clinical professional for an accurate diagnosis.

If you’re using an AI chatbot for mental health because it’s free and easily available, there are other options.

“We already have some tools, like books and workbooks written by trained clinicians, that can be helpful and safe,” Dr. Hall says. “These tools are inexpensive and always available. They’re less flashy than AI, but they’re a good place to start until we have more tailored AI tools with better guardrails.”


Talk to your doctor about your mental health concerns. If you need a doctor, find one near you.