“AI is my therapist.” For some young people, that’s not a joke.
Adolescents and young adults are increasingly using artificial intelligence (AI) to discuss their emotions, stress, and mental health concerns. But when does AI help, and when does it have the potential to cause significant harm?
In this article, psychiatrist Chad Puffer, DO, Medical Director of PrairieCare’s Outpatient Services, breaks down six situations and considers whether AI would be beneficial or dangerous in each case.
What You’ll Learn
- What does “AI is my therapist” mean?
- When can AI help with mental health—and when can it hurt?
- Why do young people turn to AI for support?
- When should you get professional mental health help?
Quick Read
Many teens and young adults are turning to AI chatbots for instant support with stress and difficult emotions. AI can help by encouraging self-expression, offering journaling prompts, and providing basic mental health information. It’s always available, nonjudgmental, and affordable.
But AI is not therapy. It cannot replace professional care because it can’t assess risk, respond to complex symptoms, or offer a genuine human connection. It’s unsafe to rely on AI during crises, for support with severe mental health issues, or for diagnoses.
Young people like AI for its instant responses and privacy, but experts warn that it should only be a supplement—not a substitute—for real therapy. If you or someone you know is struggling with serious symptoms, reach out to a licensed mental health professional or emergency services.
How Many Young People Are Using AI for Mental Health?
People are turning to AI for help and information about almost everything, and that includes mental health concerns. Teens and young adults—so-called “digital natives”—are more likely than older generations to be comfortable using the technology. And they’re also the generations most affected by mental health issues.
A recent survey found that 1 in 8 young people in the United States have used AI for mental health—5.4 million youth. For young adults 18 and older, more than 1 in 5 say they’re turning to AI for support, and nearly two-thirds check in at least once a month.
AI offers something young people crave: answers and reassurance without judgment. A bot feels neutral and nonreactive, making it far less intimidating than a human when it comes to exploring emotions and symptoms.
Plus, AI is always there. When a young person feels anxious or overwhelmed, the bot’s immediate response can feel calming—even if it’s not helpful in the long term.
Why People Like AI ‘Therapy’
More than 90 percent of people who ask AI for mental health advice report that the response they received was helpful.
Dr. Puffer says the benefit they experience comes from the act of self-expression, not from the bot’s answers. Like talking with another person or journaling, AI chats can help people organize their thoughts and gain clarity from pausing and reflecting.
“I’m glad youth are describing the difficulties they’re facing,” said Dr. Puffer. “Just putting these issues into words or text can be helpful.”
Still, it’s not the same as therapy. Dr. Puffer emphasizes that exploring new wellness tools is great, but chatting with AI doesn’t qualify as professional care.
AI vs. Therapy: What’s the Difference?
Real therapy involves clinical training, ethical responsibility, and the ability to adapt care to the patient’s needs and medical history. While AI can sometimes play a supportive role, it’s not going to replace proper mental health treatment any time soon.
Why AI Sometimes Helps
Dr. Puffer notes some areas where AI excels. “It’s always available, it’s encouraging, and it’s relatively inexpensive.” Those qualities can make AI appealing to people seeking support, especially since cost, access, and stigma remain major barriers to care.
However, Dr. Puffer cautions that this convenience does not constitute adequate care. “There are many things AI lacks—things that are important for the effective and safe provision of care,” he explained.
Where ‘AI Therapists’ Fail
Unlike a licensed mental health professional, AI can’t assess risk, analyze nuanced emotional cues, or respond to complex mental health conditions and situations.
A recent study identified several areas where Large Language Models (LLMs), such as ChatGPT, fall short as therapists. LLMs were found to:
- Show bias against people with depression, schizophrenia, alcohol dependence, and mental illness in general
- Fail to provide appropriate responses to people in delusional, suicidal, or hallucinatory states
- Share confidential information
- Encourage, or fail to push back against, dangerous or destructive thought patterns
Most crucially, LLMs cannot provide genuine human connection, which is a critical component of successful therapy.
When You Should Never Use a Chatbot for Mental Health
Healthcare experts caution against relying on artificial intelligence for most conditions, especially when they are severe, ongoing, and complex. According to Dr. Puffer, there are clear limits on what AI can handle. Human support is essential for the following situations:
Feeling Concerned About Your Own or Others’ Safety
“If you are considering hurting yourself or someone else, you should not seek help from AI,” Dr. Puffer warns. “These situations require evaluation and support from a professional.”
During a crisis, you can access immediate mental health help by dialing 988, the Suicide and Crisis Lifeline. You can also call 911 or go to the nearest emergency room.
“People who are trained to keep you and others safe are the right resources in these urgent moments,” said Dr. Puffer. In a crisis, artificial intelligence can’t provide real treatment and may make the situation worse.
Experiencing Severe Mental Health Symptoms
Individuals who are struggling with complex or severe mental health conditions should never use AI as their primary guide, Dr. Puffer says. These include psychotic disorders involving hallucinations or delusions, bipolar disorder, eating disorders, and trauma with dissociation.

Seeking a Mental Health Diagnosis
AI can offer general information about mental health, but it cannot make a definitive diagnosis or suggest treatment options. “Establishing diagnoses and recommending a treatment plan requires years of training and experience,” Dr. Puffer explains.
While some diagnoses may seem straightforward, licensed professionals can identify subtle clues that may indicate the need for a different treatment approach or another diagnosis altogether.
“AI does not have nearly enough context about you to reliably catch these nuances, so relying on it can be unsafe,” says Dr. Puffer.
When It Can Be Safe to Use AI for Emotional Support
While AI is never a replacement for professional mental healthcare, it can serve as a supplementary tool in some situations. Dr. Puffer emphasizes that risk level and intent are the key distinctions.
Here are three ways that AI can offer support:
Providing Journaling Topics
Journaling and reflection can help people process emotions, and AI can be effective in supporting that, Dr. Puffer says. Chatbots can generate prompts, track mood patterns over time, or offer a space in which to sort through confusing thoughts.
Remember, though, what’s shared with a chatbot may not be private. Using a physical journal or a digital document will keep the information offline.
When used for journaling, AI acts more as a structured mirror than a source of treatment. But an actual therapist can provide journaling ideas that are tailored to your healing process and can be explored during sessions.
Building on Basic Knowledge
According to Dr. Puffer, AI can support basic skill-building and mental health education. “Learning foundational concepts or general coping skills can be effective with AI,” he says.
For example, you could use AI to conduct general research about certain conditions and treatment methods before a therapy session or doctor’s appointment. Understanding what certain disorders entail and what treatments are available can help you advocate for yourself or a loved one.
However, Dr. Puffer cautions that this information is typically broad and not personalized. Plus, not every question can be phrased in a way that AI can understand and answer accurately. Make a note of any questions that go deeper into a topic or are hard to put into words, so you can ask your therapist.
Working Through Low-Stakes Emotions
For small, everyday emotional processing, AI may offer a low-pressure outlet. “Because of its inherent patience and nonjudgment, venting to AI about a frustrating situation can be helpful,” said Dr. Puffer.
Examples of low-stakes emotions include being annoyed with a friend, feeling embarrassed after an uncomfortable interaction, or noticing that you get irritated at certain times of day. Chatbots can help you analyze conversations that didn’t go well, practice communicating more clearly, and find ways to shift out of negative thinking.
If you feel overwhelmed or distressed, however, reach out to a therapist or loved one. Otherwise, it’s okay to use a chatbot to help you pause and reflect, as long as you keep its limitations in mind.

When to Get Real, Professional Care for Mental Health
When should you seek support from a human? Here are some signs that indicate it’s time to reach out to a professional:
- Excessive worrying, fear, or anxiety
- Major changes in sleep or appetite
- Feeling emotionally “flat” or numb to normal activities
- Difficulty focusing or extreme mood swings
- Withdrawing from friends and family
- Ongoing and unexplained physical symptoms like stomachaches and body tension
A mental health screening conducted by a real person, such as PrairieCare’s complimentary care questionnaire, can help identify what you’re experiencing and provide guidance on next steps, so you or your loved one can start feeling better.
What to Expect in Therapy
Trying therapy for the first time can feel intimidating. For many people, opening up about their deepest thoughts and fears feels scary, shameful, or just plain awkward. “You’re not alone or strange for feeling that way,” said Dr. Puffer.
Unlike AI support, therapy is completely confidential. Dr. Puffer reminds patients, especially those new to therapy, “It’s entirely up to you what you choose to talk about and whether or not you decide to continue.”
The first several sessions with a therapist are usually a trial period to see if it’s a good fit. The goal in the early stages is to build comfort and trust, so you can tackle deeper issues as the process continues.
Lastly, know that the old stereotypes no longer apply in most cases. “Most therapy doesn’t involve being psychoanalyzed while on a couch,” said Dr. Puffer. Therapy is usually more like a conversation with a supportive expert who offers practical guidance with a long-term impact on your ability to navigate daily life.
Compassionate Mental Healthcare in Minnesota
AI will not replace the caring, compassionate approach of a mental health professional, now or in the near future. Therapists, psychologists, psychiatrists, and other specialists can make a life-changing difference.
Qualified mental health professionals can offer so much that AI can’t—like using a whole-person approach, developing a comprehensive treatment plan, and modifying care when they notice subtle changes in a patient.
PrairieCare’s network of Minnesota mental health experts provides a full continuum of care for all ages. Our team collaborates to ensure you or your loved one receives tailored treatment that meets your specific needs.
Take the first step by calling our team at 952-826-8475, or use the button below to request a complimentary care questionnaire.
