If you or someone you care about is in crisis, text or call 988 or explore other crisis services for 24/7 help
AI Therapy for Teens: What Is It and Is It Safe?
AUTHOR
Chrissy Holm
Writer, Project Healthy Minds
CLINICAL REVIEWER
Dr. Monica P. Band, LPC
Licensed Psychotherapist
Nov 12
Est Reading Time: 13 mins
Key Points
AI therapy can help with everyday stress and mild anxiety
Use it as a tool in your mental health toolkit, not the only one
It’s NOT a replacement for real therapists, especially in crisis situations
Privacy matters: not all apps are safe, so check if they’re HIPAA compliant
Your mind is spinning again—you’re anxious about tomorrow’s presentation, test, or big day. You keep replaying an argument with your friend or a difficult situation, and everything feels heavy. You want to talk to someone, but it’s late, you don’t have anyone available, or you just…can’t.
You’ve probably heard about AI therapy apps or people using ChatGPT for mental health support. Maybe you’re wondering if they actually work or if they’re safe to use. With so many options out there, it’s hard to know what’s legit and what’s not. Let’s break down what these apps can and can’t do, and how to use them safely if you decide to try one.
AI therapy apps are chatbot-style tools that offer mental health support and coping strategies 24/7. Nearly 1 in 5 teens and young people experience mental health challenges, and AI therapy was designed to bridge the gap when traditional therapy isn’t accessible.
What AI therapy can do:
Guide you through calming exercises when anxiety hits
Help you spot patterns in your mood
Teach you coping strategies based on proven therapy methods (like cognitive behavioral therapy)
Provide a private space where you can be honest and express yourself
What AI therapy can’t do:
Replace human therapists or counselors
Handle emergencies or crisis situations
Feel emotions or truly understand what you’re going through
Help with severe mental health issues or trauma
AI doesn’t actually understand you—it’s just really good at predicting what sounds helpful based on patterns. It’s like when Netflix recommends shows, TikTok suggests videos on your For You page, or Spotify queues up songs. It’s guessing based on data, not because it knows your taste or how you’re feeling today.
Research shows AI tools can help improve depression and anxiety symptoms in the short term. One study found that people who used an AI therapy app called Woebot for two weeks saw their depression symptoms improve. But other research found that AI chatbots sometimes give unhelpful advice, miss when someone is in a crisis, or respond in inappropriate ways.
As Amanda Ferrara, licensed marriage and family therapist, explains, “For teens hesitant to speak to a human therapist, these tools can serve as a gentle entry point into understanding their emotions and mental health needs.”
AI therapy can be helpful for day-to-day stress or mild anxiety, but it works best as part of your support system, not the whole thing. Ferrara also says to seek more help when you’re experiencing “persistent or worsening symptoms, including ongoing sadness, anxiety, hopelessness, irritability, sleep or appetite changes, or difficulty functioning at school or socially.”
Talking to AI can feel like venting to someone who gets it, but it’s not a therapist. Here’s what’s happening behind the scenes:
AI seems to understand, but it’s predicting responses based on patterns, not actually feeling anything or understanding your unique situation. There’s no real empathy there. Psychiatrist Steven Reidbord, MD, puts it this way: “AI tools are self-help. Like self-help books and non-AI apps, they can only go so far.” At the end of the day, AI is a tool, not the main character.
AI can’t pick up on sarcasm and tone, cultural context, or subtle emotional cues the way humans do. As licensed clinical social worker Madeline Weinfeld Shill points out, “Cultural identity and background can play a huge role in the therapeutic relationship in a similar way to how it can shape any interpersonal relationship.” AI can’t understand how your cultural background, religious beliefs, or community values shape your mental health needs and healing practices—it can miss these crucial details.
It can get even trickier if you haven’t thought much about how your cultural background shapes who you are (or if you’re still actively figuring it out): you might not even realize when AI advice doesn’t quite fit. AI often overlooks cultural differences, so you could end up with recommendations that feel off or even reinforce stereotypes, without understanding why they don’t feel right for you.
AI can show bias and stigma toward certain mental health conditions. Researchers at Stanford University tested different chatbots and discovered that AI was more judgmental when talking about conditions like alcohol dependence and schizophrenia compared to depression. This bias can be harmful and might discourage people from getting the help they need. The research found this pattern across multiple AI models—not just one.
Not all AI platforms protect your data the same way. Some save what you type, use it to train their systems, or even share it with other companies. Licensed clinical social worker Christina P. Kantzavelos explains, “Teens should use platforms with encryption, anonymized data storage, and no third-party sharing. If an app is free, it often means their data is the product.”
These privacy issues can impact some people more than others. If you can’t afford paid apps, your personal information might be less protected. Conversations about your mental health, your identity, and your struggles could be stored or shared without you knowing where it goes.
Privacy matters even more in certain situations. If you’re LGBTQIA+ and not out, concerned about immigration status, from a racially marginalized community that already faces more surveillance, living in a small town, or dealing with an unsafe home, understanding how your data is protected is especially important. The teens who need support most often end up with the least privacy protection. That’s why knowing what to look for in an AI tool matters.
A lot of people already use ChatGPT to vent or talk through their feelings. And we get it—it’s free, it’s always there, and sometimes typing everything out just helps.
But even OpenAI (the company that makes ChatGPT) has said that it’s not built for mental health support: “our systems can fall short,” and they are “continuously improving how our models respond in sensitive interactions.” Unlike with HIPAA-compliant apps, what you type can be saved and used to train their AI. That means it’s not private like a real therapy session would be.
ChatGPT can be helpful for:
Getting journaling prompts
Understanding mental health terms like anxiety
Brainstorming ways to deal with stress
ChatGPT isn’t helpful for:
Mental health support or therapy
Crisis situations
Trying to diagnose yourself
If you’re dealing with something serious, talk to a professional or use an app specifically designed for mental health with privacy protection (more on that below).
Before you download any mental health app, make sure it’s both safe and based on real science. Ferrara recommends “checking if it references licensed psychologists or mental health professionals in its development, cites clinical research or published outcomes, and has transparent information about data use, limitations, and crisis support.”
You can also check if:
It’s HIPAA compliant. This is a law that protects your medical information—look for it in the app description or privacy policy.
Real users have left positive reviews. Check app stores and Reddit, not just the company website.
You can delete your data whenever. There should be a clear way to delete your account and all your information.
The privacy policy explains what data they collect and who sees it. If it’s full of confusing legal language or doesn’t say, that’s a red flag.
They explain what happens in a crisis. Will they contact emergency services? Do they direct you to hotlines?
It’s accessible for you. Check whether it supports screen readers, offers multiple ways to communicate (text, audio), and works in your preferred language.
An important note from Camille Tenerife, licensed marriage and family therapist: “It is important to read the fine print to understand privacy rights, and although this might be harder to understand, you can always ask an adult to help with understanding privacy and data safety.”
Not sure what a red flag looks like in practice? Try these scenarios:
Scenario 1:
The app says, “We may share your data with third-party partners for marketing purposes.”
Answer: RED flag. Your mental health conversations should NOT be used to sell you stuff.
Scenario 2:
The app states it’s HIPAA-compliant, uses encryption to protect your messages, and has a simple “delete account” button.
Answer: GREEN flag. Encryption keeps your data secure, and easy account deletion means you have control.
Scenario 3:
The app is totally free, has no ads, and is vague about how they make money.
Answer: RED flag. If you’re not paying and there are no ads, your data might be the product.
Scenario 4:
The app’s privacy policy is written in plain language, explains what data it collects, and says it’ll never sell your information.
Answer: GREEN flag. Transparency matters. Clear communication about privacy shows the app is trustworthy.
Scenario 5:
The app asks for your full name, school address, and parents’ or guardians’ contact information before you can use it.
Answer: RED flag. Mental health apps should never require this much personal information up front, especially addresses, as this creates safety risks.
Privacy policies are often filled with confusing legal words. But don’t just skip reading them! You can use AI tools like ChatGPT, Gemini, Claude, or other chatbots to translate them into plain language that actually makes sense. Here’s how:
Find the app’s privacy policy (it’s usually at the bottom of their website, in the app store description, or in the app’s settings)
Copy the link to the privacy policy
Paste it into an AI chatbot and ask something like: “Can you explain this privacy policy in simple terms? Does this app share or sell my data? Are my conversations private? Can I delete my information if I want to?”
Note: If the AI can’t open the link, just copy and paste the privacy policy text (or at least the important parts about data collection and sharing) directly into the chat.
Key questions to ask:
Can they share your data with other companies?
Do they use your conversations to train AI?
Do they have end-to-end encryption (this means only you can read your messages)?
Can you delete your account and all your data if you change your mind?
If the AI finds anything sketchy (like the red flags from the scenarios above) or confusing in the privacy policy, talk to a parent, guardian, or trusted adult before sharing any personal information with the app. It’s always better to be safe!
If you’re in immediate danger, having thoughts of hurting yourself or others, or in a crisis, reach out to real people who can help.
988 Suicide & Crisis Lifeline: Call or text 988 (TTY users: press 711 after calling)
Crisis Text Line: Text HOME to 741741
Trevor Project (LGBTQIA+ youth): 1-866-488-7386 or text START to 678678
National Runaway Safeline: 1-800-786-2929
Note: The crisis lines above are for immediate help. Some helplines aren’t 24/7 but can still provide support and connect you to resources when you need someone to talk to.
Some AI companies are working on building better crisis features, like tools that could help you contact a trusted person or alert someone in your life that you’re in danger, but these aren’t ready yet. This is why having real people and services (like the ones listed above) is so important when things get serious. Never rely on AI alone in a crisis.
AI therapy can work for everyday stress and managing mild anxiety between therapy sessions. It can also be an option when traditional therapy isn’t accessible right now because of cost, wait times, distance, or transportation.
Here are some signs that talking to a professional might help:
You’re having thoughts of self-harm, suicide, or hurting others
You’re dealing with severe depression, anxiety, or trauma
You feel unsafe at home, school, or in your living situation
You’ve been overwhelmed for weeks and nothing you try is helping
You’re experiencing persistent sadness, withdrawing from friends, or things that used to matter don’t matter anymore
You’re finding it difficult to function in daily activities at school, your job, or socially
If things feel serious, reaching out to a professional sooner rather than later can make a real difference.
If you decide to try AI therapy, here are some things to consider:
Guard your identity. Don’t share identifying details unless you’ve verified the platform is HIPAA-compliant and trustworthy.
Question everything. Just because the AI sounds confident doesn’t mean it’s correct. If the advice doesn’t feel right, trust that feeling.
Consider other support. Use AI support alongside real-life conversations with your friends, family, or professionals, even when reaching out feels hard.
Check in with yourself. Pay attention to how using the app makes you feel. If you end up feeling worse, more anxious, or confused, it might be time to try something else.
You’re the first generation growing up with AI as a regular part of life. AI therapy is still new and evolving—what’s true today might change, so stay informed and keep checking in with yourself about what works.
AI therapy can teach you coping strategies and be there when you need support at 2 AM. But it can’t replace human connection, handle emergencies, or address complex mental health challenges.
Asking for help in any form (AI therapy, talking to a school counselor, texting a crisis line, or opening up to a friend or an adult you trust) is a strength and shows you’re taking your mental health seriously. It helps to have a bunch of different tools and people you can turn to. You don’t have to figure this out alone or just with a chatbot.
Project Healthy Minds has resources to help, including a directory for therapy options, digital self-care tools, and support that actually works for you. Check it out and find what fits your life.
This content is for informational purposes only and is not a substitute for professional advice, diagnosis, or treatment. If you're in crisis, visit our Crisis Services page or call 988 for the Suicide & Crisis Lifeline. Call 911 if you're in immediate danger.
Project Healthy Minds does not endorse any specific services or providers.