If you or someone you care about is in crisis, text or call 988 or explore other crisis services for 24/7 help
AI Therapy and Teens: A Guide for Parents and Guardians
AUTHOR
Chrissy Holm
Writer, Project Healthy Minds
CLINICAL REVIEWER
Dr. Monica P. Band, LPC
Licensed Psychotherapist
Nov 12
Est Reading Time: 16 mins
Key Points
AI therapy can help with everyday stress, but it can't replace professional mental health care.
Many teens use AI because it's available 24/7, feels private, and seems low-pressure.
Open, curious conversations can work better than trying to control or take away teens' devices.
Know when your teen needs professional help: crisis situations, symptoms lasting 2+ weeks, thoughts of self-harm, or trouble with daily activities.
Check privacy before they use any app: Look for HIPAA compliance, research-backed approaches, encrypted messages, and the ability to delete all data.
It's the middle of the night. You wake up and notice your teen's bedroom light is still on. When you check on them, they're on their phone, typing intently. Maybe you've seen headlines about teens using AI for mental health support—or maybe you've discovered your own teen is doing exactly that. Either way, you might be asking: Is this safe? Does AI therapy actually help? Should I be worried?
You're not alone in asking these questions. According to a recent report, 72% of U.S. teens have used AI companions at least once. About one-third have discussed serious matters with AI instead of the people in their lives. Whether you just found out your teen is using these tools or you're researching out of concern, here's what you need to know.
AI therapy provides mental health support through apps and digital platforms. These tools offer coping techniques, mood tracking, guided exercises, and therapeutic conversations—all available 24/7.
These platforms use natural language processing—technology that teaches computers to understand everyday conversation and respond appropriately. Many are built on established therapeutic frameworks like cognitive behavioral therapy (CBT), which draws from decades of clinical research.
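For the technically curious, here is a toy sketch of the basic pattern these tools follow: match what a user is expressing, then respond with a relevant evidence-based technique. Everything below is invented for illustration; real platforms use trained language models, not keyword lists.

```python
# Toy illustration of a CBT-style chatbot's match-then-respond pattern.
# Keywords and responses are invented for illustration only; real
# platforms use trained language models, not keyword lookups.

RESPONSES = {
    "anxious": "Let's try grounding: name five things you can see right now.",
    "sad": "Can you write down the thought behind that feeling so we can look at it together?",
    "stressed": "Try box breathing: inhale for 4 seconds, hold 4, exhale 4, hold 4.",
}

def reply(message: str) -> str:
    lowered = message.lower()
    for keyword, response in RESPONSES.items():
        if keyword in lowered:
            return response
    return "Tell me more about how you're feeling."

print(reply("I'm so anxious about my exam tomorrow"))
# -> "Let's try grounding: name five things you can see right now."
```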
What AI therapy can do:
Provide a low-pressure way to start exploring feelings
Offer immediate access when professional help isn't available
Support skill-building practice between therapy sessions
Serve as a starting point for teens who aren't ready for traditional therapy
Help track mood patterns to identify triggers
What AI therapy can’t do:
Replace a trained therapist who can comprehensively assess your teen and provide psychological evaluation
Navigate the complexity of family dynamics, cultural context, or personal history
Provide crisis intervention
Build the therapeutic relationship that's essential for meaningful change
Teens are already using ChatGPT, Gemini, Claude, and other general AI chatbots for mental health support—often without anyone knowing. These aren’t designed as therapy apps, but they’re free, accessible, and feel private.
The problem? General AI tools don’t have the safety features needed for mental health support. They don’t have crisis detection protocols, therapeutic training, or professional oversight. They’re also not HIPAA compliant, so conversations aren’t legally protected health information and may be stored or used to train the AI.
There’s also the risk of losing chat history without warning. For people who have formed emotional connections with an AI chatbot, losing the tone, flow, and shared history of past conversations can feel like a sudden loss and cause real distress.
The upside? These tools can help with journaling prompts, learning basic mental health concepts, or coming up with everyday coping strategies. But they become risky when used for crisis situations, self-diagnosis, replacing therapy, or making mental health decisions.
If you find out your teen is using an AI chatbot for emotional support, see it as a chance to connect. Ask what they've found helpful about it. Use the conversation to figure out what they actually need. Maybe it's a mental health app with proper safeguards, traditional therapy, or other kinds of support.
If your teen’s school has counseling resources, it might be helpful to check in with their school counselor. You can ask how the school is approaching AI tools, whether that’s in counseling sessions, classrooms, or digital safety lessons, and how they’re supporting your teen to use AI responsibly.
The evidence on AI therapy effectiveness is mixed.
Some studies show promise. Research found that Woebot, an AI chatbot using CBT techniques, helped college students reduce depression symptoms after two weeks. Studies show AI-based mental health tools can reduce symptoms of mild to moderate depression and anxiety. Other platforms have shown promise for managing stress and social anxiety.
However, a 2023 analysis found wildly inconsistent results. Some responses aligned with evidence-based practices. Others were ineffective or potentially harmful. Some systems completely missed crisis language that any trained therapist would immediately recognize.
The bottom line? AI therapy can be genuinely helpful for everyday stress or as a bridge to professional care. But it's not a substitute for human judgment and connection—like picking up on the change in your teen's voice that tells you something is wrong. This is especially true when your teen is navigating more than temporary stress or mild symptoms.
Globally, over 14% of people aged 10-19 experience mental health conditions. Many go unnoticed or untreated because of real barriers:
Stigma around mental health
Long waitlists for therapists or lack of adequate mental health resources
High costs (therapy can cost $100–$300+ per session, even with insurance)
Limited school counseling (most schools exceed the recommended ratio of one counselor for every 250 students)
Family situations where discussing mental health feels unsafe
Language barriers
AI therapy has emerged to fill this gap. According to the Common Sense Media report, teens turn to AI companions for various reasons—some use them for entertainment and curiosity, while others seek emotional support, advice, or someone who’s “always available.”
Many teens want to see if their feelings are “serious enough” to get help. AI feels lower-stakes than asking you to find them a therapist. They might be thinking: Is what I’m feeling normal? Do I really need therapy or am I overreacting?
You can help by: Making mental health conversations normal at home—not just during emergencies. Talk about feelings, stress, and coping strategies regularly.
Teens sometimes worry that therapists will share information with parents or guardians. AI feels anonymous and private. They can explore their thoughts without fear of judgment or consequences.
You can help by: Acknowledging their need for privacy while discussing which situations require adult involvement for safety. Explain confidentiality in therapy—therapists only involve parents when there’s a safety concern.
As certified mental health counselor Torrey Harmon explains, “Today’s teens know that they can get information themselves through the internet and through their peers, so they often shy away from asking for help from parents and other trustworthy adults. AI can be a helpful first step in providing help to teens who are struggling by helping them to understand a little more about what they are facing and how to get real help.”
Some teens don’t want to cause worry or add to family stress. Others might not feel safe discussing mental health at home due to stigma or past reactions. Talking to AI feels like handling it themselves without burdening anyone.
You can help by: Reassuring them directly that seeking help isn’t a burden—it’s something you want to support. Remind them that mental health is as important as physical health, and that you want to know what’s going on in their lives.
It may feel obvious to you that your child is never a burden, but many teens hesitate to open up because they worry about disappointing you or adding to your stress, especially if things at home already feel tense or overwhelming. They need to hear out loud that their feelings matter.
When anxiety strikes at midnight or distress hits before school, they can’t wait for a Thursday 4 PM appointment. AI is there now. The 24/7 availability feels like a lifeline when they’re struggling.
You can help by: Acknowledging that professional help isn’t always available 24/7, and discussing when AI is appropriate versus when to contact a crisis line or reach out to you.
Licensed professional counselor Gen Morley adds, “AI's real power is its accessibility. It’s available 24 hours a day, anywhere in the world, for little to no cost. In situations that a person has the ability to safely and consistently access a live therapist, that is superior to an AI mental health plan. But the vast majority of people on this planet, and certainly minors, do not have that access. This is where AI can be life saving. The difference between darkness and even one candle is profound.”
Keep in mind that using AI tools requires regular access to a phone, laptop, or reliable internet—which not every household has. When technology access is limited, your role as a parent becomes even more important. Open, supportive conversations can bridge the gap and provide the connection your teen needs.
When your teen opens up to AI about their concerns, that conversation is often being recorded and stored. The question is: who has access to it?
Not all AI therapy platforms operate under the same privacy protections as traditional healthcare. While HIPAA applies to some digital health tools, many AI chatbots fall into regulatory gray areas.
Critical questions to ask about any platform:
Is it HIPAA compliant?
What data do they collect, and how is it stored?
Do they share or sell data to third parties?
Can your teen permanently delete their conversation history?
What happens if the company is sold or goes out of business?
Some platforms use conversations to train their AI models, meaning your teen's personal struggles could become part of the data that teaches the system. Others allow data sharing with advertisers or researchers.
Privacy policies can be long and confusing. Here’s a tip: you can copy the privacy policy text and paste it into ChatGPT or other general AI chatbots with this prompt or something similar:
Please review this privacy policy for a mental health app. Tell me in plain language: 1) Is it HIPAA compliant? 2) What data do they collect? 3) Do they share or sell data? 4) Can users delete their data? 5) What are the biggest privacy concerns I should know about?
This can help you quickly understand the key issues. However, remember that this review is just a starting point—when in doubt, look for apps with clear privacy protections. If a platform is vague about how it makes money or what it does with user data, that’s a red flag.
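If you’re comfortable running a short script, the same check can be automated. Below is a minimal sketch, assuming the OpenAI Python SDK and an API key; the model name is an assumption, and any comparable chatbot service would work just as well. It’s an illustration, not a vetted tool.

```python
# Minimal sketch: ask an AI model to summarize a privacy policy in plain language.
# Assumes the OpenAI Python SDK (pip install openai) and an OPENAI_API_KEY
# environment variable; the model name below is an assumption.
from openai import OpenAI

PROMPT = (
    "Please review this privacy policy for a mental health app. "
    "Tell me in plain language: 1) Is it HIPAA compliant? "
    "2) What data do they collect? 3) Do they share or sell data? "
    "4) Can users delete their data? "
    "5) What are the biggest privacy concerns I should know about?\n\n"
)

def summarize_policy(policy_text: str) -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": PROMPT + policy_text}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Usage: save the policy text to policy.txt, then run this script.
    with open("policy.txt") as f:
        print(summarize_policy(f.read()))
```

The same caveat applies here: treat the AI’s summary as a starting point and verify anything important against the policy itself.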
AI tools may help with everyday stress, but they cannot address serious mental health concerns. Your teen needs professional help if you notice:
Signs of immediate crisis:
Any mention of suicide, self-harm, or harming others
Giving away possessions or saying goodbye
Sudden calmness after extreme distress
Reckless or self-destructive behavior
The Jed Foundation’s Mental Health Resource Center focuses specifically on preventing youth suicide and offers helpful guidance.
Persistent warning signs:
Withdrawal from activities, friends, or family lasting more than two weeks
Significant changes in sleep, eating patterns, or energy
Declining grades or school avoidance
Increased substance use
Frequent panic attacks or overwhelming anxiety
Depression symptoms that make daily activities harder
Situations that need professional expertise:
Processing trauma from abuse, violence, or significant loss
Repetitive behaviors that seem to cause distress or take up significant time (like excessive handwashing, checking, or counting)
Extreme mood swings or experiences that seem disconnected from reality (like hearing voices or seeing things others don’t)
Complex family situations affecting your teen’s wellbeing
Questions around identity, including LGBTQIA+ exploration, experiences of racism or discrimination, or cultural identity conflicts
Harmon says, “Specific warning signs that indicate that human professional help is necessary include thoughts about suicide, self-harm, or hopelessness; an abusive or unsafe situation; being out-of-touch with reality (e.g., hearing voices, seeing things not seen by others); big changes in eating, sleeping, or other functioning or interests (e.g., no longer enjoying activities that are normally enjoyable); feeling trapped, empty, or worthless; using substances to numb or escape emotional pain; distress is getting worse even with efforts to cope.”
If you're seeing these warning signs, AI therapy is not an appropriate intervention. Your teen needs evaluation by a licensed mental health professional.
Keep these numbers accessible in your phone and on your refrigerator, and share them with your teen:
988 Suicide & Crisis Lifeline: Call or text 988
Crisis Text Line: Text HOME to 741741
Trevor Project (LGBTQIA+ youth): 1-866-488-7386 or text START to 678678
Your local emergency room for immediate psychiatric emergencies
These are staffed by trained crisis counselors who can assess danger, provide immediate support, and connect your teen to appropriate resources. Families can also create a safety plan together. This template from the National Suicide Prevention Lifeline can help guide that conversation.
Before your teen downloads any mental health app, go through these questions together:
Professional oversight
Was it developed with input from licensed mental health professionals?
Does it clearly state what therapeutic approaches it uses (CBT, DBT, mindfulness)?
Are licensed clinicians involved in ongoing oversight and quality control?
Privacy and security
Is it HIPAA compliant, or does it have equivalent privacy protections?
Does it use end-to-end encryption for all conversations?
Can your teen permanently delete their data?
Is the privacy policy written in clear language?
How does the company make money (subscriptions, data, ads)?
Safety features
Does it have automatic crisis detection and immediate referral protocols?
Are there clear disclaimers that it's not a replacement for professional care?
Is the content age-appropriate for teens?
What happens if your teen expresses thoughts of self-harm?
Red flags that should make you pause
Promises to "cure" or "fix" mental health conditions
Requests unnecessary personal information (school name, address, etc.)
Vague, confusing, or concerning privacy policy
No clear crisis protocols
Completely free with no clear explanation of their business model
Licensed clinical social worker Amanda Stemen shares what to look for: “Age-appropriate safety features, parental controls and consent requirements for minors, transparent data practices, co-designed by mental health professionals, and a lack of manipulative features, such as simulated friendship or over-personalized responses.”
Dr. Sanjai Thankachen adds that families should “investigate which tools were built by well-known and well-researched organizations” and look for “peer-reviewed articles, well-documented privacy regulations, and openness regarding the limitations of the tool.”
Stemen also encourages people to see if the tool has been independently vetted. “Look for accreditation from trusted organizations. For instance, ORCHA recommends apps that pass a quality assurance process.”
Finding out your teen is using AI for mental health support can bring up different emotions—concern, confusion, even relief. Here are some suggestions for starting the conversation productively. Use language that feels natural to you and your relationship with your teen.
Lead with curiosity, not judgment. Ask open-ended questions about their experience. What drew them to it? What’s been helpful? What hasn’t? Listen more than you talk.
Validate their initiative. Acknowledge that seeking any form of support takes courage. Recognize they’re trying to take care of their mental health.
Express your concerns as a partnership. Frame safety concerns as something you want to figure out together. Focus on making sure the tools they’re using are actually helpful and secure.
Collaborate on next steps. Discuss when AI might be appropriate and when they need human support. Make it a conversation, not a lecture.
Not every teen will be ready to discuss their AI use openly. If yours resists, here are some approaches that balance respecting their space and staying involved:
Give them time and space. Let them know you’re available when they’re ready, without forcing the conversation.
Offer information without pressure. Share what you’ve learned about AI therapy and let them know you’re happy to discuss it if they’re interested. The goal is keeping the door open, not pushing them through it.
Frame it as care, not control. Make it clear your concern is about their safety and wellbeing, not monitoring or restricting them.
Seek guidance if you’re worried. If you’re seriously concerned about their safety, consult with a mental health professional about next steps.
Your teen may need time to process before they’re ready to talk. Patience and availability often work better than persistence.
If AI therapy is part of your teen’s mental health support, here are some guidelines to discuss together.
Information sharing limits. Your teen should never share their full name, address, school name, identifying photos, or financial information unless you’ve thoroughly vetted the platform together.
Supplement, don’t replace. AI should complement other support (friends, family, school counselors, therapists), not replace human connection entirely.
Regular check-ins. Schedule monthly conversations about how it's working. Is it helping? Making things feel worse? Time to adjust?
Escalation protocol. Agree in advance on situations where your teen will talk to you or another trusted adult instead of relying solely on AI: feeling unsafe, persistent symptoms, crisis moments.
Time limits. A middle-of-the-night session can interfere with sleep and create over-reliance. Consider agreeing on reasonable usage patterns together.
Think of AI as one tool in a larger mental health toolbox. It works best when combined with:
Strong relationships: Connection with family, friends, and trusted adults remains the foundation of teen mental health
Professional support when needed: For persistent concerns, a licensed therapist can provide personalized, evidence-based treatment
Physical health basics: Sleep, nutrition, exercise, and balancing screen time all significantly impact mental wellbeing
Coping skills practice: AI can teach techniques, but your teen needs opportunities to practice them in real life
School-based support: School counselors and support programs offer another layer of help
Cultural and community connections: For many teens, connection to their cultural community, faith tradition, or identity-affirming spaces is an important part of mental health support
We’re the first generation of parents and guardians navigating teenage mental health in the age of AI. There’s no playbook for this, and that’s okay. Your role isn’t to forbid or control every aspect of your teen’s mental health journey—that’s neither realistic nor healthy. Your role is to:
Stay educated about what they're using and why
Help them evaluate tools critically and use them safely
Keep communication channels open
Recognize when professional help is necessary and help them access it
Model healthy attitudes toward mental health and seeking support
Most importantly, your teen needs to know that no matter what tools they use, you're there for them. You're interested, supportive, and willing to help them find whatever support they need. No AI can replace that.
If you’re looking for more comprehensive resources, Project Healthy Minds offers a directory of mental health services, including support groups specifically designed for young people.
You’re navigating uncharted territory, but you’re not alone. Thousands of parents and guardians are asking the same questions, learning as they go, and discovering that staying curious and connected matters more than getting everything right.
You don’t need to be a tech expert or mental health professional to support your teen through this. You can stay curious, keep communication open, and help them build a support system that combines technology and trusted people.
That’s something AI will never replace, and it’s what your teen needs from you most.
This content is for informational purposes only and is not a substitute for professional advice, diagnosis, or treatment. If you're in crisis, visit our Crisis Services page or call 988 for the Suicide & Crisis Lifeline. Call 911 if you're in immediate danger.
Project Healthy Minds does not endorse any specific services or providers.