In 2025, more and more people are turning to Artificial Intelligence (AI) tools like ChatGPT, chatbots, and apps, thinking they might replace or support therapy – especially when “Big Tech” therapy companies advertise AI as a perk of using their services. I get it: AI is easy to talk to, (mostly) free of fees, and super accessible – it’s literally built into our phones with features like Gemini and Apple Intelligence. But when it comes to emotional healing, it’s safety and personal connection that matter most, and that’s where AI falls behind.
In this post, we’ll explore how clients can safely use AI in therapy, when to avoid it, and how therapists should – and shouldn’t – integrate AI into care. At Keystone Therapy Group, we believe technology can assist our work with our clients, but it should never replace the human heart of therapy.

What Clients Should vs Shouldn’t Do With AI in Therapy
AI is everywhere now – in chatbots, mental health apps, and mood trackers – and there’s real potential for support, education, and convenience. For many people, using AI can feel like a first step in healing: exploring anxiety symptoms, getting reminders for coping strategies, or learning more about how therapy works before trying it. But because AI lacks human empathy and relational nuance, there are important boundaries you need to honor for yourself. Using AI safely means knowing when it helps and when it could hurt. AI should support your mental health, not create confusion or distress.
What Clients Can Safely Use AI For
- Tools that support therapy, like journaling prompts, a thought organizer, or a mood tracker, to notice patterns in your anxiety, stress, or thoughts.
- Reminders for coping skills like breathing exercises, grounding tools, or scheduling self-care.
- Finding specialists by researching a local therapist or support group that specializes in what you’re going through.
- Psychoeducation to learn about different therapy models (like CBT, ACT, or EFT), or best-practice treatment options for your struggles.
What Clients Should Not Rely on AI For
- Diagnosis, treatment planning, or prescriptions – these require a therapist or psychiatrist who is specifically trained and licensed to provide them.
- Crisis intervention for suicidal thoughts, self-harm, or emergency mental health situations. AI is not equipped for safety planning or crisis response.
- Deep emotional or relational work. Processing trauma, attachment wounds, or identity questions demands human connection and can only be done safely with a responsive, trained professional who truly sees you.
- Private or sensitive disclosures of any kind. Even if you know how the AI app handles data privacy, nothing online is ever 100% safe from breach or unintended sharing.

What Therapists Should vs Shouldn’t Do With AI in Practice
Therapists increasingly face questions about integrating AI into their practice. When done well, AI can lighten the load of administrative tasks, help deliver psychoeducational content more efficiently, or provide supplemental tools between sessions, like additional resources or worksheets. But when therapists lean too heavily on AI – letting it drive clinical decisions, recording sessions to simplify note-writing, or replacing real human connection – the value of therapy diminishes. Responsible use means preserving empathy, safety, and client trust above convenience.
What Therapists Can Do Responsibly With AI
- Use AI to automate non-clinical tasks: scheduling, reminders, marketing, or emails.
- Draft psychoeducational materials like handouts or worksheets with AI, then review and customize them before sharing with clients.
- Use AI to generate reflection prompts or self-help exercises clients can use between sessions.
What Therapists Should Not Do With AI
- Let AI make clinical decisions (diagnoses, treatment plans, intervention choices) without therapist judgment. Therapists should never put their clients’ protected health information (PHI) into any AI tool or chatbot.
- Use AI to record client sessions and generate progress notes or other documentation, especially without client consent.
- Use AI to replace therapeutic presence, emotional attunement, or human listening.
- Rely on AI for crisis risk assessments or emergency mental health judgment.
- Present AI content as fact or flawless, or mislead clients about its origin or limitations.

How AI Tools Can Support (Not Replace) Real Therapy
AI can be a helpful companion to therapy, but it’s not the same as being in therapy. Think of it as a supplemental tool, not a substitute. The power of therapy comes from authentic connection, empathy, and the back-and-forth reflection that happens between client and therapist. AI is designed to mirror the language that’s put into it. It can’t replace your therapist’s understanding of your history, offer intuition about your emotions, or safely challenge your unhealthy thoughts. Instead, AI can enhance your progress between sessions.
Ways AI Can Support Your Mental Health Between Therapy Sessions
- Reflection & Journaling: You can “brain dump” your thoughts into AI and ask it to organize and summarize them. Use AI journaling apps to capture thoughts or patterns that you can then bring into your work with your therapist.
- Skill Reminders: You can use AI tools to set up reminders to practice the mindfulness, grounding, or breathing exercises you learn in therapy. Coping skills are most effective when you practice them regularly “off the court,” outside of sessions.
- Education: Chatbots or AI assistants can teach you about stress cycles, trauma responses, or relationship dynamics – giving context to what you discuss in sessions.
- Goal Tracking: AI can help you notice patterns and track goals, like mood changes or sleep improvement. This can provide you with data and trends to bring into your work with your therapist.
Why Human Therapy Is Still Essential
- Empathy and intuition can’t be coded. If you treat AI as a therapist, it will only mirror what you feed it and use pattern recognition to give you the same cookie-cutter responses others are getting online.
- AI will never actually understand you. Your therapist connects with you by tapping into their own emotional, human experiences. AI can’t do that.
- Your therapist helps you connect past and present experiences, something AI can’t interpret.
- Therapy is about relationship repair – the process of being understood by another person. Healing occurs within a connection where you feel safe, heard, and valued. AI isn’t a relationship – it’s a program.
- Real connection with a therapist helps you practice emotional safety and trust, key ingredients for healing trauma and anxiety. While AI might feel more comfortable, it can’t actually build a sense of safety within yourself or trust in others.
At Keystone Therapy Group, we blend modern tools with authentic, person-to-person care. Our human therapists guide you through deep healing work – not scripted sessions, parroted statements, or automated feedback.

Privacy & Data Risks When Using AI for Mental Health
One of the biggest concerns with AI mental health apps is data privacy. When you share personal details with an AI chatbot or app, that data may be stored, analyzed, or even sold. Unlike licensed therapists, most AI tools aren’t bound by HIPAA or strict confidentiality standards. Many users don’t realize how much personal information they’re giving up in exchange for AI’s convenience. Remember: if it’s on the internet, it’s always on the internet.
Risks of Using AI Mental Health Tools Without Safeguards
- Personal data could be shared with advertisers or third-party platforms.
- Apps may store chat history indefinitely, even after you stop using them.
- Sensitive disclosures could be exposed in data breaches (this has happened!).
- Few AI companies explain clearly how your information is encrypted or anonymized, and they may not even have those safeguards in place.
How to Protect Your Mental Health Privacy
- Read the app’s privacy policy before entering any personal information.
- Avoid sharing identifiable details like your full name, address, or specific trauma history.
- Use AI tools that clearly state HIPAA compliance or data protection standards, but stay vigilant: any data online can still be breached, leaked, or left unprotected, even if HIPAA compliant.
- When in doubt, keep vulnerable or emotional disclosures within real therapy. Yes, humans make mistakes, but therapists are highly regulated in how they use, store, and share protected health information. AI doesn’t have those safeguards or regulations in place. Sharing your sensitive information is a huge risk and should be done very intentionally.
At Keystone Therapy Group, your privacy is never automated. All sessions are confidential, whether you meet virtually or in person. Your therapist – not a tech company – safeguards your story. We don’t cut corners by using AI to record sessions or write progress notes for us.

The Future of AI in Therapy: Ethical, Relational, and Human-Centered
The future of therapy may include AI, but the heart of therapy will always be human. As technology evolves, the goal isn’t to replace therapists. It’s to enhance accessibility and support while protecting connection, ethics, and trust. Ethical use of AI in therapy focuses on empowering both clients and therapists, not automating care.
A Healthy Vision for AI in Mental Health Care
- Accessibility: AI can help clients learn and use coping skills, organize their thoughts, and search for specialized resources for their therapy needs.
- Efficiency: AI allows therapists to create personalized tools for their clients, like worksheets and journaling prompts. Therapists can also use AI to draft emails, automate scheduling, or create marketing materials.
- Collaboration: AI can assist both clients and therapists with tracking progress or supplementing treatment plans, but always under therapist supervision.
Why Ethical Oversight Matters
- Clients should know when AI is being used and what its limits are.
- Therapists must keep informed consent and transparency front and center.
- True healing depends on trust, and trust comes from people, not programs.
At Keystone Therapy Group, we believe the most powerful tool in therapy is still the human connection. We combine science-based care with authentic relationships, helping you heal from trauma, anxiety, and relationship challenges, one human conversation at a time.

Conclusion: Finding the Right Balance Between AI and Human Therapy
AI can support your mental health journey, but it can’t replace the empathy, expertise, or emotional depth of a trained therapist. When you’re struggling with trauma, anxiety, or relationship challenges, what heals most isn’t information – it’s connection.
At Keystone Therapy Group, our therapists offer evidence-based care rooted in compassion, not algorithms. Whether you meet with us virtually or in person at our Burke, VA office, you’ll find a safe, warm space to explore what’s really going on and move toward lasting change.
👉 Learn more about our approach to authentic, evidence-based therapy.
FAQs About AI in Therapy
Can AI replace my therapist?
No. AI can’t provide empathy, nuance, or human connection – all of which are essential for healing. At Keystone, we use evidence-based therapy with specialized therapists who understand your story and emotions on a human level.
Is it safe to share personal information with AI mental health apps?
Not always. Many AI apps aren’t bound by HIPAA and may store or sell user data. It’s safer to process sensitive experiences directly with your therapist, where privacy and confidentiality are guaranteed.
How can I use AI between therapy sessions?
You can use AI to journal, learn new skills, or remind yourself of coping tools, but always share what you’re exploring with your therapist to make sure it aligns with your goals.
Do Keystone therapists use AI?
Our therapists may use AI tools for scheduling or psychoeducation, but never to make treatment decisions, cut corners on our tasks, or replace therapy. We prioritize authentic connection and ethical care.
What if I’ve been using AI instead of therapy?
That’s normal – many people try self-help tools first. We can help you build insight, emotional safety, and practical strategies that AI alone can’t provide. Schedule a session to get started.