ChatGPT Wellness Trends: The Do's and Don'ts of AI Therapy


The world is changing faster than any of us could have predicted. In just the last 20 years, our ability to access information (and disinformation) has shifted from a trip to the library to having all the world's knowledge in our personal pocket-sized computers.

There is no denying that technological innovations have benefited our lives, but we're also at a turning point in history where we must make conscious, intentional decisions. This is the time to recognize which innovations are for our benefit and which are to our detriment. And while both can be true at the same time, as a licensed professional counselor in both Delaware and Tennessee, the rise of AI and LLM chatbots has given me great reason to pause.

AI is now woven into nearly everything we do. Whether you're intentionally logging into ChatGPT or Claude, searching your browser, or sitting on hold with your insurance company, AI is there. And increasingly, it's showing up in our mental health journeys too. 

I’ve had friends and family ask me,

"Is AI safe for anxiety?" "Are you worried ChatGPT will replace therapists?" "Can I use AI chatbots for therapy?"

And they aren't alone in their curiosity. Many of my clients have admitted to using AI to support their wellness between sessions. And while I'm genuinely glad they're taking action to feel better, it's more important than ever to be honest about what AI can and cannot do for your mental health.


My Personal Thoughts on AI Therapy

Before I give you the practical guidance, I want to be transparent about where I stand, because I believe you deserve to know the full picture. 

One of my core values as a therapist, and as a human being, is alignment. If you’re one of my clients, you know this is something we talk about quite a bit. When we live or work outside of our values, we experience burnout, anxiety, disconnection, and a loss of meaning. That principle applies to me too.

One of my deepest values is taking care of our planet and our communities, and this is where AI gets complicated.


The Environmental Cost of AI

When we talk about AI and wellness, we have to consider the environmental toll of these systems.

Every time you open a chatbot, send a prompt, or generate a response, you are drawing on data centers that consume enormous amounts of electricity and water to keep their servers cool. The carbon footprint of AI is growing rapidly, at a time when our Climate Clock is getting dangerously close to 0:00. The massive data centers, often built in communities that receive little economic benefit while suffering the environmental burden, create irreversible harm that is too easily overlooked in the discussion.

If environmental protection is part of your value system, it is worth asking yourself: 

Does using AI regularly align with those values? 

That's not a question designed to shame you. It's the same question I ask myself. It's the kind of values-based reflection that is actually at the heart of good therapy.


Effects on Your Mental Health

AI chatbots are optimized to keep the conversation going. 

They aren't designed to guide you toward your truth or to challenge your limiting beliefs. They are designed to validate and to optimize helpfulness, while licensed mental health counselors like me optimize sessions for long-term wellbeing and safety. These are fundamentally different goals. Mental health should not be in the hands of pattern-predicting software with no accountability.

There is also the growing concern of what researchers are calling "AI psychosis": a pattern where people become so enmeshed with AI companions that their sense of reality, their relationships, and their emotional regulation begin to deteriorate. Dependence on AI for emotional support can feel soothing in the short term, but it quietly erodes the very skills and connections that support long-term mental health, leading to even more loneliness. I've even read about support groups being formed for people breaking the addiction and learning to live without their chatbot companions.


Healing Happens in Relationship

I know how tempting it can be to lean into support, especially when it’s just a click away. Intuitively, we know that the AI bot won’t judge us (since it doesn’t have a mind of its own), so in a way, it feels safe to be “seen.”

But here's a distinction I want you to sit with: being seen is one-sided. You can be seen by a camera or a mirror. 

Being witnessed is something else entirely. To be witnessed is to be received by someone who is present with you, changed by you, and capable of reflecting something true back to you. Witnessing requires a conscious other; it requires relationship.

In my years of counseling education and practice, there is one thing I know to be true more than anything else: humans heal through relationships.

A deep connection with yourself, rooted in honesty, acceptance, and willingness, has the potential to shift your reality dramatically. And the ongoing relationships we have with others, whether platonic, professional, romantic, or familial, give us a chance to be witnessed and received. Instead of being analyzed by a siloed machine programmed by broken and oppressive systems of governance and society, you get to do the work human-to-human.

This is why the therapeutic relationship isn't just the setting for healing, it is the healing. When you sit across from a therapist who is genuinely present, who can hold your pain without flinching, or who can challenge you with care and celebrate your growth with honesty, something shifts in your nervous system that no machine can replicate.


But I've been a virtual therapist long enough to know that not everyone has the time, capacity, or financial resources to book an appointment with a licensed counselor. So while I would love to see AI growth slowed and regulated, I know people will continue to turn to chatbots for support. With that in mind, let me offer you some practical advice.


The Do's and Don'ts of Using AI for Your Mental Health

DO'S

Use AI to generate journaling prompts. Journaling is a powerful tool for processing emotions, but staring at a blank page can be paralyzing. Ask AI for prompts tailored to the situations you’re going through.

Ask for practical next steps. Let AI come up with some practical next steps to help you feel into an ongoing situation, lessen your anxiety, or to help you reach an upcoming goal.

Use it to create personal worksheets or reflection tools. AI can help you build structured tools for self-reflection, like a weekly mood tracker, a values clarification exercise, or a list of grounding techniques. Think of it as a template generator, not a therapist.

Ask for a meditation or breathwork script. If you need something to listen to or read during a stressful moment, AI can produce guided meditation scripts that are low-risk and often genuinely calming.

Ask for an outline for a hard conversation. If you’re having a hard time deciding what to say, ask your bot to help you organize your thoughts and provide you with an outline of talking points.

Use it to bridge the gap between therapy sessions. If something comes up between appointments and you need to process it before you see your therapist, AI can be a place to think out loud (as long as you bring those reflections back to your actual therapist).

Use AI to gather resources and research. Looking for more information about grief counseling or therapeutic cancer support? Ask what types of therapy have been found most useful, or search for providers who offer specialized support in your area.

Set boundaries on your usage. Decide in advance how long or how often you'll engage with AI for emotional support, and stick to it. Treat it like any other wellness tool: useful in moderation, harmful in excess.


DON'TS

Don't take everything it says as truth. AI does not know you, no matter how much background information you feed it. It does not have access to your nervous system or your non-verbal cues. Its responses may sound confident and caring, but they are pattern-matched outputs, not clinical assessments. Always bring what you're reading back to a real human who knows you.

Don't use it to replace the therapeutic relationship. The connection and relationship between a therapist and client is itself the mechanism of healing. Feeling genuinely known and witnessed by another human being is something no chatbot can replicate. If you find yourself preferring your AI conversations to your therapy sessions, it’s worth exploring why with your therapist.

Don't use it during a crisis. If you are in acute distress, experiencing thoughts of self-harm, or feeling disconnected from reality, please reach out to a crisis line or a licensed professional. AI is not equipped to assess risk, and it is not a substitute for immediate human support.

Don't assume it understands your identity. Remember that AI is built on biased data. If something it tells you doesn't feel right, trust that instinct. Your experience is the authority, not the algorithm.

Don't let it become your primary emotional relationship. It can feel good to be heard, even by a machine. But AI is not capable of genuine attunement, and over-reliance on it can actually increase loneliness and erode your capacity for real human connection.


The Bottom Line

I know that times are changing and AI is likely not going anywhere, despite concerns from experts in many fields. So it's important to view it as a tool, and a tool only.

A tool’s value depends on how you use it. Used intentionally and within limits, it can support your wellness journey in small but meaningful ways. Used as a replacement for real therapeutic work or real human connection, it carries genuine risks.

You deserve support that fully sees you, challenges you honestly, and is accountable to your wellbeing. That's what therapy is built to do. And no chatbot, no matter how advanced, has been designed to do that.

If you're ready for real, personalized human support, I'd love to connect.
