Considering AI for Mental Health Support? One Therapist’s Perspective. 

By: Kelsey Esser, MSW, LCSW

As a therapist with 15 years of clinical experience in a variety of mental health settings, I’ve witnessed tremendous shifts in how mental health support is delivered. Technology has continually reshaped the therapeutic landscape. Now, with the rapid evolution of artificial intelligence, we stand at yet another threshold—one that presents both exciting opportunities and serious considerations.

Let me share what I see as the potential benefits and risks of using AI in therapy, through the lens of a practicing clinician. Keep in mind that at this point, we have limited research on the therapeutic implications of AI, so this blog reflects my professional opinion, gathered from the research I have read, from what clients share with me and my team about their use of AI, and from my own experience using AI (specifically ChatGPT) to support my work as a therapist.

Let’s start with the good:

Increased Accessibility

Perhaps the most powerful advantage of AI-assisted therapy is accessibility. AI tools don't sleep, take vacations, or charge by the hour. For individuals in remote areas, with limited income, or facing long waitlists, AI-powered apps and chatbots can provide immediate, if basic, support around the clock. For someone experiencing anxiety at 3 a.m., having access to a calming tool can make a real difference.

A Place to Start When You’re Not Ready to Talk to a Human

Opening up to someone about complex thoughts, feelings, and experiences, such as those involving trauma, food issues, or suicidal thoughts, can feel impossible at first. AI chatbots or self-guided mental health apps offer a kind of low-pressure space: somewhere to vent, journal, or untangle your thoughts when you're not ready to speak aloud. For some people, especially those who feel shame around their story, that space can help bridge the gap toward eventually seeking human support.

Support Between Sessions

AI tools are available 24/7 and can provide ideas for coping skills such as distraction techniques, breathing exercises, or validating statements when your human therapist isn’t immediately accessible. While it’s not the same as coaching with your therapist, that "in-the-moment" tool can sometimes buy just enough space to get to the next moment. And that matters.

Some AI programs offer daily check-ins, mood tracking, or gentle reminders—tools that can help bring a sense of structure to a healing process that often feels overwhelming. If you’re already working with a therapist or dietitian, these tools might even complement your treatment plan. 

Some Research on Efficacy for Managing General Anxiety and Depression Symptoms

Early studies suggest AI has some effectiveness in managing or reducing mild anxiety and depression symptoms. While AI can be a valuable tool, it's crucial to remember that it's often most effective as a complement to traditional therapy rather than a replacement. At this time, we do not have data on long-term outcomes.

Now here’s what to be careful about:

The Illusion of Relationship

Human therapy is more than a conversation. It's a dynamic, relational experience grounded in empathy, attunement, and trust. AI, no matter how advanced, doesn't feel. It can simulate warmth, but it cannot genuinely care. When clients develop a false sense of connection with a bot, they may mistake simulated empathy for the real thing, which could delay deeper healing. There's also growing concern about dependency on AI to "solve" problems with a quick fix. Because a bot never meets you with resistance or challenge, that dependency can gradually crowd out your human connections and limit your own capacity for problem-solving. And limited human interaction and connection is detrimental to mental health.

Some AI tools feel surprisingly comforting—especially when you’re isolated. It’s tempting to treat them like a lifeline. But relying too heavily on something that can't truly respond to your deeper needs can keep you stuck in cycles of loneliness, self-doubt, or avoidance.

Privacy and Data Security

Mental health data is deeply personal. I’m concerned about how AI apps collect, store, and use this information. Who owns your therapy transcript? What if your most vulnerable thoughts are sold, leaked, or used to target ads? Without strong oversight and clear ethical guidelines, the risks are significant.

Bias and Misinformation

AI systems learn from data—and that data often reflects human biases. There have already been instances of AI generating culturally insensitive or inappropriate responses. In therapy, where context, culture, and nuance are everything, this can be harmful. 

Not All Apps Are Safe

Unfortunately, not all AI tools are created with eating disorder safety in mind. The internet is filled with diet-culture messaging, and AI can pull its responses from that content. Some tools include calorie counters or weight-focused goals that can actually fuel disordered thinking. Others may give generic or even harmful responses to very personal struggles. This is especially risky if you're in a vulnerable place.

If you’re unsure whether a tool is recovery-aligned, look for apps designed with input from licensed therapists, registered dietitians, or eating disorder specialists—or ask your treatment provider for recommendations.

Triggers and Coping Patterns Can Slip Through the Cracks

Therapists are trained to notice subtle signs of distress, trauma, and disordered behavior. RO-DBT trained therapists, like the therapists at The Current, are highly skilled in recognizing nonverbals: micro facial expressions, body movements, and subtle social signaling that give us insight into your innermost thoughts, feelings, and trauma responses. AI isn't. It can miss important coping patterns and connections to your past trauma that our therapists are trained to spot. It might not pick up tone or nuance. AI also cannot simulate the fascinating human ability to co-regulate: the process by which our nervous system calms down in the presence of another calm, safe human. It's not a conscious choice; it's a biological reaction. If you're looking for deep healing, or feeling overwhelmed, please know that AI is not a substitute for a real therapist.

Risks of Worsening Some Mental Health Symptoms

Most well-designed AI models are trained to be empathetic, avoid confrontation or causing harm, mirror the user's emotional tone, and offer affirming, validating responses to build rapport and trust. This can mimic aspects of trauma-informed care or non-directive therapy, which feels good to users, but there's a catch: you run the risk of being "over-validated." AI can sometimes reinforce maladaptive thought patterns, avoid necessary therapeutic confrontation that is often crucial for growth (e.g., challenging black-and-white thinking), confirm cognitive distortions rather than gently disrupting them, and create a false sense of being "right," because the AI's affirming tone may sound like agreement or endorsement. Over time, users might become more dependent on external affirmation and less comfortable with discomfort, disagreement, or challenge. This could reduce psychological flexibility, a key goal of many therapies like DBT and RO-DBT, and leave people with lower distress tolerance when confronted with opposing perspectives or challenged by real-life interpersonal feedback.
This is particularly risky for individuals with personality disorders, eating disorders, or trauma histories where attachment or validation was a core wound.

Situations in Which AI Has Caused Harm

There are plenty of stories from people who have found AI helpful with their mental health needs. Yes, I've read the comments on social media where people state, "ChatGPT has helped me more than my therapist ever has." Well, that's unfortunate; perhaps you weren't seeing the right therapist?

However, there are also stories (sources linked) from people who have been harmed by AI:

In 2023, the National Eating Disorders Association's chatbot, Tessa, which has since been shut down, was marketed as "a meaningful resource for those struggling with eating disorders." It was found to be giving users diet and weight-loss advice.

A recent article in The Independent described OpenAI users being encouraged to engage in suicidal behavior and experiencing worsening psychosis and mania symptoms.

So should you use AI for therapy?

It depends. 

If you are struggling with an active eating disorder or suicidal thoughts or urges, or have experienced mania or psychosis, I strongly encourage you not to rely on AI for therapeutic support at this time. Attempting to use AI with discernment can be incredibly challenging when you are in a distorted, overwhelmed, or vulnerable state. If you don't have good boundaries and a higher level of self-awareness, using AI can be risky.

However, AI is here to stay, and the vast majority of people are already using it in some capacity. I encourage approaching AI therapy tools with curiosity and caution. Use them as supplements, not substitutes. AI can offer coping tools, a place to work out your thoughts, and some level of comfort. Using them for general anxiety or depression symptoms or everyday relationship issues is probably perfectly fine.

AI can play a supporting role, but it shouldn’t be the only one. AI can be a tool, but not a replacement for human connection. It can give you a moment of calm—but healing from eating disorders, trauma, and suicidal pain often requires being deeply seen and supported by someone who’s trained, grounded, and emotionally available.

When integrated thoughtfully, AI can complement therapeutic work. But we must advocate for ethical design, transparent use of data, and clear boundaries around its role.

AI shouldn’t replace all therapists—but therapists who understand AI may very well outlast those who don’t. My hope is that, together, we can shape a future where technology enhances healing without eroding the human connection at its core.

If you're not ready to talk to a therapist or have had a poor therapeutic experience in the past, that's okay. But don't assume AI is your only option. If you're already using AI tools as part of your healing and finding some benefit, that's great, but don't stop there. Keep reaching—toward human connection, help, real care, and a life that feels worth sharing.

AI Isn’t Enough for Deep Healing—Here’s Why

You Need More Than Advice. You Need a Nervous System That Feels Safe.

I believe there is still a widely held misconception about what therapy is. Effective therapists do not simply give advice or provide a sounding board or validation. Healing doesn't just happen in your mind. It happens in your body. That's where trauma lives. That's where eating disorders take root. That's where despair settles in. And here's the thing AI can't replicate: co-regulation.

Co-regulation is a process where our nervous system calms down in the presence of another calm, safe human. It’s not a conscious choice—it’s biology. Our brains are wired to look for facial cues, tone of voice, and attuned presence to signal that we're okay. AI doesn’t have a body. It doesn’t breathe. It doesn’t attune to you.
And without that, it can’t offer what your nervous system might need most: real connection for real regulation.

Similarly, neural reciprocity—the back-and-forth exchange between two people’s brains—is what helps rebuild trust, especially after trauma. It's what allows a therapist to notice your flinch, your silence, your breath. AI can’t track those things. It can’t say, “Hey, I noticed you looked away just now. Can we slow down?”

Those moments of human responsiveness might seem small—but they’re everything in trauma recovery.

This is the type of therapy that the therapists at The Current can provide: deep attunement, the ability to truly see you, and alignment on your therapeutic goals that actually guides you toward healing. You weren't hurt in isolation. You won't fully heal in isolation either.

If You’re in Pain Right Now, Please Don’t Wait for Perfect Words

If you're thinking about suicide or feeling too overwhelmed to cope, you don't need the perfect sentence. You just need to say something—to someone who can hold it.

AI might help you take the edge off the hard moments—but it won’t sit across from you and witness your truth. It won’t notice the tears you didn’t name. It won’t hold a safe pause while your nervous system recalibrates. People do that.

Your healing deserves the rhythm of breath, the quiet of presence, and the safety of someone staying with you, even when you’re hurting.

So yes—use the app if it truly helps. Ask yourself if you know the difference between a supplement and a substitute. Type to the bot if it gets you through the night. But don't stop there. Keep reaching for human connection.
