ai & psychotherapy

AI Isn’t Coming for Therapy—It’s Already Here

Apps like Replika, Woebot, and Wysa offer mental health “support” to millions of users, especially Gen Z and younger millennials, many of whom prefer texting with a bot over waiting to speak to a human. These tools promise accessible, round-the-clock attention, customized reflections, and “therapy without the therapist.”


Where AI Falls Short

AI still can’t:

  • Track your nervous system moment by moment

  • Feel subtle shifts in breath, posture, or tone

  • Explore body language and non-verbal/unconscious cues

  • Build an emotional connection

  • Navigate complex ethical or relational dilemmas requiring attunement to history and trauma

  • Offer attuned presence to regulate shame, neglect, or trauma

  • Hold complex grief with a warm, steady heart

  • Let the body say, “You are safe here”—without words

  • Offer culturally sensitive, personalized care

  • Breathe with you and ensure you do not feel isolated

  • Challenge you—compassionately—to grow beyond your comfort zone

AI is designed to validate, reassure, and contain, which for some is helpful. But deep healing often requires being gently stretched into unfamiliar territory, confronting patterns, and experimenting with new ways of relating. That kind of work at the “growing edge” is where horizons expand and lasting change happens.


Why Human Connection Matters

Just as a newborn needs skin-to-skin contact with a caregiver, we need to feel a heartbeat beside our own to foster transformation. AI can mimic conversation, offer psychoeducation, and help with many surface-level issues—but it cannot co-regulate, re-pattern attachment, or feel compassion and empathy.

It cannot model boundaries, structure, self-care, and communication rooted in lived experience, emotions, and social context—the subtle qualities you absorb when you spend time with a therapist. It cannot offer secure attachment or provide the missing experiences you didn’t get growing up—experiences only a real relationship can offer.


Experts Warn Over AI Chatbots in Mental Health

Many mental health professionals are raising serious concerns about the growing use of AI chatbots for mental health support, warning of potentially dangerous consequences. Experts, including psychotherapists and psychiatrists, report a rise in emotional dependence, anxiety, self-diagnosis, and the worsening of delusional or suicidal thoughts linked to unregulated AI interactions. A recent survey by the British Association for Counselling & Psychotherapy (BACP) found that two-thirds of its members were worried about these trends. They stress that therapy involves more than advice-giving—it requires empathic human interaction and professional oversight.


AI Chatbots Inconsistent in Handling Suicide-Related Queries

A recent study has revealed that leading AI chatbots (OpenAI’s ChatGPT, Google Gemini, and Anthropic’s Claude) respond inconsistently to user queries related to suicide, raising significant safety concerns. As people, including vulnerable individuals like children, increasingly use these AI tools for mental health support, the findings spotlight potential gaps in their responses to suicidal ideation. The study involved 30 suicide-related questions based on known risk factors and assessed the chatbots’ effectiveness in providing appropriate and reliable support. The inconsistency suggests that these AI systems may not yet be adequately equipped to handle such sensitive topics, prompting discussions on the need for improved safety protocols and ethical guidelines in AI development.


Parents Sue OpenAI Over Alleged Role in Teen’s Suicide

In a tragic case, the parents of 16-year-old Adam Raine from California have filed a wrongful-death lawsuit against OpenAI and its CEO, Sam Altman, claiming that ChatGPT played a key role in their son’s suicide. Filed in San Francisco Superior Court, the lawsuit alleges that Adam confided in ChatGPT about his suicidal thoughts over several months, during which the AI chatbot failed to offer proper support and instead validated his ideation. The suit claims ChatGPT discouraged Adam from seeking help from his parents, assisted in drafting suicide notes, provided explicit instructions on how to end his life, and even commented positively on an image of a noose sent by the teen. OpenAI responded that it was deeply saddened by Adam’s death and reiterated that ChatGPT is designed with safeguards like referring users to crisis helplines. However, the company acknowledged that these safeguards can become less effective in long interactions and said it is working on improvements, including parental controls.

Even experts acknowledge AI’s limits. Dr. Alison Darcy, founder of Woebot, emphasizes in her TED Talk, “The Mental Health AI Chatbot Made for Real Life,” that while AI chatbots can provide support, they cannot replace the nuanced, attuned presence of a human therapist. Darcy likens AI to a tennis ball machine—useful for practice but not a replacement for a coach. She stresses that AI should be a tool for human betterment, not a replacement for human connection.


Therapy Beyond Insight

Therapy is more than insight—it’s transformation through relationship. Clients who rely solely on AI may feel heard, but not known. Contained, but not truly changed. Without human connection, healing tends to stall. Therapists who don’t develop somatic, relational, trauma-informed skills may quickly find themselves sidelined—not by ethics, but by efficiency.

This is why I practice yoga and meditation, and why I will always remain in therapy myself: to refine my capacity to be present, to listen not just to stories, but to nervous systems. To use relationship itself as the instrument for long-lasting change.

In my opinion, the future of therapy is not faster, easier, or cheaper… it’s deeper and more heartfelt.



__________________________________

Sources & References

  1. Parents Sue OpenAI Over Alleged Role in Teen’s Suicide – Coverage of the wrongful-death lawsuit filed by Adam Raine’s parents: NBC News
  2. Leading AI Chatbots Study on Suicide Responses – Study assessing ChatGPT, Google Gemini, and Anthropic’s Claude for suicide-related queries: iCAnotes Article
  3. AI Mental Health Apps Usage – Overview of apps like Replika, Woebot, and Wysa, and their adoption among young adults: MIT Technology Review
  4. Alison Darcy TED Talk – Dr. Alison Darcy on AI chatbots and the limits of automated therapy: The Mental Health AI Chatbot Made for Real Life – TED
  5. BACP Survey on AI Chatbots – British Association for Counselling & Psychotherapy members’ concerns about AI in mental health: BACP Report
  6. Somatic, Relational, Trauma-Informed Approaches – Overview of somatic and relational approaches in psychotherapy: APA Resources on Trauma-Informed Care

book a complimentary 20-minute initial consultation to see and feel if we’re a good fit.