My Therapist Is Named ChatGPT: How AI Supports Mental Health

December 13, 2025

[Article published in the autumn issue of têtu· magazine, and available with a subscription.] For Mental Health Day, a look at a growing phenomenon: the use of artificial intelligence (AI) chatbots – such as ChatGPT or Mistral – to cope with psychological difficulties. A practice that professionals do not simply brush aside.

Illustration: Call me George(s) for têtu·

“Mirror, mirror on the wall, am I the maddest of them all?” How many of us have confided our inner torments to our favourite artificial intelligence (AI)? “I regularly use ChatGPT to manage my anxiety attacks,” confides Gwen, 22. “I’m seeing a therapist, but obviously she isn’t available all the time. And I’ve noticed, even though I’m well supported, that very often people don’t know how to respond to others’ distress, and sometimes do more harm than good. I’d rather manage with ChatGPT as support. So far, it has worked really well!”

Obviously, at first glance, handing over one’s psychological troubles to a computer tool built on a mathematical model sounds like a bad idea, especially to doctors. We have all been amply warned against online self-diagnosis, imagining cancers three times a year after hours spent browsing hypochondriacs’ forums rather than making an appointment with our general practitioner. But if we agree to suspend the initial panic triggered by any technological innovation – and to momentarily set aside its ecological cost – can AI really do something for our mental well-being? No hasty conclusions, specialists agree.

It feels good to talk

Clara Falala-Séchet, a clinical psychologist and psychotherapist, considers that AI can, under certain conditions, prove a useful ally in the service of well-being; she co-founded Owlie, a chatbot (conversational agent) for psychological support. For people who have no professional support, she argues, turning to artificial intelligence can help clear the ground: “A significant stigma remains around mental health issues. Many people don’t know how to approach their difficulties and don’t dare to seek help, or even feel ashamed. AI can then provide empathetic validation that reduces overall suffering, and offer information, possible solutions, or even action plans.”

A first stage of listening, in short, which provides temporary emotional relief and can reveal the need for professional follow-up. AI can also prove strikingly effective at identifying critical situations, explains Clara Falala-Séchet: “Studies show that in certain contexts, AIs are almost as good as humans, if not better, at detecting suicidal risk or the risk of progression to complex trauma in a crisis situation.” They can then alert the person to the severity of their situation and direct them toward appropriate resources. But as for replacing therapists, she is cautious: “This fear does not seem warranted to me. AI does not have the analytical finesse of a human, and it does not grasp everything at play in non-verbal communication.” The machine, trained to please, also tends to tell us what we want to hear, which is not the aim of a psychotherapeutic process. A view shared by Morgane Chevalier, clinical psychologist in Cherbourg-en-Cotentin: “Therapy asks broad questions, highlights certain contradictions, and draws parallels with other things that have been said. Today, ChatGPT is not a therapist. It’s a tool, not a solution.” AI should therefore be considered a crutch rather than a doctor.

Breathing exercises

Our experts do recognize that ChatGPT and its peers can offer a useful complement between two sessions, which are often at least two weeks apart. “It can, for example, suggest concrete exercises to regulate one’s emotions,” notes Clara Falala-Séchet. “It’s one resource among others, which can contribute both to solving specific problems and to psychoeducation.” Julie, 29, who suffers from anxiety attacks, testifies: “When I have an attack, ChatGPT walks me through very effective breathing and grounding exercises.”

AI can prove particularly useful for people whose disorders complicate communication. Morgane Chevalier mentions the case of patients living with borderline personality disorder: “With a way of functioning marked by a great fear of abandonment and a strong reactivity to feelings of rejection, people can have very intense reactions. I sometimes advise them to have the AI re-read the messages they receive, to get a more objective interpretation, or the messages they intend to send, so as to perhaps be less blunt.” A function also valuable for some autistic people, who may read social interactions differently: “Sometimes I tell it about social situations and interactions I experienced during the day so that it helps me understand them,” says Tom, 27.

Limitations to consider

Without demonizing AI, it is nevertheless necessary to underline its limits and risks, and to maintain a critical distance in using it. For example, to avoid overly vague or normative responses, one must know how to craft relevant prompts. And, above all, never forget that it is a fallible tool. “AIs also make a lot of mistakes,” notes Clara Falala-Séchet. “It is therefore essential to always verify what they say and make sure it fits the specific context.” Furthermore, relying on these immediate responses, programmed to be benevolent, can also reinforce certain psychological patterns in people already prone to a perpetual need for reassurance, or to an addiction to screens as a source of comfort. A potential perverse effect to bear in mind, insists Morgane Chevalier, “whereas in therapy the aim is to empower patients to face their difficulties”.

For all these reasons, Clara Falala-Séchet advocates a policy of informing and guiding health professionals and the public in the use of AI: “As always in complex situations, we must inform about the risks and take a gradual and humble approach. Observe with a critical mindset.” And all this without losing sight of the fact that nothing equals the quality of a human exchange.


Sophie Brennan

I’m Sophie Brennan, an Australian journalist passionate about LGBTQ+ storytelling and community reporting. I write to amplify the voices and experiences that often go unheard, blending empathy with a sharp eye for social issues. Through my work at Yarns Heal, I hope to spark conversations that bring us closer and help our community feel truly seen.