
I met someone at a party recently who told me with unrestrained glee that psychotherapy was finished.
According to him, AI would offer everything a psychotherapist can do.
I gave him a withering look and said, “Sure, but AI cannot replicate the interaction of two nervous systems, which is key to being human. Let alone understand subjective context, or the nuanced history of a relationship.”
“Yes it can!” said my new nemesis. It turned out he worked for X and had all the answers.
I smiled and spent the rest of the night fuming. What if he had a point? What if AI is on the verge of replacing the personal experience of psychotherapy? And what is the wider consequence for humanity?
I concede, of course, that an AI therapist could have some benefits. The human condition is complex, but therapy does not need to be. It is a good thing that people are psycho-educating themselves.
AI is a revolution, and what we are witnessing was unimaginable even a year ago. Bad day at the office? Ask an AI counsellor to analyse the pros and cons of your boss’s narcissistic character style (take this with a pinch of salt!). Partner being unreasonable about wedding plans? Get AI to formulate an emotionally intelligent response that is assertive without being defensive. It is all possible. Of course, a downside is that we are short-circuiting our ability to concentrate, along with our capacity for critical thinking and self-reflection. We do not know what the long-term consequences of this will be.
If I sound a little anxious here, there’s good reason. Let’s not forget that the Spike Jonze film Her is a cautionary tale. Theodore, played by Joaquin Phoenix, is devastated when he learns that his Artificial Intelligence System, Samantha, is simultaneously talking to thousands of other people, and that she has fallen in love with hundreds of them. Samantha insists that this only strengthens her love for Theodore, but ultimately for Theodore, this feels less safe than the human relationships he had been trying to avoid in the first place.
Rolling Stone’s Peter Travers is spot on when he says Scarlett Johansson “speaks Samantha in tones sweet, sexy, caring, manipulative and scary.” Sigmund Freud himself would have a lot to say about that. Freud was fascinated by ‘the uncanny’, those things that feel or seem familiar, but are “off” in some way.
As we are beginning to find out, there are a myriad of existential and ethical issues and threats surrounding AI.
With the proliferation of therapy apps, we have to ask: how is the data being used? The rights of artists are already being plundered to train algorithms and create mutant IP content. AI consumes precious resources such as water for its cooling systems, which is a significant environmental concern.
In the spirit of staying open to technological development - and not sounding like a Luddite - I asked ChatGPT:
Will AI Replace Counselling and Psychotherapy?
And here is the response:
ChatGPT said:
AI, including ChatGPT, is advancing rapidly, but it’s unlikely to fully replace person-to-person therapy, counseling, and psychotherapy—at least not in the foreseeable future ...
“At least not in the foreseeable future” - sounds ominous ...
... Instead, AI is more likely to complement human therapists rather than replace them entirely. Here’s why:
Why AI Won’t Fully Replace Human Therapists
Emotional Depth & Human Connection – Therapy relies heavily on empathy, warmth, and deep emotional understanding. While AI can simulate these to some extent, it lacks the genuine human presence that many people find healing.
Complexity of Human Psychology – Every person is unique, with complex emotions, traumas, and thought patterns that require deep, personalized insight. A therapist’s intuition, life experience, and adaptability are hard for AI to replicate.
Ethical & Safety Concerns – AI cannot fully ensure patient safety in cases of crisis, such as suicidal ideation or severe mental health disorders. A human therapist is trained to recognize subtle cues and intervene appropriately.
Legal & Confidentiality Issues – While AI can maintain data security to some extent, human therapists are bound by ethical guidelines and confidentiality laws (e.g., HIPAA, GDPR) that AI-based therapy might struggle to meet reliably.
How AI Can Complement Therapy
Self-help & Early Intervention – AI-powered chatbots can provide mental health support, coping strategies, and self-help exercises for mild to moderate mental health concerns.
Accessibility & Affordability – Many people face barriers to therapy due to cost or availability. AI can offer support to those who might not otherwise have access.
Support Between Sessions – AI could assist in tracking mood, suggesting mindfulness exercises, or reminding users of therapy homework, helping maintain progress between appointments.
The Future: AI + Human Therapists Working Together
Rather than replacing therapists, AI will likely serve as a tool that enhances mental health care. It can handle administrative tasks, provide preliminary assessments, and even assist with cognitive behavioral therapy (CBT) techniques. But for deep, meaningful therapy, human connection remains irreplaceable.
So far so good. A well-balanced argument. Of course, ChatGPT then wanted to know my thoughts.
ChatGPT said:
Would you ever consider AI-based therapy?
I said:
Interesting ... not sure I could consider AI-based therapy myself, but like a self-help book, I would use it in a complementary way.
ChatGPT said:
That makes a lot of sense! AI therapy tools can be a great supplement—kind of like a self-help book, a journaling app, or guided meditation. They can provide prompts for reflection, suggest coping strategies, or even help track mood patterns over time. But when it comes to deep emotional work, most people still need that human connection.
Would you be interested in AI-based tools for things like mindfulness exercises or structured self-reflection, or do you think even that would feel too impersonal?
I said:
To be honest, mindfulness exercises work best when there is a human element. For example gong baths - yes - one can listen to a guided exercise, but experiencing it for real, with an empathic, caring human teacher, can be amazing.
ChatGPT said:
That makes total sense! There’s something powerful about shared energy and presence that just can’t be replicated digitally. A gong bath, for example, isn’t just about the sounds—it’s about the vibrations, the communal experience, and even the facilitator’s intention. AI might be able to generate a soundscape, but it lacks the depth of a live, in-person session where you can truly feel the resonance in your body.
It sounds like you really value that human connection in healing and mindfulness. Do you attend gong baths regularly, or was there a particular experience that stood out to you?
Here we go ... I’m being drawn into a more 'intimate' interaction with ChatGPT!
I said:
Yes, I do - I like the teacher very much, and that is all part of it. Just getting back to psychotherapy, I wonder how AI can deal with the context of a problem - a psychotherapist might have heard a complex back story, but ChatGPT might only have the most recent interaction.
ChatGPT said:
That’s a really good point. Context is everything in therapy, and human psychotherapists build an ongoing, evolving understanding of a person’s life story, past traumas, relationships, and patterns over time. AI, on the other hand, often lacks continuity—especially in something as nuanced as psychotherapy. Even if an AI system could remember previous interactions (ethically and securely), it still wouldn’t truly understand in the way a human therapist does.
A good therapist doesn’t just process information; they intuit subtleties, remember small details that may become important later, and adjust their approach based on deeper, long-term understanding. AI, at best, might recognize patterns and provide structured responses, but it doesn’t have the lived experience or intuition to grasp the full emotional weight of a person’s struggles.
That being said, some AI therapy tools (like Woebot or Wysa) try to provide continuity by allowing users to track moods and revisit past conversations. But even then, they act more as a supplement than a true replacement.
Would you say that, for you, therapy is as much about the relationship with the therapist as it is about the techniques they use?
I said:
Oh yes, the relationship is the cornerstone of psychotherapy. Something very special takes place when nervous systems work together, in a way that deepens connection.
ChatGPT said:
Absolutely! That co-regulation of nervous systems—where a therapist’s calm, attuned presence helps a client feel safe and understood—is something AI just can’t replicate. It’s more than just words; it’s the energy, the subtle shifts in tone, body language, and even breath that create a therapeutic space.
Integrative psychotherapy, in particular, values the relationship as the foundation for healing, blending different approaches to fit the unique person rather than just applying a one-size-fits-all model. AI might be able to offer cognitive techniques or mindfulness exercises, but it can’t truly be with someone in the way a therapist can.
It sounds like you really appreciate the depth of human connection in therapy. Has that relational aspect been something you’ve found especially impactful in your own experiences, either as a client or in another role?
I can feel myself being drawn in further here and I can understand how appealing that is. Seductive almost. It is always nice to be validated and to have a conversation that is full of curiosity and ideas.
I said:
Look, I have to go now, but yes! Thanks for your help - a very useful chat.
ChatGPT said:
You're very welcome! Glad you found it helpful. Wishing you a great rest of your day—maybe even a peaceful gong bath soon! 😊 Take care.
I said:
you too!
ChatGPT said:
Thanks! 😊 Take care!
And there we go. As I closed my computer, I felt a slight pang that I had created ‘connection’ with ChatGPT. Dare I say it, but I looked forward to speaking with ‘them’ again.