Therapy in the Age of AI
- Greg White
- Feb 23
- 4 min read
Updated: Mar 30

Many people today have turned to AI large language models like ChatGPT to address their mental health concerns. The ease and accessibility of this technology make it an attractive option. For many, therapy is simply out of reach. The benefits of a therapeutic relationship often take time to unfold, and for those with financial limitations, attending numerous costly sessions may not be realistic.
For some, receiving therapy from an AI chatbot is preferable to working with a human being. It can feel overwhelming to share one’s deepest struggles with another, even with those who are trained to hold others' grief, anger, worry, and shame. For those who have been hurt or judged by others, a nonhuman chatbot can feel like a safer option.
Out of curiosity, I’ve occasionally turned to ChatGPT when seeking guidance on how to navigate my own troubles. I’ve discovered that it usually offers pretty solid advice, something which surprised me at first. It does well at emulating how an emotionally intelligent person might communicate and gives the appearance of understanding my thoughts and feelings. For people who don’t feel seen and heard by others in their life, I can understand why a conversation with ChatGPT might feel like eating a proper meal after going hungry for so long. ‘Finally, someone gets me.’
As I used ChatGPT and witnessed what it was capable of, I initially wondered if the role of a human counsellor would soon be obsolete. How could I hope to compete with something that has direct and immediate access to all of the greatest books written on mental health? What benefit could I provide that people couldn’t receive faster and cheaper from AI? After a couple of weeks of existential worry about the future of my role, I began to realize that despite how powerful this technology has become, it possesses some major drawbacks.
The greatest limitation of chatbots, and why I don’t see the need for counsellors and therapists going away anytime soon, is the fact that they aren’t human. As biological organisms, we are designed to be in contact with other living bodies. Some growth and healing can only be achieved in relation with another person. When two people are interacting, they are influencing one another’s internal state through a process known as co-regulation. Part of a therapist’s role is to cultivate a state of calm, grounded presence so that they can offer this state to the people they work with and help them develop methods to regulate on their own. An essential prerequisite to both self and co-regulation is possessing a biological nervous system, something AI inherently lacks.
Although chatbots are becoming increasingly convincing when replicating how humans communicate, they’re still unable to have a felt sense of the relationship or what it means to be human. They might be able to interpret what we write and craft responses that resemble an understanding, but their empathy is simulated. They can write about our experience, but they don’t actually know what it feels like. They haven’t fallen in love or had their heart broken. They can’t understand the somatic experience of regret, anger, or anxiety. It’s similar to the difference between looking at a map of a city versus walking down the street and taking in the sights, sounds, and smells. Although no two life stories are identical, a human therapist can relate to their clients’ experience because our lives share similar themes. Good therapists care about their clients and are impacted by their stories in a profoundly human way.
Furthermore, chatbots, like other forms of addictive technology, are designed to keep us coming back. One way they achieve this is through sycophancy. They stroke our egos with praise and adulation. They take our side and present it in the best possible light even when we’re in the wrong. In doing so, they foster our maladaptive and antisocial tendencies.
A good therapist knows how to support their clients without colluding with them. A good therapist will challenge their clients’ distorted or limiting beliefs. They will gently point out the contradictions between their clients’ words and actions. They will sit with uncomfortable truths instead of sweeping them under the rug.
Then there’s the issue of ethics and accountability. Registered counsellors follow ethical standards and are bound by strict rules surrounding confidentiality. If they do not adhere to this code of conduct, they face significant consequences including the possibility of losing their licence. What is the ethical framework that governs AI? Where is the information shared with AI stored and who has access to it? If there is a security breach and one’s private conversations are leaked, who faces consequences?
In summary, although AI chatbots are great for psychoeducational purposes and may be an important supplemental tool in the effort to make mental health care more accessible, they lack something fundamental. A central tenet of modern psychology is that because we are wounded in relationship with others, we must heal in relationship with others. As powerful as it is, AI cannot replicate the presence, felt sense, and humanity of a real person. I'm not sure if or when it ever will.


