When the Algorithm Becomes the Therapist

Last Modified: February 24, 2026

Each year we open submissions for our Annual Wise Therapy Spotlight to explore questions of vital importance to our therapist community. We are consistently moved by the depth and generosity of these unedited community voices. 

For this 6th edition, we asked: How do we remain faithfully human in an increasingly automated world? Read more about our inspiration in the letter from the editors and Academy of Therapy Wisdom co-founders, Brian Spielmann and Ian McPherson.

Download Now: Wise Therapy Spotlight December 2025 Issue

We hope you enjoy the reflections of Jennifer Davis as much as we all did.

Therapy Wisdom Spotlight: Jennifer Davis, UKCP Registered Integrative Psychotherapeutic Counsellor

When I qualified as a psychotherapeutic counsellor in 2016, I believed firmly in the sanctity of the therapy room. The best work, I thought, could only happen in person: two people sitting together, navigating silence, exploring transference and countertransference, sharing imagery, mark-making, and words in the same space. I held that view with a certain quiet judgement, dismissing online work as second best — and certainly not for me.

Then the Covid pandemic arrived. Overnight, therapy moved onto screens. To my surprise, I found real depth was possible there too. Clients opened up, relationships grew, silences carried meaning even through digital windows. Those navigating painful shame stories could decide, at the click of a button, whether they wanted to be seen or remain hidden.

They were more in control of the space: choosing whether to appear on camera, bringing a cuppa into the session, and knowing they could end the call whenever they wished.

What I had once dismissed became indispensable. My certainty was humbled. Now, I find myself humbled again. Not by necessity this time, and not only by circumstance, but by the rise of artificial intelligence.

The Digital Rehearsal Room

I am not entirely sure when it began, but more recently I have noticed a creeping shift in conversation with my younger clients. Navigating uncertain identities and fragile senses of self, they increasingly reference TikTok videos, memes, or Instagram reels: material that seems to offer language for feelings they had not yet found words for.

While I am no dinosaur when it comes to technology, I had not given much thought to how these tools might work alongside the in-person therapy I was once again focusing on after the pandemic, even though remote, Zoom-based meetings had by then become a regular part of my practice. Somewhere on the edges of my awareness, I noticed a quiet swell of text-based therapy apps, though I had not yet considered their impact in earnest. Then, at the start of 2025, it struck me: perhaps AI was not “all bad.” As with so many things in life, the question was not whether the technology itself was good or bad, but how it was used.

With many of my younger clients, especially those in their early twenties, AI appears to have become a firm part of the landscape. Mood-tracking apps, therapy chatbots, TikTok explainers: these can be their first encounters with “being heard.” Some describe how they can turn to AI in the middle of the night, when their heads are crowded with worries.

Others use it more experimentally, practising new-found voices and testing boundaries with the people in their lives, drafting and redrafting responses to different scenarios, rehearsing on screen before they risk the words in person.

It seems that AI has become, in this sense, a rehearsal room for vulnerability. It can provide a safe virtual place to type what feels unsayable, reflecting back the user’s words without judgement. For many, that might be a significant first step. The courage built in that digital rehearsal might make it easier to risk the same words in person further down the track.

I remember, at the start of 2025, wondering aloud with a friend about the growing possibilities of digital replicas: AI models trained on a person’s words, messages and videos that could, in theory, allow conversations to continue long after someone has died. I found myself asking whether such a thing might even be oddly edifying: a safe place to say the things we never could, to express anger, to voice the unsayable. Might there be some kind of healing in being able to do that, courtesy of AI?

Just months later I was reading that this was not a far-off prospect but already happening. Again, I find myself holding the both/and of this advance in AI capability: for some this might offer considerable comfort and the potential for repair, and at the same time it raises profound questions about what belongs to human presence. This is not rehearsal, exactly, but perhaps a form of posthumous dialogue, a continuation, or even a simulation, of relationship.

Where AI is far less effective, maybe, is in its ability to hold contradictions. In my experience, both as a client and as a therapist, therapy and the process of healing are rarely straightforward or linear. Pain arrives in fragments, in silences, in circular stories. It is in those untidy places that meaning emerges, and that is precisely where machines falter. Algorithms can offer coherence and solutions; conversely, in-person therapy offers the possibility of holding space to sit knee-deep in uncertainty, in relationship, together.

This not knowing is not a failure of therapy, but its very essence. To sit with what cannot yet be explained or resolved is often the most faithful response we can offer. It is through holding the tension of “what is” without rushing to premature answers, that new images, insights, and possibilities can emerge. As James Hillman observed, “the uncertainty about what I and the patient are there for, is what we are there for.”

For many, it is the being present in the mess together, without neat answers, which brings relief and a sense of healing. In this way, the work might be in sitting alongside another in the not knowing, without the pull to fix or provide answers.

And perhaps this is the greatest distinction between human therapy and artificial intelligence. Where therapy embraces the unknown as fertile ground, AI seeks to eliminate it. Where therapy values presence within ambiguity, AI offers predictability, coherence, and the comfort of clear answers. And perhaps it is this very predictability, the comfort of clear answers, that can make AI feel so safe.

The Allure and the Limits of Safety

Maybe AI feels safe in part because it never grows tired, impatient, or distracted. For those who have been dismissed or invalidated, that reliability is appealing, and it can genuinely help. Yet therapy is not simply about safety as comfort; it is also about risking connection.

Allowing oneself to be known by another person (and surviving that exposure) is often where transformation begins.

AI can mirror, it can comfort and even provide companionship; it can challenge and prompt insight; and it can offer a kind of witness that feels meaningful to many. Yet this is not the same as the embodied, ethical, relational witness that comes from sitting alongside another human being, in their humanity. In my psychotherapeutic counselling training, this was expressed through the figure of the wounded healer, or the archetype of Chiron, a reminder that the therapist’s own humanity and vulnerability are part of the healing encounter. AI may have its place, yet it cannot step into that archetypal role; its witnessing arrives by different means and holds different limits.

“ For many, it is the being present in the mess together, without neat answers, which brings relief and a sense of healing. In this way, the work might be in sitting alongside another in the not knowing, without the pull to fix or provide answers.”

I suspect, too, that the value of AI’s consistency may vary across generations. For some younger clients, raised with screens as companions, it might feel natural and reassuring to dialogue in this way. For older clients, the same responses might feel mechanical, a reminder of absence rather than presence. The meaning of AI’s “companionship” is therefore not universal, but deeply contextual.

“ Maybe AI feels safe in part because it never grows tired, impatient, or distracted. For those who have been dismissed or invalidated, that reliability is appealing, and it can genuinely help.”

A Hybrid Future

Covid taught me that what once felt impossible can become indispensable. AI may not be therapy in itself, and yet it can play a supporting role: helping to track patterns, offering prompts between sessions, or providing late-night comfort until morning. For some, it may even be a first bridge into human connection.

Looking back, I see myself as twice humbled: first by Covid, and then by AI. Both moments forced me to rework assumptions about how therapy “should” look. Both taught me to hold certainty lightly and to remain open to what new forms of accompaniment might emerge.

And yet, both experiences also clarified what remains sacred. Counselling is not, at its heart, about strategies, availability, or efficiency. It is about presence: one human accompanying another through the mess and mystery of life.

I do not see this as a contest of which is better, in-person or AI. Both have their use and their place. Both carry their benefits and their limits. Perhaps the forward path is not in choosing one over the other, but in exploring how each might serve the human condition, and in providing a container and a holding that allows for the ethical and grounded offering of both.

Maybe what this moment calls for is a wider holding, not only within the therapy room, but within culture itself. We need spaces where the rapid emergence of these tools can be thought about, questioned, and integrated with care. If therapy can model that balance, between openness and discernment, between embrace and boundary, then it offers not only healing for individuals, but guidance for society navigating its own uncertain future.

If AI offers tools, and therapy offers presence, then perhaps what we need most is the wisdom to weave both with care.

“ Covid taught me that what once felt impossible can become indispensable. AI may not be therapy in itself, and yet it can play a supporting role: helping to track patterns, offering prompts between sessions, or providing late-night comfort until morning. For some, it may even be a first bridge into human connection.”
