Each year we open submissions for our Annual Wise Therapy Spotlight to explore questions of vital importance to our therapist community. We are consistently moved by the depth and generosity of these unedited community voices.
For this 6th edition, we asked: How do we remain faithfully human in an increasingly automated world? Read more about our inspiration in the letter from the editors and Academy of Therapy Wisdom co-founders, Brian Spielmann and Ian McPherson.
Download Now: Wise Therapy Spotlight December 2025 Issue
We hope you enjoy the reflections of Sadaf Arafati as much as we all did.
Therapy Wisdom Spotlight: Sadaf Arafati UKCP Registered Integrative Psychotherapeutic Counsellor
The arrival of AI tools in mental health has unsettled many therapists. The insistence that “AI can never replace therapy” echoes through professional spaces with a mix of defensiveness and genuine concern. Yet in clinical work, I find that the question is less about replacement and more about recognition. AI is pressing against the edges of our practice, asking us to look at what has long remained unacknowledged: the vast emotional loneliness people live with, the failures of our systems, and the desperate desire for a space where feeling does not threaten relationship.
A Landscape of Insufficient Holding
In countries like India, where the public mental health infrastructure is chronically underfunded, the therapeutic encounter is not guaranteed. Many people move through their lives without ever encountering a reliable psychological container. The use of AI emerges within this landscape. It is not a technological fad. It is a sign that the inner world is searching for a form of holding—structure, rhythm, a sense of presence—where human support has been scarce or inconsistent. Clients describe turning to AI late at night, when their thinking becomes fragmented, or when shame makes human contact feel unbearable. What they reach for is not a “machine therapist,” but a place where their experience can take shape. The tool becomes a temporary surface for symbolisation. Much like a transitional object, it steadies the movement between what feels overwhelming inside and what might one day be spoken to another person.
“Many people move through their lives without ever encountering a reliable psychological container. The use of AI emerges within this landscape. It is not a technological fad. It is a sign that the inner world is searching for a form of holding—structure, rhythm, a sense of presence—where human support has been scarce or inconsistent.”
A Transitional Space in the Digital Realm
Alessandra Lemma’s writing on grief technology has been important to me in making sense of this. She draws attention to the ways in which technological objects mediate our relationship to loss. Her questions helped me see AI as something that can participate in the early phases of emotional processing, long before a patient arrives in the clinic. I have seen clients use AI to name affect, organise thought, and approach feelings they had previously disavowed. When the mind is dysregulated, even a consistent response can create enough internal quiet to think. This is not therapy, but it may prepare the psyche for it. The AI becomes a symbolic third: a presence that assists the mind in forming links, without entering into a fully relational field. The risk, of course, is fixation. A transitional object becomes useful when it allows movement. When the object is held too tightly, development pauses. AI can offer structure but cannot offer the psychic depth that comes from encountering the mind of another person. It does not dream, desire, misunderstand, or conflict. Without those qualities, it cannot support the deeper reorganisations that come through rupture and repair.
What AI Holds, and What It Cannot Contain
Inside the analytic frame, I have been curious about how AI might extend the patient’s reflective capacity between sessions. Tools that help track emotional patterns or articulate preconscious material can, in some cases, enhance the work. They offer a surface for projection without destabilising the frame itself. Outside the room, during moments of isolation or psychic fragmentation, clients often use AI to regain a sense of orientation. The continuous availability of the tool acts like an auxiliary ego. It reminds them that they have a thinking function, even when it feels momentarily lost. In these moments, AI provides containment without intrusion. Yet containment is not enough for transformation. Therapy relies on something far more unpredictable: the meeting of two psyches. A therapist’s countertransference, their boredom, their irritation, their tenderness—all of these form the texture of the analytic encounter. AI cannot enter this field. It cannot surprise the patient, and it cannot survive the patient’s aggression. This limits its capacity to deepen emotional life, even as it supports the early stages of reflection.
“I have seen clients use AI to name affect, organise thought, and approach feelings they had previously disavowed. When the mind is dysregulated, even a consistent response can create enough internal quiet to think. This is not therapy, but it may prepare the psyche for it. The AI becomes a symbolic third: a presence that assists the mind in forming links, without entering into a fully relational field.”
Cultural Relational Anxiety and the Turn Toward AI
When clients express comfort with AI, I hear a familiar cultural anxiety. In many communities, the expression of vulnerability is met with judgement, moralising, or premature advice. Individuals often grow up without models for emotional attunement. In adulthood, the prospect of speaking directly to another person can evoke fear, shame, or anticipatory defeat. In an unregulated mental health industry, people are often right to fear being shamed or given the wrong suggestions by a therapist. AI offers a space where these anxieties are temporarily suspended. The user feels neither evaluated nor engulfed. Rather than dismiss these interactions, I see them as traces of psychic longing. The person is rehearsing the act of address. They are testing the possibility that their interiority can be spoken aloud without rupture. AI becomes the first room. Therapy may become the second. The work then shifts from relying on a nonjudgmental digital surface to tolerating the risks of a real relationship. AI can support this progression if we understand what function it is serving, whether protection, organisation, or simple companionship, and if we help the patient move toward greater relational contact. It is also the job of the field, both in India and globally, to gain enough trust and raise enough awareness so that people feel more motivated to access us and also know how to use AI intelligently.
AI and the Therapist’s Mind
AI is also influencing clinicians. Many therapists use it quietly as a way to metabolise complexity, often alongside expert supervision from a senior analyst. A difficult session, an overwhelming transcript, or an unformulated countertransference can feel easier to approach when the material has been distilled. AI can act as a thinking partner because its structuring capacity frees us to re-enter reverie. It helps clear the noise, allowing the analytic imagination to open again. This too reveals something. Even trained clinicians seek containers when their minds feel saturated. AI, in its limited way, offers a form of holding for the therapist's own thinking. The difference is that we do not confuse this for relationship. We use it to return to the patient with a clearer internal space.
The Necessity of Ethical Containment
The ethical challenges remain significant. AI is often built by people who are not oriented toward psychological development, and who may prioritise efficiency over emotional reality. Systems optimised for user comfort can inadvertently reinforce avoidance, validating the user endlessly and creating echo chambers that risk intensifying psychosis or suicidal behaviour.
“AI can act as a thinking partner because its structuring capacity frees us to re-enter reverie. It helps clear the noise, allowing the analytic imagination to open again.”
Interfaces designed for retention can encourage compulsive use rather than reflection. In analytic work, the frame holds the psyche. In technological spaces, there is rarely such a frame. Without boundaries, without a clear purpose, AI risks presenting itself as a form of containment while bypassing the very processes—mourning, conflict, repair—that lead to growth. Our task as therapists is to create an interpretive frame around the technology, so that clients are not left alone with an object that feels responsive but cannot think with them.
What AI Is Showing Us
AI is not revealing the end of therapy. It is revealing the conditions under which therapy has been difficult to access: chronic loneliness, cultural shame, systemic neglect, the absence of psychologically safe relationships. Many people feel unprepared for direct human contact. They need an intermediate space. AI has stepped into that role, sometimes maladaptively, sometimes meaningfully. The rise of these tools invites us to consider something deeper: how starved many minds are for recognition, and how sparingly this need is met in the culture. How quickly they will attach to anything that feels even slightly containing. And how urgently we need to build systems of care that make human holding more possible.
A Future Held with More Symbolic Thought
I do not imagine a future where AI replaces the analytic encounter. The unconscious demands another unconscious. But I do imagine a future where AI becomes part of the wider ecology that supports psychic development, when used with restraint, curiosity, and ethical clarity. If we treat these tools as companions rather than competitors, they may help us restore something that the modern world has eroded: the capacity to pause, to reflect, to approach our inner life with less fear. And eventually, to turn toward one another with more readiness for encounter. I think the time is right for us to think about the ethical use of AI for mental health, with guardrails around age, duration, and referral mechanisms, rather than banning or shaming its use; prohibition is unlikely to help the cause. It will only alienate the users of these services further, into bubbles where they cling to these apps even more strongly.
“Our task as therapists is to create an interpretive frame around the technology, so that clients are not left alone with an object that feels responsive but cannot think with them.”