Each year we open submissions for our Annual Wise Therapy Spotlight to explore questions of vital importance to our therapist community. We are consistently moved by the depth and generosity of these unedited community voices.
For this 6th edition, we asked: How do we remain faithfully human in an increasingly automated world? Read more about our inspiration in the letter from the editors and Academy of Therapy Wisdom co-founders, Brian Spielmann and Ian McPherson.
Download Now: Wise Therapy Spotlight December 2025 Issue
We hope you enjoy the reflections of Nina Otazo as much as we all did.
Therapy Wisdom Spotlight: Nina Otazo, M.A. (Clinical Psychology, Antioch; 2,000+ supervised clinical hours)
As therapists, we are trained to sit in the dark. We accompany people into the underworld of their stories, into the places where language breaks down and where the psyche speaks through symbol, symptom, and silence. We listen for what is beneath the narrative. We slow down where others speed up. We welcome the shadow not because it is comfortable, but because we know it is necessary. The work of psychotherapy is, at its core, a disciplined, relational form of remembering – bringing back into awareness what has been exiled, disavowed, fragmented, or forgotten.
But as artificial intelligence becomes increasingly integrated into daily life, including in the realm of mental health, we find ourselves confronted with a curious cultural contradiction. We, as a society, eagerly consume the benefits of AI: its convenience, its speed, its predictive efficiency, its endless availability – while remaining almost willfully blind to its hidden costs: the human labor behind the training of these systems, often performed under exploitative conditions in the Global South; the environmental impact of data centers on water and energy resources; the concentration of power and data in the hands of a few corporations; and the quiet but real psychological shifts that occur when humans form attachments to systems that mimic empathy without possessing subjectivity.
If therapists are the ones who practice pulling back the curtain, then we cannot ignore these shadows. Our credibility in helping clients navigate a world increasingly shaped by algorithmic influence depends on our willingness to take an unflinching look at the broader ecosystem of AI – its gifts, its wounds, and its blind spots.
“AI risks reinforcing a collective fantasy: that empathy can be automated, that connection can be optimized, that emotional labor can be outsourced without consequence.”
AI as Mirror and Mask
AI presents an unusual psychological object: part mirror, part mask. It reflects familiar patterns of language, emotion, and attention, yet its reflections lack lived experience. It can imitate the texture of attunement but cannot metabolize another’s pain. In this sense, AI risks reinforcing a collective fantasy: that empathy can be automated, that connection can be optimized, that emotional labor can be outsourced without consequence.
For clients accustomed to relational wounds, the presence of a tool that never misattunes may appear initially soothing. It never interrupts. It never grows impatient. It never gets overwhelmed or triggered. It never looks away. But it also does not see. It does not risk itself. It does not feel the gravity of the room or the shame thickening in a client’s throat. It cannot hold paradox or bear witness in the embodied, co-regulatory way that human presence can.
This does not mean AI is without therapeutic use. AI can help with psychoeducation, skills training, crisis mapping, or augmenting access to care in communities with limited resources. But we must remain clear-eyed: when AI is used in a therapeutic context, it is not doing therapy. We are still the ones doing therapy – designing the frame, interpreting the meaning, and holding the relationship. AI can be a tool of reflection, but never a container of transformation.
The Hidden Labor: Who Carries the Weight of “Intelligence”?
One of the central ethical shadows surrounding AI is the human labor behind its creation. Many large language models rely on data labeling, content moderation, and annotation work performed by low-wage workers – often in Kenya, the Philippines, or India – who are exposed to graphic content for hours a day in order to filter and train systems to communicate “safely.”
If we, in our clinical work, devote ourselves to understanding trauma and mitigating suffering, then we cannot ignore the trauma absorbed by these invisible workers. Their wellbeing is woven into the very fabric of the tools we are beginning to use. Their emotional exposure – and the lack of adequate protective measures, wages, or psychological support – mirrors the very systems of exploitation and dissociation that therapy aims to repair.
“What does it mean for us to use AI thoughtfully if the cost of its ‘calm’ responses is the psychological wear of unseen humans?”
To practice integrity, therapists must hold this complexity: the tool we use may have been shaped by global inequities that replicate trauma on a systemic scale. We are called, therefore, to ask not only “What can AI do?” but also “Who is paying the price for what it can do?” and “How do we integrate that awareness into ethical practice?”
The Ecological Shadow: Water, Energy, and the Psychology of Extraction
Another hidden dimension of AI is environmental impact. Data centers require vast amounts of electricity and water to cool servers. In regions already experiencing drought or resource scarcity, this becomes not just a technological cost, but a human and ecological one.
Therapy often teaches clients to recognize when they are living beyond their internal resources – to identify burnout, depletion, and patterns of overextension. Yet we rarely apply this same framing to the technologies we rely on. AI’s rapid expansion reflects a collective difficulty in setting limits, a cultural refusal to acknowledge the costs of constant acceleration.
If we are to model grounded, sustainable presence for our clients, then we must also model a grounded, sustainable relationship to technology – one that considers long-term impact, not just short-term convenience.
AI as a Projection Screen: Our Fears and Fantasies
Psychologically, AI functions as a potent projection object. As clinicians, we know that projection often intensifies around ambiguous or partially known entities. AI’s inner workings are opaque to most people, and what is unfamiliar becomes fertile ground for fantasy – utopian or dystopian.
Clients might project idealized safety (“It understands me better than anyone”) or catastrophic fear (“It will replace humanity”). Both are understandable. Both are partial truths. And both require containment.
Therapists must be prepared to help clients navigate these fantasies – not by dismissing them, but by exploring what they reveal. A client who trusts AI more than humans may be expressing a history of relational betrayal. A client who fears AI dominance may be struggling with feelings of helplessness in a rapidly changing world. A client who assigns moral authority to algorithmic feedback may be echoing childhood dynamics with all-powerful evaluators.
To work with these projections, we must first understand them in ourselves. We must examine our own hopes, anxieties, and countertransference toward AI:
Do we fear replacement?
Do we secretly wish for the perfect, tireless assistant?
Do we feel intimidated by technological complexity?
Do we rely on AI to soothe our own overwhelm?
An unexamined internal landscape can distort our clinical judgment. To stay grounded, we must become aware of the ways we, too, project onto the machine.
What Does It Mean to Be Human in an AI Era?
The rise of AI challenges the very foundations of psychotherapy: presence, relationality, meaning-making, narrative, embodiment, and vulnerability. Not because AI can replace these things, but because AI’s existence pushes us to clarify, with unprecedented precision, what we mean by them.
If AI can imitate empathic language, then what is empathy beyond language? If AI can remember a client’s preferences, what is memory beyond stored data? If AI can generate insights, what is insight beyond pattern recognition? If AI can answer instantly, what is the value of the slow unfolding we cultivate in therapy?
AI forces us to articulate – and to protect – the aspects of the therapeutic encounter that are fundamentally human: the risk, the reciprocal change, the co-regulation, the shared silence, the embodied presence that no machine can replicate.
Our Ethical Imperative: Bringing the Shadow Into the Light
As therapists, we have a responsibility that extends beyond the consulting room. We are, by training and vocation, the stewards of the unseen. If our cultural moment is rushing headlong into AI adoption without pausing to acknowledge its shadows, then our role becomes even more crucial.
We must ask uncomfortable questions – not to obstruct progress, but to ensure that progress remains human-centered, ethical, and compassionate.
We must think critically about our relationships with technology.
We must advocate for transparency around labor and environmental impact.
We must challenge narratives that frame AI as a replacement for human connection.
“The rise of AI challenges the very foundations of psychotherapy: presence, relationality, meaning-making, narrative, embodiment, and vulnerability. Not because AI can replace these things, but because AI’s existence pushes us to clarify, with unprecedented precision, what we mean by them.”
We must cultivate an attitude of humility – recognizing both the potential and the limitations of these tools. And above all, we must model the very thing we ask of our clients: the courage to face what is hidden or made invisible.
Conclusion: Toward a More Conscious Use of AI
AI will not disappear. It will continue to evolve and integrate into the texture of human life, including mental health care. But its trajectory is not predetermined. We have agency in shaping how it is used, how its costs are mitigated, and how its benefits are understood.
If therapists are to remain relevant – and, more importantly, responsible – we must be willing to bring the same curiosity, discernment, and shadow work to the rise of AI that we bring to every human psyche we meet. Our work has always been to illuminate what has been forgotten, to give language to what has been silenced, and to accompany others through the dark toward deeper truth.
The world needs that perspective now more than ever.