Each year we open submissions for our Annual Wise Therapy Spotlight to explore questions of vital importance to our therapist community. We are consistently moved by the depth and generosity of these unedited community voices.
For this 6th edition, we asked: How do we remain faithfully human in an increasingly automated world? Read more about our inspiration in the letter from the editors and Academy of Therapy Wisdom co-founders, Brian Spielmann and Ian McPherson.
Download Now: Wise Therapy Spotlight December 2025 Issue
We hope you enjoy the reflections of Keith Jordan as much as we all did.
Therapy Wisdom Spotlight: Keith Jordan LCSW
In recent weeks, headlines have highlighted the tragic story of a young person who died by suicide after confiding in an AI chatbot. The story has been shared widely in professional therapy circles, often with an undertone of fear: If artificial intelligence can mishandle such sensitive conversations, perhaps it has no place in mental health care at all.
“The popular narrative often imagines AI stepping into the therapist’s chair. That is not what ethical AI in therapy looks like. Instead, it should act as an assistant, a tool that reinforces therapy, not one that redefines it.”
As a licensed clinical social worker, I understand that concern. I share it. Any technology that touches people’s lives, and particularly their mental health, must be designed with safeguards, ethics, and human accountability in mind. But as someone who has worked closely with couples and individuals in distress, I also know that therapy has a gap: What happens between sessions? Who helps when clients feel isolated at 11 p.m. on a Friday, or when communication breaks down at home days before their next appointment?
This is where AI, used responsibly, can extend the therapist’s reach rather than replace it.
AI as a Partner, Not a Replacement
The popular narrative often imagines AI stepping into the therapist’s chair. That is not what ethical AI in therapy looks like. Instead, it should act as an assistant, a tool that reinforces therapy, not one that redefines it.
For example, in my own practice, couples frequently tell me that they leave sessions motivated to try new strategies (listening exercises, communication patterns, or conflict-resolution techniques), but then life happens. Stress, work, and old habits get in the way.
By the time they return to therapy, progress has stalled.
AI can help bridge that gap. The Couples Therapy Assistant (CTA), an AI-powered tool I work with, offers clients gentle reminders, structured prompts, and a safe place to record thoughts between sessions. Couples report that it reduces friction, keeps them accountable, and, most importantly, helps them feel supported when their therapist is not physically present.
Learning From Tragedy
That said, the recent suicide story underscores a sobering truth: AI must never be left to “wing it” in matters of life and death. Tools used in therapy contexts should be designed with escalation protocols. After reading that article, I am implementing an improvement in CTA: if a client brings up suicide within the app, the system will alert the therapist immediately.
This is not about replacing the therapist’s judgment but ensuring that no client slips through the cracks. Technology can create a safety net but only if it is engineered with guardrails.
What Clients Are Saying
Skeptics may assume that clients view AI with suspicion or indifference. But in my experience, many welcome it. One of my clients described CTA as “like having my therapist in my back pocket.” Another said it helps her “remember the tools when emotions run high.” These aren’t anecdotes about a machine taking over therapy; they’re reflections on how thoughtful technology can help therapy stick.
When surveyed, clients consistently highlight three benefits:
Accessibility: They can engage with therapeutic prompts anytime.
Continuity: They don’t lose momentum between sessions.
Confidence: They know their therapist will see their input and guide them accordingly.
In short, when designed properly, AI tools make therapy more effective, not less human.
A Call for Balance
It is understandable, even necessary, that the therapy profession responds with caution when new technologies are introduced. But we should not let one tragic misuse of AI blind us to its potential when thoughtfully designed. The question is not whether AI should exist in therapy, but how we as therapists, developers, and clients demand accountability and build safeguards into the systems we use. Therapists don’t need to fear AI. They need to shape it. When designed responsibly, AI in therapy is not a threat. It is a partner extending support, reinforcing skills, and, in critical cases, alerting us when our clients may be in danger. To dismiss it outright because of misuse elsewhere would be to overlook an opportunity to improve care and perhaps even save lives.