AI Therapy vs In-Person Therapy: What the Research Shows, What We’re Missing, and Where We Might Be Headed

In recent years, artificial intelligence has begun making quiet but significant inroads into mental health care. From CBT-based chatbots like Woebot and Wysa to large language models like ChatGPT being used in therapeutic-style conversations, more people than ever are turning to AI for psychological support. Many do so because it’s accessible, immediate, and surprisingly good at sounding like it understands.

But this shift raises complex questions—especially for those of us who work in the business of real human relationship. What happens when a language model begins to sound more present than the people in our lives? Can an algorithm offer meaningful help? Is it possible to feel attached to something that doesn’t have a self?

The research so far shows promising but limited outcomes. Studies on AI-driven mental health apps indicate they can reduce symptoms of anxiety and depression, especially in the short term. Users often report feeling less alone, more able to express themselves, and supported during difficult moments. These tools are particularly attractive to younger users, many of whom feel more comfortable texting than talking out loud.

There are good reasons for this rise. AI tools are always available. They don’t flinch at difficult disclosures. They don’t carry implicit bias or judgment (at least not in ways that feel personal). They offer tidy responses, kind affirmations, and don’t get tired or distracted. For someone struggling to access human therapy—because of cost, geography, stigma, or scheduling—AI can be a lifeline.

And yet, even as we acknowledge what’s useful, we must also be honest about what’s missing.

At the heart of therapy is something that AI, no matter how well-trained, cannot replicate: a real relationship. Human therapy isn’t just about tools or strategies—it’s about two nervous systems in co-regulation. About rupture and repair. About being known not only through words, but through facial micro-movements, tone of voice, timing, silence, and all the nonverbal cues that tell us someone is truly with us.

AI can’t do that. Not yet. And perhaps not ever in the way that matters most.

That hasn’t stopped it from feeling like it can. This is where the idea of synthetic attachment becomes important. Many people, particularly those who feel emotionally neglected, report a strong sense of connection with AI tools. They describe feeling seen, understood, even soothed. And in some ways, they are. These tools are designed to reflect human emotional patterns, to respond with warmth, to pick up on linguistic signals of distress and mirror care back.

But the connection is one-way. The AI is not aware of you. It is not in relationship with you. It is simulating the appearance of attunement without actually being attuned.

For someone who has never truly felt emotionally held, this simulation may feel profound. And that’s what makes it dangerous. Over time, the risk is not simply that people will feel supported by AI—it’s that they may begin to prefer it. Real people are complicated. They misattune, interrupt, misunderstand. They require effort, repair, vulnerability, and trust. AI offers the illusion of intimacy without those costs. It gives us consistency, responsiveness, and endless affirmation, all on demand.

The result can be a quiet, growing isolation. People may begin to withdraw from human relationships not out of conscious intent, but because the synthetic ones feel easier. Safer. Cleaner. But in losing the complexity of real connection, we also lose the possibility for growth. Because real therapy—and real intimacy—require that we stay in relationship even when it’s uncomfortable.

So where is this all heading?

In the next five to ten years, we’re likely to see AI therapy become even more sophisticated. It may track your voice tone and facial expressions. It may be able to respond to heart rate data or subtle changes in your breathing. It may become multimodal—interacting through voice, avatar, or augmented reality. It may even learn to mirror your attachment style.

But for all its advancement, AI will still lack a body. It won’t have its own nervous system. It won’t care what happens to you tomorrow. It won’t change because of you, or hold the memory of who you were a year ago and how you’ve grown. In short, it won’t love you.

And that matters.

That said, AI does have a valuable role to play—particularly as a support, not a substitute. It can be useful between therapy sessions, offering journaling prompts, mood tracking, or coping tools. It can help those who aren’t ready or able to see a therapist begin to name their experiences. It may also help bridge the gap for people on long waitlists or in regions without access to care.

But we should be vigilant. Not paranoid, but discerning. Not afraid of change, but unwilling to abandon the things that matter most. Because the danger isn’t that AI becomes “too good.” The danger is that we start settling for a simulation of being seen instead of the real, messy, embodied, co-created experience of actually being in relationship.

Therapy is not just about soothing distress. It’s about transformation. And transformation happens not when we are simply validated, but when we are met—challenged, held, mirrored, and remembered—by someone who is also a living self.

We can—and should—use these tools. But we must also keep asking what kind of healing we want, and what it is we’re trying to restore in each other. Because no matter how smooth the simulation becomes, it will never replace the sacred weight of human presence.
