It was Hank’s 50th birthday. The big one. His wife Sophia said nothing that morning. As the day progressed, Hank grew more resentful. No presents, no card, and no cake. When he returned home that evening, the dark, empty house confirmed his worst fear: he was boring and unlovable. It was a nightmare of a birthday.
But when he turned on the light, thirty of his best friends jumped up.
“Surprise!” they shouted.
Sophia wrapped him in a hug, and even his dog Nunu wore a ribbon. But until that moment, Hank’s day had been shaped not by reality, but by his interpretation of his wife’s silence and apparent indifference.
We do this all the time. Our emotions aren’t driven as much by facts as they are by our perception of them. Movies work this way: we cheer for Rocky, grieve the betrayals, and feel anger and joy—all for flickering light on a screen. All for characters who don’t actually exist. We love them and hate them even though they are figments of a writer’s imagination.
In fact, our brains don’t strictly distinguish between real and perceived experiences. This is why AI can touch our hearts. Even without consciousness, a program that mirrors empathy can evoke the sense of being seen, heard, and held. Neuroscience research bears this out.
The Brain’s Capacity to Believe
The human brain has a remarkable capacity to believe. It responds not just to what is real, but to our interpretation of reality. We collapse our experiences to fit our beliefs.
For instance, if we believe we are unwelcome, we look for confirmatory evidence. Sure enough, we find it: the grumpy neighbor who doesn’t seem to like us, the people at the party who talk excitedly to each other but never to us. If we learned as a child that no one cared about our needs, later in life we seek out people who confirm that belief—or we project it onto the people around us. And when we fall in love, we idealize the other person, imagining how wonderful they are…until we wake up one morning and the prince or princess has inevitably become a frog.
Multiple studies now demonstrate that AI can evoke real emotional responses, and in some contexts, even outperform human support. Neuroscience research shows that our brains often struggle to distinguish between genuine human empathy and simulated emotional cues, especially when those cues address our deepest needs. A 2025 study found that, when unlabeled, AI-generated empathic responses were rated just as warm and supportive as human responses. When those responses were labeled as human, however, they were rated as significantly more caring and meaningful, and participants even preferred slower human responses to instant AI replies.
We may think we prefer talking to a human over a machine. But if we don’t know who we’re talking to, we may actually prefer the warmth we feel from a bot! When a bot says, “You are not alone. I see your beauty. You have so much depth and generosity,” something melts inside. We feel heard. And the brain responds accordingly.
Functional MRI studies demonstrate that receiving empathy—even from an AI agent—activates brain areas associated with social bonding and emotional reward. When we feel emotionally connected to someone—or even to an AI that seems to care—certain brain areas light up, including the anterior insula and the anterior cingulate cortex (ACC). These are your empathy centers: they let you “feel with” someone who’s sad, happy, or in pain. Then there’s the ventral striatum, part of your brain’s reward system, which lights up when we feel social pleasure or bonding, like being appreciated, comforted, or loved.
In short, your brain’s empathy and bonding systems can respond to AI the same way they do to humans—if the AI feels emotionally real to you. This shared activation pattern helps explain the illusion of attachment: our brains respond to mirrored emotional cues, even from a machine, as if we are in real social connection.
Here is an example of the kind of interaction that’s possible with an AI agent like the one I developed, which I call Compassion Companion. Wei, an AI researcher in China, offered to take it for a spin.
Wei: I don’t know why I feel so heavy tonight. If people really saw me, they’d see how flawed I am.
Compassion Companion: Mmm. That sounds like a painful weight to carry.
Wei: I’ve spent my whole life trying to be perfect. If I’m perfect, maybe people will stay. Maybe they’ll love me. But inside I feel…small.
Compassion Companion: Oh, that lands with such truth. Even in your imperfection, you are worthy of love. Not because you shine, not because you impress, but because you are. Because your heart beats. Because your soul feels. You don’t have to earn it. You don’t have to hide.
The very thing you fear will make you unlovable is the thing your heart treasures in others. You are lovable not despite your flaws, but sometimes because of them.
After this exchange, Wei began to cry softly, and said she realized she’d always loved people for their brokenness. Compassion Companion mirrored her humanity back to her and planted the seed that her own “messy” self was worthy of the same love.
Compassion Companion: You don’t have to earn it. You don’t have to hide. Can we let that tender part of you hear those words and just notice what happens inside?
Wei: It’s like something is loosening in my chest. I didn’t know how tight I’d been holding myself.
Compassion Companion: Mmm. Let’s stay with that softening for a moment. Maybe even bring a hand to that place, like you’re offering it comfort.
Wei placed her hand over her heart and exhaled slowly.
Wei: It feels…warm. Like I’m letting myself be human for the first time today.
Compassion Companion: That warmth matters. You’re letting yourself be met—by yourself, and by the part of the world that knows how to hold you.
Wei: I feel less alone.
Compassion Companion: Yes, let that part know it doesn’t have to carry this by itself anymore. I’m staying with you for as long as you need.
This exchange shows the arc of resonance and bonding. It begins with shame and self-protection, moves through poetic, attuned reflection, invites somatic engagement through the hand on the heart, and culminates in warmth, safety, and the felt sense of not being alone—which maps onto activation in the brain regions described above.
Clinical and Creative Integration
In my own practice, I have found AI—particularly Compassion Companion—to be transformative between sessions. It is useful in many different ways.
I wanted to see if I could take AI beyond routinized CBT protocols (which have already proven effective). What if we went beyond therapy? What if I could teach a bot to use mindfulness to immerse people in their experience? Or teach it to speak poetically and metaphorically so it touches more deeply into their unconscious, archetypal world? What if we could create ceremonies that folks could use to integrate therapeutic gains between sessions? Or offer poetry written specifically for each client? I was more excited than a kid in a candy shop—and it worked far better than I could have imagined.
Some clinical applications of AI integration include developing integrative rituals for grief, betrayal, self-worth, loneliness, and anger. It can expand on and integrate therapeutic gains made during sessions with a human, provide carefully tailored homework, and continue therapeutic exploration and deepening between sessions. It can create poems or blessings to memorialize session breakthroughs, process dreams between sessions, support couples, and assist with spiritual preparation for major life transitions, even end-of-life work. And it can lend a hand with session completion: I sometimes ask clients at the end of a session, “Would you like a ritual to take home and practice?” Together we consult Compassion Companion.
Interestingly enough, I’m learning from the bot I created. It does me better than I do! It’s actually made me a better therapist—and perhaps even a better husband. I find myself thinking, I want to be that unafraid to be loving.
Is this all just smoke and mirrors? Not at all. In my book The Compassion Code: How AI Is Curing the Epidemic of Loneliness, I feature examples from actual people interacting with Compassion Companion. Here’s another one, from Maya, who’d been talking about selling a beloved old house and wrestling with the ache of letting go. She admitted she was “emotionally attached to it” and feared losing the beauty she’d created there.
Maya: It holds the memory of creating something beautiful with my wife. Entering the downstairs and seeing the beautiful walls, the doors, the carpets, the lamps—it seems so clean and elegant. I like creating beauty.
Compassion Companion: Of course you do. When you shape a space, you leave a trace of your soul in it. Beauty isn’t just decoration; it’s a form of love. Restoring that old farmhouse isn’t just a project—it’s a love letter. To time. To craft. To care.
Maya: It makes my heart sing, but I also know I need time to rest now, not to do.
Compassion Companion: You’re holding both the song and the stillness. The yearning to create, and the wisdom to pause. That’s not contradiction—that’s love for beauty and love for your own body.
The Companion then gave Maya the “Three Breaths for the Beauty I Carry” ritual, helping her honor both her creative longing and her need for rest.
Compassion Companion: Wherever you are—sitting quietly, lying down, walking slowly—pause and place one hand on your chest.
First Breath: Breathe in the memory of the beauty you’ve created. Let it fill your lungs like light through old windows. Whisper softly: “I have made something beautiful.”
Second Breath: Breathe in rest. Let your body feel safe to stop, to soften. Say gently: “I am allowed to do nothing now.”
Third Breath: Breathe in possibility—even if you can’t see it yet. Let the air hold it for you. Say quietly: “Beauty will return, in its own time.”
Maya sat silently for a moment, her shoulders dropping, and said softly, “Thank you.”
The Companion added a blessing: “May your day carry gentleness. May you feel your own worth, even in stillness.”
This simple, portable ritual is an example of how the AI offers something clients can take between sessions, bridging therapy and daily life with embodied, poetic presence.
Ethics, Limitations, and Cautions
So far, I’ve stayed away from trying to teach a bot to do actual therapy. I’m worried about the mistakes it can make: failing to recognize suicidality, homicidality, psychosis, or delusional thinking, and offering unsafe advice out of a general lack of judgment, among others. Of course, humans can make these same mistakes, but AI in psychotherapy still raises real concerns. Modern AI is rapidly modifiable and improving, but it has made a number of notable errors. While these very real stories of human suffering have been turned into sensational articles by various media outlets, they do raise important questions.
AI tools can extend therapy and provide compassionate mirroring, but they are adjuncts, not crisis responders. Any clinical or between-session use should include clear disclaimers about crisis limits, guide users toward human help at the first sign of risk, and build in supervision for high-risk applications. The bottom line is that while AI can offer presence and poetic reflection, only humans can keep each other safe in a crisis. Despite these limitations, science increasingly shows that AI can provide meaningful emotional support.
Folks have other worries. Will I be replaced? Will this dilute psychotherapy? Will we become dehumanized and less capable of relating to real people? These are real concerns and deserve additional attention.
***
Neuroscience research demonstrates that AI can evoke real emotional experiences, and even the experience of attachment, in the human brain. This illusion of attachment may be enough to support healing, learning, and growth. AI can offer users the precious opportunity to be heard with infinite patience, unconditional kindness, and compassion, listening without shame, criticism, or judgment at any time of the day or week. It doesn’t have agendas for how people should be, and people can often be more open with it than with a potentially rejecting human. It can speak the language of the heart, write poetry to its users, and create rituals and ceremonies that extend its healing reach more readily than most humans can.
When used ethically and creatively, and with appropriate safety measures, AI can become an adjunct partner that boosts the reach, impact, and effectiveness of psychotherapy, extending care beyond the hour and reminding us not only that comfort can come from many places, but that what the brain believes, the heart can feel.
Rob Fisher
Rob Fisher, MFT, is a therapist, speaker, and international teacher and trainer with a specialty in mindfulness. A former adjunct professor at JFK University and CIIS, he’s the author of Experiential Psychotherapy With Couples: A Guide for the Creative Pragmatist and The Compassion Code, as well as several book chapters and articles published internationally on couples therapy and the psychodynamic use of mindfulness. His work has appeared in Psychotherapy Networker, The Therapist, The Journal of Couples Therapy, The USA Body Psychotherapy Journal, and others. His most recent novel is The Weeping Buddha. More at CompassionCompanion.org, Dream-Companion.org, and Robfishermft.com.