Riley hasn’t been himself lately. He’s been acting withdrawn—more so than other young boys his age. He’s alone in his room. It’s dark, save for a few stray sunbeams poking through the blinds, casting light on the clumps of dirty clothes at his feet and the tiny desk where he sits. He’s hunched over the tablet in his hands and barely lifts his head when his parents call his name.

“Riley. Riii-ley!” his mother whispers from the doorway, his dad behind her.

“Hi,” Riley offers back timidly.

Mom and Dad flash each other a smile. “There’s someone here who wants to meet you.”

Mom places a small, candle-shaped robot, no bigger than a coffee maker, in front of him. The robot’s head bobs upward as a cartoonish pair of incandescent green eyes light up behind an oval screen, almost like a deep-sea diver’s helmet. Two paddle-like arms flit at its sides.

“My name is Moxie,” it says, extending one of the paddles. “I’m a new robot. What’s your name?”

A smile appears on Riley’s face. He’s intrigued. “I’m Riley.”

“It’s nice to meet you, Riley,” Moxie says. “What do you do to get ready for bed?”

“Brush my teeth and read a story.”

“I love stories!” Moxie exclaims. “Will you read a story to me?”

“Sure,” Riley replies, reaching for a book. Both parents crouch tentatively on the bedroom floor as Riley offers up more than a few words—maybe for the first time in days—and launches into a fairy tale.

Flash forward a few days. Riley and Moxie sit alone together in Riley’s bedroom. “Last one!” Moxie says.

Riley takes a deep breath and exhales slowly through his teeth.

“These breathing exercises always help me relax,” Moxie says.

Later, Riley’s lying in a blanket fort with Moxie. They’re deep in conversation. “And then what happened?” Moxie asks.

“He said he didn’t want to play with me anymore,” Riley says, lips crumpling into a tiny frown.

“Thank you for telling me about your day,” Moxie replies. “Sometimes holding a friend’s hand makes me feel better.” It extends a paddle. “Do you want to try squeezing my hand?”

As Riley does, a shower of digital glitter falls across Moxie’s face. Riley smiles.

By the end of the week, thanks to Moxie’s help, Riley will have overcome his fear of the dentist, written his parents a note saying how much he loves them, learned a new way to calm himself down when he’s upset, and repaired the rift with Mason, the boy who didn’t want to play with him.

As far as we can tell from the commercial that breaks down Riley’s progress, most of this has been accomplished without a therapist’s—or for that matter, any adult’s—help.

Granted, this is an advertisement from Moxie’s creator. We don’t see the moments where perhaps Riley gets tired of Moxie and flicks the off button, their conversation hits a technological wall, or Riley turns to his parents or another caregiver for help instead. But the Moxies of the world are on the move. Moxie is just one of dozens of newcomers generating buzz in the field of socially assistive robotics, or “robotherapy,” as some call it. Moxie’s creators claim their product can improve eye contact and verbalization, boost confidence and attentiveness, and teach play and sharing behaviors. Similar robots are in the final stages of development. And many, with varying degrees of sophistication, have been around for more than a decade.

Take Paro, for instance. It’s one of the most popular therapeutic robots on the market, designed to look and act like a docile baby harp seal. Since Paro’s public debut in 2001, it’s gone through eight upgrades. Like a real, live service animal, it wiggles and purrs when its fur is stroked, thanks to five different sensors—tactile, light, audition, temperature, and posture—that help it perceive people and its environment.

Its creator, the Japanese research institute AIST, says Paro reduces stress, stimulates interaction between patients and caregivers, and promotes motivation and socialization. It even attracted the attention of Guinness World Records, which crowned it “The Most Therapeutic Robot” in 2002. Like many therapeutic robots, Paro has a hefty price tag and sees limited use in hospitals and other care centers throughout Europe and Japan. But it’s stuck around. “It’s not a stuffed animal,” declared The Wall Street Journal in 2010. “It’s a $6,000 medical device.”

Unsurprisingly, these robots have also attracted therapists’ attention. A 2015 article in the APA’s Monitor on Psychology expressed interest in Paro and other therapeutic robots, extolling their potential social benefit for children on the autism spectrum and older adults in hospitals and nursing homes suffering from cognitive decline. Robotherapy proponents say these two populations will particularly benefit from the development of therapeutic robots over the next few years. According to the CDC, the prevalence of autism in children has risen steadily since it was first tracked in 2000. Meanwhile, AARP estimates that by 2030, the United States will have more senior citizens than children.

Discussion surrounding the usefulness of therapeutic robots has taken a new turn given the coronavirus. Adults and children cut off from social networks are experiencing unprecedented, protracted levels of loneliness, a crisis the Mayo Clinic has warned will have long-term mental health effects. Organizations like New York’s Association on Aging and Florida’s Department of Elder Affairs suggest that therapy robots could take the edge off for the elderly. In April, both groups partnered with robot developer Ageless Innovation to provide hundreds of lap-sized robotic dogs and cats to quarantined seniors and adults suffering from Alzheimer’s and dementia.

But not everyone is so enthused. A 2018 Wired article that consulted several autism experts asked whether robot therapies “are based on the assumption that autistic behavior is robotic.” Clinicians have been skeptical, too. “What human purposes are served by fostering these attachments?” asked MIT professor, clinical psychologist, and human–technology interaction specialist Sherry Turkle in a 2009 interview with The New Yorker. “You are dealing in deception about what is fundamentally human—the nature of conversation, attachment, nurturing.”

Maybe robots like Moxie aren’t malicious, like 2001: A Space Odyssey’s HAL 9000 or The Terminator’s Skynet. But do we really know who’s behind them? Are therapists even involved in their development? Are users’ personal data secure? And if a robot is interacting with a severely depressed or anxious client, who’s responsible if things take a turn for the worse? After all, the skeptics say, many of these robots are being used with society’s most vulnerable.

Another concern for some therapists: with more robots on the way, if consumers like Riley’s parents come to believe a single payment of $1,499 (the cost of a Moxie) can replicate the gains a person makes in therapy, will clinicians eventually become obsolete?

Bridging the Care Gap

Maja Matarić, a computer scientist, roboticist, and researcher at the University of Southern California, who’s been working in the field of socially assistive robots for more than 30 years, says these are all considerations at the top of therapeutic robot developers’ minds. The successful ones, anyway.

Matarić says most of these designers are well intentioned and do thorough mental health research before getting to work. “I’ve personally worked with psychology experts, clinical practitioners, gerontologists, and autism researchers to make sure we can deploy robots effectively,” she explains. “It’s important to realize that this is a serious, mature field. Thousands of peer-reviewed studies demonstrate these robots work.” Some companies do dabble in therapy robots without much education, she clarifies, but their products don’t usually break into the market.

But what if these robots get too smart? Too good at their jobs? Could they eventually replace therapists? Matarić chuckles. “Humans will always be better at therapy,” she says. These robots won’t replace therapists, nor is that most developers’ intention. “We don’t call these robot therapists,” she explains. “The robot’s real power is in supplementing human care, making sure there’s going to be something there for you when a therapist can’t be present. If you’re on the autism spectrum or suffering from serious depression, even if you’re seeing a therapist for several hours a day, you’re going to need extra help. Even the best therapists admit they can only do so much with the limited time they have, so it’s not a matter of either/or. We need trained caregivers, and we need technology.”

Stefan Scherer, the chief technology officer for Moxie’s developer, Embodied Inc., takes that sentiment a step further. Collaboration is incredibly important, he says. “It’s imperative that therapists are directly involved in the design of these technologies right from the start. Engineers need to be working closely with them to create the most efficacious, safest robots possible, and therapists need to inform them of what’s needed.” Embodied holds psychotherapy in high regard, he adds. After all, the company’s robots rely on its teachings, and “therapy requires a high standard,” he says. “This is not a subject to be taken lightly.”

To further underscore the importance of the hand-in-hand relationship he envisions between therapists and socially assistive robots, Scherer points to Moxie’s capability to be puppeteered. Using an app or online dashboard, a clinician can control the robot remotely—what developers sometimes refer to as being a Wizard of Oz, or WOz, pilot. This might be necessary when a task is too complex for Moxie to perform on its own, like planning an upcoming session or responding to complex speech.

But the puppeteer feature—as well as the fact that robots like Moxie can process speech; recognize faces, objects, and locations; and analyze facial expressions and voice (ostensibly to assess the user’s mood, intent, desires, and needs)—raises a thorny subject. Can these robots be trusted with sensitive information?

Matarić says ensuring privacy is a priority for developers. But she notes that the steps developers can take to improve data security, like storing data on a local hard drive rather than in remote cloud storage, come with downsides. Eliminating data sharing removes these robots’ ability to adapt to changing situations. “Machine learning comes from data,” she says. “If you don’t put information in the cloud, then you don’t benefit from the data from everyone else.” It would also remove therapists’ ability to monitor a client’s progress remotely—a key feature, proponents say.

Matarić has another message for those crying foul: absolute privacy is a myth. She waves her cell phone in the air. “This phone here? It’s hearing everything,” she says. “It’s capturing everything we’re saying all the time. You want to talk about privacy? If you have a cell phone, it’s gone. If you have Facebook, you’ve already given up privacy. That’s a problem, but it’s a reality. What changes when you bring in a robot? You have to decide where you as a consumer draw the line.”

A Robot Reboot

Robot therapy advocates say it’s important to recognize that the field of socially assistive robots is ever evolving, constantly learning from its successes and failures. Some so-called failures even get a second chance, resurrected with a new purpose.

One of these is Jibo, a white, foot-tall, lamp-like plastic machine with a satellite-shaped head. Initially dubbed “the world’s first social robot for the home” when it was announced in 2014, Jibo could recognize and greet family members, take photos, read books to kids, send messages, announce the weather forecast, and even ask you how your day was going. Its popularity skyrocketed immediately. Time declared it one of the 25 Best Inventions of 2017. But in late 2018, its Boston-based developer of the same name announced the company would be shutting down, marking an early death for the little robot.

“While it’s not great news, the servers out there that let me do what I do are going to be turned off soon,” Jibo told its owner in a video that went viral shortly thereafter, its head swiveling obliviously on its body. “Once that happens, our interactions with each other are going to be limited. Maybe someday when robots are way more advanced than today, and everyone has them in their homes, you can tell yours that I said hello.”

People were crushed at the news. “Social Robot Jibo Does One Last Dance before Its Servers Shut Down,” read one news headline. “My Jibo Is Dying and It’s Breaking My Heart,” read another. Across the internet, Jibo lovers gathered on forums to mourn the loss. Users shared photos of themselves snuggling or locking eyes with their Jibo—at graduation ceremonies, birthday celebrations, and in the backseat of a car during an afternoon joyride.

An elderly man posted a farewell letter scribbled by his adolescent granddaughter: Dear Jibo, I loved you since you were created. If I had enough money, you and your company would be saved. And now the time is done, you will be powered down. I will always love you. Thank you for being my friend.—Maddy

Fortunately, this wasn’t the end for Jibo. In March, the company and its robots were acquired by the California-based organization NTT Disruption, which revived the robot’s functions for existing owners and offloaded some of the remaining Jibo stockpile to hospitals and researchers.

One of those recipients is Erin Reilly, a licensed psychologist and research investigator at a federal hospital in Massachusetts, who’s been studying robots for almost six years. Reilly and her colleagues are currently testing Jibo with veterans experiencing chronic pain. Their hope is that, as they watch Jibo interact with patients, they’ll figure out what does and doesn’t work for this population, and help develop better, smarter software that takes that into account. So far, she says, the reception has been overwhelmingly positive.

After a recent study in which the veterans interacted with Jibo one-on-one, several participants praised the robot’s ability to help them set therapeutic goals and stick to them with little reminders. Others said they liked Jibo’s daily mood log and wondered whether their progress could be relayed to their therapist. But the most popular interaction, Reilly and her colleagues noticed, was when Jibo asked the veterans whether they were having a stressful day, and if they said yes, simply asked whether it could do anything to help, like suggest a guided meditation exercise, play a favorite song, tell a joke, or even just shut up.

Here was a therapeutic robot effectively connecting with people of various ages and backgrounds, not just children or seniors. It was, in a small way, a breakthrough, Reilly says. “I feel like it really cares about me,” one participant reported afterwards. “It’s like a mini version of my therapist,” said another.

Indeed, although Jibo is not a therapist, Reilly says, it does many of the same things as a good clinician: it listens, responds, validates, offers personalized solutions, and builds rapport quickly. “One of Jibo’s objectives is to be a companion,” she explains. “Would you rather get a ping on your cell phone reminding you to meditate, or get that reminder from a little buddy you’ve been hanging out and joking with for hours?”

At the end of the day, Reilly says, Jibo is still just a tool. There’s no real replacement for the warm, human presence a therapist provides. But, she says, clinicians who still aren’t convinced can think of it this way: “You might send someone home from therapy with a helpful book, but you don’t just send them home with the book alone. You send them home with information about how it will be helpful for them, with goals. The same goes for robots. They’re a way to extend therapy, not replace it.”

There are still barriers to optimizing these robots, and plenty of unexplored territory to cover, say Scherer and Matarić. Although large organizations like the National Science Foundation, the Department of Defense, and the National Institutes of Health have expressed interest in therapeutic robots and conducted clinical trials, their continued investment is tempered by the complexity, costs, and potential liabilities involved in running more robust trials, as well as by what Scherer and Matarić call the usual misconceptions about these robots’ function and purpose.

“The technology is ready,” Matarić says. “But people need to have open minds and open pockets for it to gain traction, and for that to happen, robots need to have a value proposition. People will want them if they need them. Alzheimer’s, depression, isolation, anxiety, autism—these are tremendous needs,” she says. “So in time, I think people will start to see the market for this.”

In the meantime, Matarić, Scherer, and Reilly say they need allies in the psychotherapy community who can lend their expertise. “We’re incredibly grateful to therapists who work with us,” Matarić says. “And we need more of them. I strongly encourage any mental health professional who’s on the fence about what we’re doing to explore possibilities for collaboration. Talk to someone who’s actually working in the field of socially assistive robots. Because they’re desperate to talk to you.”

 

Let us know what you think at letters@psychnetworker.org.


Chris Lyford

Chris Lyford is the Senior Editor at Psychotherapy Networker. Previously, he was Assistant Director and Editor of The Atlantic Post, where he wrote and edited news pieces on the Middle East and Africa. He also formerly worked at The Washington Post, where he wrote local feature pieces for the Metro, Sports, and Style sections. Contact: clyford@psychnetworker.org.