Among the symptoms of dementia is a phenomenon called “sundowners syndrome”: an increase in agitation, confusion, and anxiety as late afternoon transitions to evening. Its cause isn’t well understood; circadian rhythm disruptions precipitated by the change in light, anxiety over end-of-day activity, and hormonal fluctuations have all been floated as theories. Whatever the trigger, sundowners can make otherwise amiable people combative and even violent, a frightening and unsettling experience for patients and caregivers alike.
Staff in hospitals and nursing homes typically treat the symptoms with sedative drugs. But in recent years, facilities from Japan to the US have turned instead to a specialist: a robot baby seal named Paro.
Paro spent a decade in development at Japan’s National Institute of Advanced Industrial Science and Technology. The robot seal came to market in 2004 and is now in use in many parts of Asia, Europe, and North America to offer the psychological benefits of pet therapy in situations where a real animal isn’t practical. A new Paro costs about $6,000, pricey for an individual consumer but more accessible for group facilities. People who work with seniors with dementia describe Paro as a furry little miracle. Sandra Petersen, a professor of nursing at the University of Texas-Tyler, had a patient with advanced Alzheimer’s who hadn’t spoken in eight years. The woman picked up a Paro, cradled it in her arms, and whispered “I love you” into its fur.
Paro responds to touch and sound, and makes cooing noises modeled after those of real baby harp seals. If you stroke Paro and talk softly to it, it will gurgle and turn toward you. If you speak to it sharply, it will immediately stop whatever it’s doing and try something else in an effort to please you. The choice of animal is deliberate: A robot dog might evoke a frightening childhood memory of a snarl or a bite, but who’s ever had a bad run-in with a baby seal?
Paro never needs to be fed or walked; it never jumps, scratches, or growls; its companionship can be summoned at any time of the day or night. That might seem elementary, but one of the most valuable things a robot or artificial intelligence can do for the elderly is to simply be present: constantly, tirelessly, consistently present. A caregiving AI needs no sleep, never gets sick or distracted, has no obligation apart from its service. It accomplishes the essential task of caregiving: placing the care recipient at the center of its attention.
Other robots have been trialed as companions to the elderly, but none confer the social and psychological benefits of a Paro. Paro alleviates the intense anxiety and agitation that frequently accompany dementia. A patient holding a Paro is less likely to wander, another common hazard of the disease. “I’ve been in this field for 25-plus years,” says Randy Griffin, a nurse who created the first Paro training program in the US. “There isn’t any other thing we have in dementia care today that [is effective] at every stage of this illness.”
“The thing with pets is that they’re living and sentient and may have an off day,” says Kathy Martyn, a lecturer in health sciences at the University of Brighton. “They may not want to perform at three in the morning or on a hot afternoon.”
Martyn and I meet in a small conference room on Brighton’s Falmer campus, in East Sussex, a few miles from the English Channel. A Paro sits squawking on a table between us, craning its head quizzically toward the sound of our voices with a faint, motorized whirr.
Throughout the conversation we each reach out from time to time to stroke Paro’s fur or give it a scratch behind the ears. At one point I involuntarily make a goofy face at it, the way one does to win over a puppy or a baby.
Paro is 22.4 inches (56.9 cm) long and a smidge under 6 lbs (2.7 kg), about the size and weight of a human infant. With Martyn’s permission, I pick Paro up and hold it against my chest, as patients often do. It immediately evokes the muscle memory of hours spent rocking small humans of similar size, to a degree that brings an unexpected lump to my throat.
That, too, is part of Paro’s design. Caring for others is an essential part of being human. It’s also one of the first things to be taken away as a person’s physical and mental reflexes slow.
“Often you’ll see someone petting Paro and he’ll cry, and they’ll say, ‘Don’t cry.’ They’re taking care of something. He gives the person the ability to give love, as well as receive love,” Griffin says. (Paro officially has no gender, though owners tend to assign it one.)
Petersen concurs. “One of the things I’ve found in my 30 years in practice is that if you can provide purpose to someone, it gives them meaning,” she says. “There is that sense in all of us to nurture or protect something that’s helpless.”
One of the guiding principles of robotics and artificial intelligence in health care is that robots—or, more accurately, the people who design and deploy them—should not practice deception. Put a different way, it’s not ethical to let someone believe they’re having an interaction with a real human or animal when the thing they are engaging with is, in fact, a robot.
Health care providers who use Paro in their practice say they introduce the device factually, with different degrees of clarity depending on the person they’re working with. Some will explain that it is a robot. Others will simply say, “This is Paro.”
“We never present him as anything other than what he is: a robotic seal,” Martyn says. “He’s an aide. He’s part of the toolkit we use to manage agitation and distress.” But because of their various cognitive disabilities, not all patients are able to distinguish between what’s real and what’s not. Some confirm that they recognize Paro is a toy. Others interact with it as if it were a living animal. Sometimes patients call the robot by the name of a child, or a cat they used to have.
“I’ve been confronted by people about that,” Petersen says, of the ethics of allowing someone to believe they are cuddling a real pet. “I always look at everything in medicine in terms of risk/benefit. That’s how I look at AI. Is the risk greater than the potential benefit to the patient? I haven’t found a case [with Paro] where it is.”
People concerned about the harm a relationship with Paro might do to a person with dementia do not understand the gravity of the disease, Petersen says. “You come to a point in dementia where you can’t trust yourself,” she says. “It’s like being dropped in another country where you don’t speak the language, you don’t know what time of day it is. It’s terribly, terribly fear-producing. These people live in a constant state of fear because they can’t figure out what to do next. They know something’s wrong but they can’t figure out what it is. And it never ends for them. I don’t think people realize how horrific it is.”
The sedatives typically used to treat that agitation can themselves increase the risk of falls, infection, and further confusion. “Most of my patients are on an average of 14 to 28 medicines a day,” Petersen says. “If I can use a robot to control symptomatology rather than four or five pills that person might take to control anxiety, why wouldn’t I use that?”
To a person in normal cognitive health, Paro is unmistakably a machine. A soft mechanical sound accompanies its motions; up close, you can see its whiskers have tiny sensors on the ends. Given the comfort it brings to people suffering a dreadful disease, insisting that patients recognize its artificiality seems cold and beside the point.
But you don’t have to peer very far into the future to see the possibility of interactions in which it will be difficult even for a person with their full cognitive faculties to tell the difference between robots and reality.
The Auckland, New Zealand-based tech company Soul Machines creates AI interfaces that look uncannily like high-definition video chats with a real human being. It doesn’t quite pass the Turing Test, but it’s easy to imagine a situation in which someone with limited eyesight or cognitive disabilities believes they’re having a human conversation when talking to a robot like “Ava.”
Or “Sarah.”
Or the company’s virtual baby.
Soul Machines licenses its user interface technology to businesses and institutions. Its technology has powered digital assistants for banks, airlines, and software companies, as well as a prototype virtual assistant, voiced by the actor Cate Blanchett, that was designed to help people with disabilities navigate Australia’s public benefits system. (That program was shelved not long after the Australian government’s disastrous rollout of an automated welfare-fraud detection system drew public outcry.) Soul Machines has discussed services for the elderly with prospective clients but has not announced any such partnerships to date, says chief business officer Greg Cross.
Soul Machines envisions a future in which digital instructors educate students without access to quality human teachers, and in which famous deceased artists are digitally resurrected to discuss their works in museums. Robot companions for the infirm, then, are not too far a leap. Nor is the prospect of a future in which a family converses with the lively AI recreation of a person suffering from dementia, while a caregiver—robot or human—tends to their ailing body in another room.
The potential for deception is already here. A few years ago, Brent Lawson, the president of 1 AM Dolls, a manufacturer of life-sized rubber sex dolls, was on the phone with a client who wanted a specific doll he’d seen on the company’s website. The man was particularly concerned that the doll’s hair was just so, and peppered Lawson with questions about the color and style, Lawson told Quartz. (He’d previously shared the story on journalist Jon Ronson’s podcast “The Butterfly Effect.”) But when Lawson asked how the caller wanted to personalize the doll’s more intimate features, the man stopped him short.
He didn’t need a doll for sex, he explained. He needed a doll that resembled his late sister. The man was caring for his elderly mother, who was in the late stages of dementia. The ailing woman asked for his sister every day, and he could no longer bear to keep breaking the news to her that her daughter was dead.
“The mother had Alzheimer’s and dementia, and apparently her sight wasn’t so great, so rather than having to explain every day that the sister wasn’t there he could just point across the room and say, ‘She’s right there, and she’s taking a nap,’” Lawson says. In an industry of unusual requests, that remains the most unusual one Lawson has ever received. It stuck with him for a lot of reasons, not least the memories it raised of visiting his own grandmother in the late stages of Alzheimer’s, when her vision was gone and she no longer knew her family.
“There were times when I’d go see her and she couldn’t tell my dad and I apart,” Lawson says, “so she would call me her son’s name and call [my] dad my name.”
A dystopian future in which robot sex dolls babysit our Nanas is appalling. It’s also extremely unlikely.
An often-overlooked fact in the discussion of AI’s future role in eldercare is that by the time our parents or grandparents have robot companions or assistants, so will our children, and so will we. So it makes sense that much of the research on AI’s potential to support aging individuals is focused not on building virtual friends, but on helping elderly individuals and their caregivers with practical tasks, especially those that help an older person age in their own home.
ElliQ, a “social robot” designed for use by older people, is one such prototype. With a design slightly reminiscent of the EVE robot in Pixar’s film WALL-E, the desktop robot offers reminders for appointments and medication times, books car rides, plays music and audiobooks, and nudges users to accomplish pre-set goals like going out for exercise or calling a friend. The settings can be adjusted so that ElliQ alerts caregivers or family members via an app if it hasn’t detected any user activity in a specified amount of time. In the most extreme cases, authorized caregivers can access the robot’s camera and microphone and check on the older adult directly.
A virtual key to an aging parent’s house is just like a physical one. Used indiscriminately, it’s an invasion of privacy; used with discretion, it’s helpful for both parties. In fact, there are some AI applications that may give users a greater sense of privacy, rather than a diminished one. In one small survey of older people undertaken in 2016, respondents preferred getting help from a machine rather than a human in 28 out of 48 daily tasks, especially mundane manual ones like finding their keys or cleaning their home. (For personal care tasks like bathing or shaving, respondents tended to want people to help them, not machines.)
In this school of design, the best use of machine intelligence is to take on the physical tasks of caregiving and free up human capital for the relational aspects.
“We want to leverage what people do best: the compassion, the empathy, the human touch. I don’t think anyone wants to remove that,” said Conor McGinn, a professor at Trinity College Dublin and head of a team developing Stevie, a service robot designed for use in care homes and hospitals.
But the temptation to outsource some of the emotional labor of caregiving to AI is going to be intense. By 2050 there will be as many adults age 65 and older in the US as there are children under 18, an unprecedented demographic threshold. And the US won’t be first to cross it: many industrialized countries will get there even sooner.
Obviously not all of those individuals over the age of 65 will be in need of care, and many people will live healthy, independent lives for decades past “retirement age.” But most people lucky enough to live long lives will, at some point, reach a stage in which they require regular assistance with the tasks of daily life—assistance that, in the US, is provided by a family member 80% of the time.
Ensuring that an elderly person is dressed, fed, bathed, exercised, and properly medicated each day is a full-time job that can strain a caregiver’s health, job, and relationships. An adult child may understandably become frustrated when a parent with advanced dementia lashes out or asks the same anxious question repeatedly. AI never will. People with dementia can be violent or demanding as the disease takes its toll, which can be painful to see in a once-loving parent and infuriating in a previously cold, abusive, or difficult one. Add in work, kids, and other demands on a caregiver’s time, and the desire to hand over some of that responsibility to someone, anyone—even a robot—becomes more urgent.
The benefits of AI assistance are often framed as a compromise. For an agitated dementia patient, a Paro session is preferable to a dose of sedatives. For a person without a thriving social support network or access to 24-7 care, a companion robot is preferable to interminable loneliness.
“Of course a human would be far better for emotional support and interaction,” says Norman Winarsky, an entrepreneur and co-creator of Apple’s Siri. “But if you have nothing and you say to the robot, ‘hey, tell me a joke today, or please pick up the newspaper’…it would be better than nothing.”
However, the better robots get at keeping us company, the less incentive there will be to allocate human resources to that task, or to work on systemic changes that could make human care and companionship more readily accessible for everyone. If you’re a parent and have ever allowed a child’s TV or iPad time to run past the previously determined limit, just so you could get a few extra minutes to finish a task or catch your breath, then you’re aware of the unbalanced equation between the kind of caregivers we want to be and the actual stores of time and energy we have.
Loneliness is already a crisis for older people. The idea that any of the limited human interaction many older people have could be outsourced to machines is unsettling, and unacceptable to many critics of the rush to AI solutions. Is our goal to never be inconvenienced by anything? Shouldn’t the people we love be the one thing that’s worth our time and trouble?
The role robots should play in the care and support of aging people is just one thread of a far more complex discussion about the emotional bonds between humans and machines. Robotics and artificial intelligence will make many aspects of our lives simpler, and it’s not entirely clear what, if anything, we lose when we outsource love, care, and intimacy to something that can’t actually feel those things in return.
In her book Alone Together: Why We Expect More from Technology and Less from Each Other, MIT professor Sherry Turkle argues that by embracing machines that satiate our human need for connection, but do not actually care about us, we are creating new kinds of relationships, ones “that make us feel connected although we are alone.” Her research is also a reminder that the automation of emotional labor is a two-way street, both for people in caregiving roles and for those we think of as in need of care.
“The demands that our friends—or even pets—make on us are…unpredictable, sometimes unexpected, and often inconvenient,” wrote Robert Sparrow, a philosophy professor at Australia’s Monash University, and Linda Sparrow, in a 2006 paper. “This is an essential part of what makes relationships with other people, or animals, interesting, involving, and rewarding.” When it comes to AI-assisted care, they wrote, “Any reduction of what is often already minimal human contact would, in our view, be indefensible.”
In an email to Quartz, Robert Sparrow clarified: “If these machines ever get to the stage where they can actually help people remain in their own homes longer then I don’t really have a problem with people choosing to use them. But I do worry both that this technologically centered project is drawing attention and funding away from other more low-tech approaches to addressing the social and medical needs of older people and that it risks creating a world where people don’t have an option but to be cared for by machines.”
As part of Turkle’s field research, she and her team gave a robotic infant called “My Real Baby” to children and elderly people and interviewed them about the experience. Many children who were smitten with the robot themselves balked at the idea of giving one to their grandparents. “I know this sounds freaky, but I’m a little jealous,” said one 14-year-old girl. “I don’t like it that I could be replaced by a robot, but I see how I could be.” The researchers later visited an 82-year-old woman named Edna and presented her with My Real Baby. At the time, Edna’s two-year-old great-granddaughter—upon whom she typically doted—was also visiting. Over the course of several hours, the researchers and Edna’s family watched in astonishment as the older woman ignored the human toddler’s pleas for food and attention, while fussing over the electronic baby’s cries.
For a person frustrated by the increasing difficulty of daily life, Turkle wrote, “My Real Baby’s demands seem to suit her better than those of her great-granddaughter…My Real Baby gives her confidence that she is in a landscape where she can get things right.”
The appeal of robots as providers and recipients of care may lie in part in their simplicity, compared with us humans. They are free of the struggle that lies at the heart of our most intimate relationships, the one between our all-too-human limits—of time, insight, understanding, patience—and our desperation to get things right.