Scientists believe they’ve nailed the combination that could help robots feel love

First they love, then they long.
Image: Reuters/Gleb Garanich

The proposal to open Café Fellatio, an establishment in Geneva, Switzerland where men would be able to get oral sex while drinking their coffee, was met with considerable outrage. And city authorities have decided it’s also against Swiss law. So Bradley Charvet of Facegirl.ch, the escort service behind the project, adjusted the details: he would use sex robots instead of human prostitutes.

It’s not clear what the robots would look like or what they’d be able to do. The Geneva authorities also have yet to make up their minds about whether that’s an acceptable solution. On the one hand, you could argue that these sorts of robots, presumably looking as human-like as possible, are nothing more than technologically advanced sex toys—the dildos and fleshlights of the digital age. On the other hand, if something is even remotely human-like, shouldn’t we take pains not to dehumanize it in this way?

These questions are about to become more complex. Hooman Samani, director of the Artificial Intelligence and Robotics Technology Laboratory at National Taipei University in Taiwan, built a machine that can respond emotionally to how humans treat it.

Samani calls his professional line of inquiry “lovotics” and says “it’s about connecting the dots from two different fields. One is robotics, which includes artificial intelligence and mechanical engineering, and the other one is the science behind human love.”

The latest achievement of lovotics looks like a furry hat with cleverly fitted cameras, microphones, tactile sensors, speakers, and colorful diodes, all powered by an Intel Atom processor and driving around on tiny wheels.

The lovotics robot.
Image: Hooman Samani

The hardware isn’t exactly thrilling; what’s unique about this machine is its very peculiar kind of artificial intelligence. Samani, along with Elham Saadatian, a human-computer interaction specialist, laid down the underlying AI theory in a paper published back in 2012. The researchers began by trying to answer the question of how love works from a purely scientific point of view. Their solution: “The internal experience of love can be traced back to our endocrine system,” says Samani. “The way we feel about others is to a significant degree determined by hormones.” So, he fitted his robot with digital hormones.

Both our real endocrine system and Samani’s artificial one consist of two sets of hormones—biological and emotional—working simultaneously.

Biological hormones like melatonin, norepinephrine, epinephrine, orexin, and ghrelin regulate things like drowsiness, blood pressure, blood glucose, heart rate, and appetite in humans. “We tried to mimic that in our robot. For example, an increased level of ghrelin makes you feel hungry. Digital ghrelin makes the robot want to charge its battery, and so on,” says Samani.
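As a rough illustration of that idea, a digital hormone can be little more than a number that tracks a bodily signal and feeds a drive. The sketch below is hypothetical—the class name, the ghrelin-to-battery mapping, and the threshold are assumptions for the example, not Samani’s implementation:

```python
# Illustrative sketch (not Samani's actual code) of a "digital hormone"
# turning a bodily signal into a drive. Names and thresholds are assumptions.

class BiologicalHormones:
    def __init__(self) -> None:
        self.ghrelin = 0.0  # "hunger" hormone: rises as the battery drains

    def update(self, battery_level: float) -> None:
        # battery_level runs from 0.0 (empty) to 1.0 (full);
        # an emptier battery means more digital ghrelin.
        self.ghrelin = max(0.0, 1.0 - battery_level)

    def wants_to_charge(self, threshold: float = 0.7) -> bool:
        # The hormone doesn't force the robot to charge; it only raises the drive.
        return self.ghrelin > threshold
```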

Emotional hormones, meanwhile, govern emotional states: dopamine for excitement and alertness, serotonin for happiness and tension, endorphin for our sense of well-being, and finally, oxytocin for trust, empathy, and love. “From the physiological standpoint, oxytocin is the hormone of love,” says Samani. “Its level increases when we interact with our loved ones. It decreases when we are deprived of this interaction.” That, too, was simulated in the AI.

Both the biological and the emotional hormonal systems are organized as structures called “dynamic Bayesian networks,” or DBNs. A Bayesian network is all about probabilities and conditional dependencies: a network trained to diagnose illnesses, for example, can take a set of symptoms and calculate how probable it is that you have a particular disease. The word “dynamic” means the network is recalculated at each small time step, say every 1/100th of a second, folding new observations into what it already believed. That lets DBNs work in real time and model the interplay between many different variables that influence each other.
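To make that concrete, here is a minimal sketch of the simplest kind of DBN: one hidden variable (“the user is being affectionate”) tracked over time from noisy touch readings. The probabilities and variable names are made up for illustration; they are not taken from the lovotics paper:

```python
# Minimal dynamic-Bayesian-network sketch: at every tick, predict the hidden
# state from the previous belief, then correct it with the new observation.
# All probabilities below are illustrative assumptions.

def dbn_step(belief: float, touched: bool,
             p_stay: float = 0.9,          # P(affectionate now | affectionate before)
             p_switch_on: float = 0.1,     # P(affectionate now | not affectionate before)
             p_touch_if_aff: float = 0.8,  # P(touch observed | affectionate)
             p_touch_if_not: float = 0.2   # P(touch observed | not affectionate)
             ) -> float:
    # Prediction: propagate the previous belief through the transition model.
    prior = belief * p_stay + (1 - belief) * p_switch_on
    # Correction: weigh the prediction by how likely the new observation is.
    like_aff = p_touch_if_aff if touched else 1 - p_touch_if_aff
    like_not = p_touch_if_not if touched else 1 - p_touch_if_not
    numerator = prior * like_aff
    return numerator / (numerator + (1 - prior) * like_not)

# Run the filter at every tick, e.g. 100 times a second:
belief = 0.5
for touched in [True, True, False, True]:
    belief = dbn_step(belief, touched)
```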

To figure out the rules that govern the hormonal system’s DBN, “we turned to psychology,” says Samani. His robot processes visual, auditory, and tactile input to figure out the user’s attitude towards it, sorting what it perceives into behaviors that psychologists have identified as signifying love (or the lack thereof): proximity, attachment, repeated exposure, and mirroring. Then the robot releases the right combination of “hormones” to adjust its internal state in response.

“Internal states, in turn, affect the external behavior in an indirect way. Hunger doesn’t automatically make you look for food, it just incentivizes you to do so. It works the same in our robot,” explains Samani. The machine can also recognize its users and distinguish between them. It won’t fall in love with anybody else, so there’s no need to be jealous.
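A hypothetical sketch of that perception-to-hormone-to-incentive loop might look like the code below. The behavior labels come from the list above; the weights and the competing “drives” are assumptions for illustration, not the published model:

```python
# Illustrative loop: observed affectionate behaviors nudge a hormone level,
# and hormone levels weight competing drives rather than dictating one action.

AFFECTION_CUES = {"proximity", "attachment", "repeated_exposure", "mirroring"}

def update_oxytocin(oxytocin: float, observed_cues: set) -> float:
    # Affectionate behavior nudges oxytocin up; neglect lets it decay.
    delta = 0.1 * len(observed_cues & AFFECTION_CUES) - 0.05
    return min(1.0, max(0.0, oxytocin + delta))

def choose_action(oxytocin: float, ghrelin: float) -> str:
    # Hormones bias the choice among drives; none of them forces a behavior.
    drives = {"seek_user": oxytocin, "seek_charger": ghrelin, "idle": 0.3}
    return max(drives, key=drives.get)
```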

But when it does “fall in love” with you, what does that actually mean? Is it the same way we humans love each other?

“The reality is, we can’t be sure,” says Peter Voss, a founder of SmartAction, an AI company based in El Segundo, California. Voss is Nagelian in his worldview: “It’s all about subjectivity of experience,” he says. “Look, I’m a man, and I have absolutely no way of knowing how it would feel to have sex if I was a woman. A woman can describe that to me, but I can only relate to her description through my own experience. And a machine differs from you much more than your wife.”

Samani’s philosophy is a bit more practical. “I’d say this robot is still an object and we have a long way to go before it becomes something more,” he says. Being down to earth, however, didn’t stop him from designing an experiment to find out what love with a robot would look like.

Twenty participants, 10 men and 10 women, interacted with Samani’s robot for two hours each in a lab. Afterwards, they were asked to fill out a slightly adjusted “Love Attitude Scale” questionnaire—a tool commonly used in psychology to measure to what extent a couple’s relationship is based on a number of factors: friendship, physical attraction, having fun, commitment, and being unhealthily obsessed with a partner.

Based on the results, the most dominant form of love in human-robot relationships seems to be the so-called “ludus love,” the type that develops out of having fun together. Of course, that might be because the robot is physically designed as a playful fuzzball. The project has other obvious limitations: “The study was limited both in sample size and interaction time, but we treated it as the first step, a proof of concept. More thorough research will surely follow,” says Samani.

Meanwhile, Samani is seriously considering commercializing his technology. “There clearly is a demand for such devices. I know people who buy robotic vacuum cleaners, decorate them, give them names, and treat them as though they were sentient beings,” he says. “It’s just anthropomorphism, attributing human characteristics to objects.” Or maybe it’s just what loneliness looks like nowadays.

And robots will deal with our loneliness like it’s never been dealt with before, at least in the opinion of David Levy, the British artificial intelligence guru. In his 2008 book Love and Sex With Robots, Levy wrote that in the near future, “love with robots will be as normal as love with other humans, while the number of sexual acts and lovemaking positions commonly practiced between humans will be extended, as robots teach more than is in all of the world’s published sex manuals combined.”

“Yes, it can become a reality one day,” says Samani. But, he adds, “What we can do is one thing. What we should do is quite another.”