If you could construct a sexual partner that was faithful, beautiful, and responsive to your every wish, would you?
It’s a question Aimee van Wynsberghe, co-founder of the Foundation for Responsible Robotics, thinks a lot about. In July 2017, she and fellow ethicist Noel Sharkey published a report, Our Sexual Future with Robots, that delved into the state of the robot sex industry and its future.
Quartz met van Wynsberghe, a professor of robotics and ethics at the Delft University of Technology in the Netherlands, in a busy café during a trip to London, just before she headed to the Science Museum’s Robots exhibition, to discuss how close humanity is to sex and even love with robots, and the risks involved. The interview has been edited and condensed for clarity.
Quartz: Your report mainly deals with “precursors” to sex robots. How are the dolls and devices that already exist connected to possible robots of the future?
Van Wynsberghe: The precursors—like sex dolls—are interesting because they are beginning to shape our views. You can purchase an individual online, and you can have different characteristics. If you like elf ears, that’s fine. You can choose the style of pubic hair, the size of the bosom. It’s strictly a doll in that it doesn’t mimic an orgasm, and it doesn’t talk to you or anything like that. It’s really a “utensil” to use for sexual gratification. This starts a process of choosing exactly what you want it to look like, but you have to use your imagination for an interaction.
Now, [moving into robotics] we start adding functionalities to the doll: Allowing the capability for the male doll to go from a flaccid to an erect penis, or the female doll to simulate an orgasm, or to speak to you, or to tilt her head. To recognize your voice, to recognize your face, and to be able to interact in that way. It’s the same idea, but there are important differences. It’s no longer in your imagination. You’re not pretending that the doll is talking to you, the robot is literally talking to you. So there are similarities: you’re still choosing the appearance, you’re choosing different capabilities, you are shaping what you want your sexual partner to look like, and to act like, and to be. But now it’s much, much more sophisticated.
Quartz: So then it’s about combining technologies that allow interaction with the dolls. Is that happening?
There’s one company that’s really leading sex robotics, Abyss Creations, which is making the Harmony robot. It utilizes several different platforms. An app on your phone allows for the collection of preferences and learning about you. This information is then transferred to the doll. A lot of the work of the robot is done through the app.
In parallel to the sex robot industry, which is led by the porn industry, you have artificial intelligence [AI] exploding. The AI does the facial recognition, the voice recognition, the learning algorithms to try and pick up on how you express emotions, or to collect your physiological data: If your heart rate increases when you’re talking about S&M [sadism and masochism], maybe that’s what you like, and the robot could react.
These things are happening in parallel. It’s only a matter of time before some very smart porn industry person says, “Oh, beautiful. We’re going to take this [AI], and put it into the robot.” And then once that happens we will be in the thick of it.
We already have brothels where you can go and see a doll. All you have to do is replace the doll with a robot, and you see how many people want to go to the robot versus the doll.
Quartz: Are we going to see a sudden mushrooming or is this a niche thing?
On one side people are saying: “This is disgusting, you can’t do this, this is sacrilegious.” And then you have the other side, with people saying: “Well, go ahead. If this is something that you’re interested in, you’re not harming anyone, you’re not cheating on your spouse, why not go for it?”
In the same way that birth control became normalized, in the same way that the dildo and other sex toys became normalized, we could see this become normal. Or, if we continue to have the porn industry leading the design of these robots, we’ll continue to see robots that are very pornographic, based on objectification and even exploitation, especially of the female body. If it stays in that realm, it might be more of a niche thing, where only people who are interested in that kind of image want to use the robots.
Quartz: It’s been suggested that sex robots could be of some benefit…
There are a lot of statements being made at the moment about possible therapeutic uses for sex robots. Suggestions that this could be a complement to someone’s healing therapy if they’ve experienced a sexual trauma, and can’t have sex with another human. For men who have erectile dysfunction, or premature ejaculation, this could be something that helps them, perhaps in parallel with seeing a psychologist. There are suggestions these inventions could be good news for disabled persons, for elderly persons.
There are already charities that provide sexual helpers to people with disabilities.
You could imagine going that route, finding out if a doll or robot is something that a person with disabilities might be interested in. A responsible, slow introduction, where robots are tested appropriately, could make the use of robots a more broadly accepted practice, rather than a niche. And if we start to involve these groups, then we put pressure on the industry, by saying, “No. We don’t want the pornified individual. We want an older-looking individual, because we have 70-year-old clients who are interested in the technology.”
Quartz: What about the argument that this could replace prostitution, freeing people from that kind of sex work?
For the report we talked with some sex workers. They told us that their clients want them to do drugs, to get drunk; they want to hear their story about how they got into the business. From that, it would be a huge leap for these robots to replace prostitution.
Again, there are a lot of sweeping statements about the idea that having robots available could mitigate human trafficking, or exploitation of children by pedophiles. Those claims are interesting. A human rights lawyer we consulted noted that trafficking has a lot to do with domination and power over another individual: things you can’t experience with a robot.
There is one company making a doll with a setting where the doll rejects your advances. But still, that’s very different from complete domination of another individual.
It’s a nice idea that this could help alleviate human trafficking. But how do you prove it? How are you going to run tests, which heads of child trafficking operations are we going to have fill in a questionnaire for us on whether or not use of a robot would help them? The pedophile example is the same.
Quartz: There are some court cases now that touch on that example, right?
Right now there’s a company in Japan that makes dolls [that can be used for sex]—not robots but dolls—that look like children. A UK case that resulted in a prosecution in June involved a man who had ordered one of these child-like dolls from Japan; it was intercepted coming into the country. There’s also a case right now in Newfoundland in Canada, in which a child-like sex doll was intercepted at customs. That individual is being charged with possession of child pornography and misuse of the mail system.
The question now in Canada is whether or not this constitutes child pornography. Because the legal definition requires that a child was harmed in the making. So now Canada is in a very interesting position where they have to decide whether or not their definition of child pornography needs to change, or whether they need a separate rule for items like this.
In the UK case, the man being prosecuted had horrific child pornography in his home. The man in Newfoundland, they couldn’t find anything in his home, so it was just this instance of the doll. So that case is back in court right now, and experts are discussing whether or not the doll is actually a childlike representation, or just a small woman. So they’re talking about the breasts, you know, are they buds or are they actually small breasts.
With cases like this, an argument has been that if dolls like this are available, perhaps pedophiles will use them instead of harming children. That’s what the founder of the company in Japan actually says. But how do you prove it? It’s an empirical question, and you need a control group, and you need to follow individuals, and you need to have the group that uses the doll, and the group that uses the children, and you have to make comparisons.
It’s incredibly unethical to actually run these studies.
Psychology has a term: The “reinforcing effect.” That theory would suggest this technology could be harmful to children. Because the doll won’t be enough: It won’t satiate the fetishes of people who have them, and it will also normalize those fetishes. By allowing the dolls to exist, as a society we’re indicating that people who want to pursue these fantasies aren’t doing anything wrong. When really we have agreed—legally, morally—that you don’t abuse or treat children in this way.
Quartz: Your background is in cell biology and later ethics. How does that inform your work now?
I’ve been looking into robots now for years, since about 2004, when surgical robots first came on the scene. I was part of a technical team training surgeons to work on a patient’s body remotely. We weren’t experimenting on humans, though; it was all in lab conditions. Now I’m doing a lot of work on humanitarian drones.
I don’t know if it’s because of people’s fascination with the word “robot,” or the concept, or the pop-culture image, but I continue to notice that we don’t have the same systematic approach to testing, validation, and verification with robots as we do with drugs in the pharmaceutical and healthcare industries, or even the kind of standardization that exists for alarm clocks or any electrical appliance in your home.
We’re so excited about this technology and ready to try it, that we’re also ready to let society be guinea pigs in an experiment. We do that rather than say, “It might slow down innovation, but we’re going to do this right.”
Quartz: Are you excited about a future of robots?
I’m excited by certain types of robots, and certain capabilities that robots will have. I’m worried about the concept of “moral deskilling”—that we could lose some of what it is to be human. We won’t be as practiced at looking people in the eye, and trying to understand where they’re coming from.
Imagine someone going through a tough time is sitting in front of you crying. You’re late for your appointment, but you’re going to sit there and be with them. Increasingly we might not have those opportunities, and if we’re suddenly confronted with them we won’t know how to deal with it.
Quartz: Specifically because of robots?
Because of devices in general—the things that are raising the threshold for having real interpersonal interactions. And I think robots are just going to raise it higher. If you can have sex with a robot, if robots can read your kids’ bedtime stories…I have two kids at home, and yes it’s tiring and you have to read the same book every day for two months. But it is fundamental to your relationship, to their development. Why would we be looking at taking those things away?
On the other hand, yes I want a robot to clean my house. I’m excited about humanitarian drones that can bring blood samples and vaccines to places we can’t physically reach. I’m excited about healthcare robots that can monitor and pick up on cues, the AI algorithm that can predict whether or not your mole is cancerous better than your surgeon can. That’s exciting.
I am not excited about the AI that can predict whether or not you are at risk of suicide. I don’t think that’s the kind of thing we should delegate to AI.
The underlying concern is that we’re using technology as a band-aid, and robots are part of that. We put technology in when there’s a problem, and believe it alone can solve the problem.
Quartz: You’ve talked about robot portrayals in fiction like the recent series Westworld. Are we able to be so excited, to let society be guinea pigs, because we’ve already seen it?
What’s so fascinating about Westworld is that it shows you how ugly humans are. It did a beautiful job of saying: The robot is a mirror. And showing you: This is what you are, this is what you could do if there were no consequences for your actions.
Western fiction has a rich history of robots taking over the world. They kill us, or we lose control over them. A lot of our fears stem from what we’ve been shown in pop culture. But in Westworld and elsewhere we also see these perfect, human-like beings (which are, of course, actually portrayed by humans). And it’s just so fascinating, and of course we want it. Our reaction is “Do what you have to do so that we can get to that.”
There are also cultural differences. In Japan you don’t have the same portrayal of the robot. The robot is a friendly companion that you’re not worried about. It’s not going to end the world, it’s going to save the world. It can have a soul.
Quartz: So who should decide how the industry develops?
One aim of the Foundation is to be a bridge between the public and companies. We want to create standards, in the same way that when you’re buying coffee or tea you have a fair trade label. Buying products now, especially food, is becoming an ethical endeavor. We’re in the process of developing a similar kind of system so that when someone goes to the store to purchase a robot, it could carry a stamp that says “approved data protection policy” or “environmentally sustainable.”
That could also be a way to educate the public, to say: “What kind of product do you want to be a part of buying?” Hopefully, that would put pressure on the companies, for example to do more rigorous testing.
Quartz: Has anything like that worked before?
One example is genetically modified organisms (GMOs). There was a time in which you didn’t have to label whether or not pesticides were used, or whether something was a GMO, or whether it was organic. Now, Michelin Star restaurants exist that are all about local products. There’s the Slow Food movement. We’re seeing a shift away from the terrible treatment of animals. We’ve seen how the food industry has gone in one direction and is now trying to push back against that. And so with robots we’re thinking, let’s not go in that direction, let’s start a little closer.
Quartz: And are there areas where we’ve already gone too far?
Another parallel technology is teledildonics. The “tele” part indicates that there’s a telecommunications network involved, so the idea is that partners could use devices to have sexual intercourse at a distance.
But the kind of information and the amount of information that a dildo can pick up—temperature of the vagina, length of time before getting to an orgasm, all of these things that a company has access to—that’s very personal information. And right now there is no sort of policy or best practice on whether a company is allowed to collect that information, to share it, store it, use it, destroy it. There’s nothing. In fact, a class action has been brought against a company in Canada for collecting data without users’ knowledge that the data was being collected or of what it would be used for.
That’s symptomatic of the robotics world. When you go into a store and you have a robot to show you where the hammer is; or you go into a bank and a robot asks “Are you here to see a cashier or for a loan?”—that’s all very valuable information for advertisers and marketers.
We’re all going to fall into this data trap, where the most valuable part of the robot is the data that it collects.
This could be a great time for saying, “Ok, let’s get some protections.” Especially for people in need, like elderly people or disabled people using the technology, or before a technology that looks like a child is allowed into a country.
Quartz: And if we don’t?
Imagine trying to roll back the internet.