The United Kingdom’s decision in a referendum to withdraw from the European Union will transform the legal rights of its citizens and Europeans hoping to live and work in the UK. But there’s one other demographic that could be legally affected by Brexit: Robots.
Last month, the European Parliament’s legal affairs committee published a draft report calling for the EU to vote on whether robots should be legally considered “electronic persons with specific rights and obligations.”
The report, led by Member of the European Parliament (MEP) Mady Delvaux from Luxembourg, notes that robot autonomy raises questions of legal liability. Who would be responsible if, for example, an autonomous robot went rogue and caused physical harm?
The proposed solution is to give robots legal responsibility, with the most sophisticated machines able to trade money and claim intellectual property rights. Meanwhile, the MEPs write, human owners should pay insurance premiums into a state fund to cover the cost of potential damages.
These plans explicitly draw on the “three laws of robotics” set out by the 20th-century science fiction writer Isaac Asimov. (A robot may not injure a human being; A robot must obey human orders unless this would cause harm to another human; A robot must protect its own existence as long as this does not cause harm to humans.)
Though rights for robots may sound far-fetched, the MEPs write that robots’ autonomy raises legal questions of “whether they should be regarded as natural persons, legal persons, animals or objects—or whether a new category should be created.” They warn of a Skynet-like future:
“Ultimately there is a possibility that within the space of a few decades AI could surpass human intellectual capacity in a manner which, if not prepared for, could pose a challenge to humanity’s capacity to control its own creation and, consequently, perhaps also to its capacity to be in charge of its own destiny and to ensure the survival of the species.”
Peter McOwan, a computer science professor at Queen Mary University of London, says rights for autonomous robots may not be legally necessary yet. “However I think it’s probably sensible to start thinking about these issues now as robotics is going through a massive revolution currently with improvements in intelligence and the ways we interact with them,” he says. “Having a framework about what we would and wouldn’t want robots to be ‘forced to do’ is useful to help frame their development.”
John Danaher, a law lecturer at NUI Galway in Ireland who focuses on emerging technologies, says the proposed robot rights are similar to the legal personhood awarded to corporations. Companies are legally able to enter contracts, own property, and be sued, although all their decisions are made by humans. “It seems to me that the EU are just proposing something similar for robots,” he says.
Neither academic says he has heard of comparable plans to draw up robot rights within the UK.
As Britain makes plans to withdraw from the EU, MEPs will vote on the robot proposals within the next year. If the proposals pass, it will take further time for them to be drafted into law and implemented. By then, the UK may well have left the union. So for machines in the UK, Brexit could mean missing out on the chance at robot rights.