OVERSHARING

DeepMind held 1.6 million people’s health records unlawfully, but no one’s getting fined for it

Google’s cutting-edge artificial intelligence unit DeepMind was unlawfully given the health records of 1.6 million patients at London’s Royal Free hospital in 2015, the United Kingdom’s data protection authority has found.

The Information Commissioner’s Office (ICO) said the hospital failed to comply with data protection law, and said DeepMind’s processing of the data was “not lawful.” Information commissioner Elizabeth Denham tried to offset criticism that the ruling hampers innovation, saying in a statement today that “creative use of data” could benefit patients, but that “the price of innovation does not need to be the erosion of fundamental privacy rights.”

Her office’s report said the deal contravened four data protection principles. These include the fair and lawful processing of data, ensuring the data processed is relevant and not excessive in quantity, and ensuring adequate controls are in place.

Now, the ICO has asked the Royal Free to perform a third-party audit of the data-sharing trial between DeepMind and the hospital, with the right to make the results public, among other measures. The Royal Free has signed an undertaking presented to it by the ICO, agreeing to these measures. But it won’t be fined.

The investigation lasted more than a year, yet appears to have produced little in the way of consequences. Researchers had charged that DeepMind’s handling of the records was likely excessive and illegal, and the ICO’s findings vindicate them. The transfer of over a million records originally came to light through a New Scientist investigation.

The penalty—asking the Royal Free hospital to sign an “undertaking” promising to get its data protection affairs in order, which it has—is seen as a weak response by experts following the case. “I find it pretty remarkable that the ICO determined there were contraventions of multiple parts of the Data Protection Act … yet no formal enforcement action has resulted,” said Jon Baines, chair of the National Association of Data Protection Officers. “The ICO has the power to issue administrative fines of up to £500,000 ($647,000), yet chose not to exercise these powers here.”

Baines’ view is widely shared—even DeepMind found the ICO’s action mild, according to sources at the company. Industry, for its part, welcomed the ruling. Martin Goodson, founder of Evolution AI, an artificial-intelligence company that deals with health data, said: “It’s a positive outcome. It’s a fairly light touch regulatory approach which I think is appropriate.” Goodson’s firm isn’t directly affected by the ruling, since the medical organizations it works with don’t transfer health data to it.

Even though the ICO has taken a soft approach with today’s ruling, it’s likely to put hospitals and AI firms on notice. “The results of this investigation send a strong signal not only to other public bodies but also to companies that are increasingly seeking access to our data,” said Tomaso Falchetta, legal officer at the non-profit Privacy International.

The Royal Free said it accepts the ICO’s findings and is working to address its concerns. DeepMind said it was “wrong” to focus on tools for doctors instead of realizing its work needed to be held accountable to patients and the general public. DeepMind is also taking steps to fix things, including setting up a blockchain system that would provide an immutable log of how data was used.
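The tamper-evident log DeepMind describes can be illustrated with a minimal hash chain, where each entry’s hash covers the previous entry so any retroactive edit breaks verification. This is a purely illustrative sketch and assumes nothing about DeepMind’s actual design; the function names and record fields are invented for the example.

```python
import hashlib
import json
import time

GENESIS_HASH = "0" * 64  # placeholder hash preceding the first entry

def append_entry(log, action):
    """Append an access record whose hash covers the previous entry's hash,
    chaining the log so later tampering is detectable."""
    prev_hash = log[-1]["hash"] if log else GENESIS_HASH
    entry = {"action": action, "timestamp": time.time(), "prev_hash": prev_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    log.append(entry)
    return log

def verify(log):
    """Recompute every hash in order; return False if any entry was altered."""
    prev_hash = GENESIS_HASH
    for entry in log:
        if entry["prev_hash"] != prev_hash:
            return False
        body = {k: v for k, v in entry.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, "clinician viewed blood-test result")
append_entry(log, "risk score computed for patient record")
assert verify(log)          # untouched log verifies
log[0]["action"] = "edited"
assert not verify(log)      # any alteration breaks the chain
```

The point of the design is that auditors need only the log itself to detect tampering; no single entry can be rewritten without recomputing every hash after it.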

But there are still a few loose ends to the DeepMind and Royal Free affair. The app DeepMind was developing, Streams, moved from clinical testing to live use in the hospital over the course of the ICO investigation. The ICO says it still has concerns about this live deployment, which today’s ruling doesn’t cover. The Royal Free must also make the case for the data transfer as part of today’s ruling or face the consequences. “The crunch point may still be to come,” says Baines.


