DeepMind obtained 1.6 million patient records from Britain's National Health Service without an "appropriate legal basis," according to a leaked letter from the UK's top data privacy adviser, Sky News reported today. The letter has been verified as authentic by its source.
The letter was written by Fiona Caldicott, the National Data Guardian, to Stephen Powis, the medical director at London's Royal Free Hospital, as part of an investigation conducted by the UK's privacy regulator, the Information Commissioner's Office. Caldicott tells Powis that the transfer of the records from the hospital to DeepMind, Google's artificial intelligence unit, was not covered by the legal basis for sharing such data known as "implied consent."
The data was used in DeepMind's testing of the Streams app, which alerts clinicians to patients who need attention and does not use any artificial intelligence techniques. The agreement governing the data transfer was superseded by a new deal in November 2016; neither the new deal nor the current deployment of the Streams app is under investigation.
For "implied consent" to hold, the data must be shared for the purposes of "direct care," as when a nurse hands over information about a patient to another nurse at the end of a shift. The test is whether a patient had a "reasonable expectation" that the information would be shared. In Caldicott's view, DeepMind's possession of the patient records failed that test. "My considered opinion therefore remains that it would not have been within the reasonable expectation of patients that their records would have been shared for this purpose," she wrote.
Caldicott doesn't have final say over the legality of the Royal Free's or DeepMind's handling of the health data. That decision rests with the privacy regulator, the ICO, whose investigation into the matter is "close to conclusion," according to a statement given to Quartz in March. The regulator didn't respond to a request for comment on the Caldicott letter. The ICO has the power to levy fines of up to £500,000 on organizations that break data protection laws.
The leaked letter is the first serious knock against DeepMind over its handling of private health data. Its clinical lead, Dominic King, told Sky News that it "could have done better" in making sure the public is "really informed" about how their data is used. A new page published on its website reiterates the point: "We should have announced our plans for DeepMind Health before our first hospital partnership. We should also have done more to engage with patients and the public at that time."
In a statement to Quartz, DeepMind said: “The data used to provide the app has always been strictly controlled by the Royal Free and has never been used for commercial purposes or combined with Google products, services or ads – and never will be.”
Data protection experts and campaigners feel vindicated. Jon Baines, chair of the UK's National Association of Data Protection Officers, said he was "not at all surprised" by Caldicott's finding. Phil Booth, coordinator at the data privacy campaign group Med Confidential, said the publication of the letter comes at a fitting time for the NHS, which has been disrupted by global ransomware attacks. "While at the moment the NHS has real work to do to defend our data from malicious attacks," he said, "[Google] is one of the largest information companies in the world, just coming in and thinking it can do what it wants with 1.6 million patients' data."
The Royal Free, in a statement, said it used patient data because of its “safety-first” approach to developing Streams. “Real patient data is routinely used in the NHS to check new systems are working properly before turning them fully live. No responsible hospital would ever deploy a system that hadn’t been thoroughly tested. The NHS remained in full control of all patient data throughout,” the Royal Free’s statement said.
The wrangling over the legality of DeepMind’s deal with the Royal Free has dragged on since New Scientist revealed the agreement last April. The UK’s privacy regulator will play a pivotal role in the saga when it finally concludes its investigation and publishes its findings.
Update (May 16): DeepMind contacted Quartz to clarify that the data was used during testing of the Streams app, and that the issues raised by Caldicott's letter concern this testing phase, not the app's current use of data.