The NHS Trust that illegally gave Google DeepMind access to 1.6 million patient records has been let off lightly, lawyers say

Elizabeth Denham, the UK Information Commissioner. ICO

Data protection lawyers have questioned why an NHS Trust wasn’t given a more serious penalty for illegally transferring patient data to a Google-owned company.

The Information Commissioner's Office (ICO), the UK's data regulator, ruled on Monday that DeepMind's deal with the Royal Free London NHS Foundation Trust "failed to comply with data protection law", but no fine was issued.

Adam Rose, a partner at Mishcon de Reya who focuses on data protection, life sciences and technology, told Business Insider on Wednesday that both parties have got off lightly given how many rules were broken. “In footballing terms, you could say they have got a yellow card,” said Rose.

In a blog post on Mishcon de Reya’s website, he added: “One might have thought that, given the seriousness of the breaches, the volume of sensitive personal data handed over to Google and the interest in getting this very issue right, the Commissioner would levy a fine on the Free close to the maximum permitted of £500,000.

“Instead, she has required the Free to give undertakings around its future performance. Whether this approach will act as a ‘nudge’ to others to perform better, or indicate that breaches of this scale aren’t taken as seriously as many may have expected, will be the test for future projects of this kind.”

Jon Baines, chair of the National Association of Data Protection Officers, was quoted by Quartz as saying: “I find it pretty remarkable that the ICO determined there were contraventions of multiple parts of the Data Protection Act … yet no formal enforcement action has resulted.”

DeepMind cofounder Mustafa Suleyman heads up the DeepMind Health team. DeepMind

The illegal deal (which was replaced with a new one in November 2016) was signed to help DeepMind test and develop a kidney monitoring app called Streams, which sends an alert to a clinician’s smartphone if a patient’s condition deteriorates. It also allows clinicians to view a patient’s medical records and see where patients are being looked after. It doesn’t use any of the AI that DeepMind is known for.

Through the agreement, DeepMind was able to see whether patients were HIV-positive, as well as details of drug overdoses and abortions. DeepMind insists that it has never shared patient data with parent company Alphabet.

In an 11-page letter to Royal Free CEO Sir David Sloman, Information Commissioner Elizabeth Denham wrote that the Trust had breached the UK’s Data Protection Act (DPA) in a number of areas: specifically, Principles 1, 3, 6, and 7 of the Act.

“My investigation has determined that the processing of approximately 1.6 million patients’ personal data by DeepMind Technologies Limited (‘DeepMind’) for the purpose of the clinical safety testing of the Streams application did not fully comply with the requirements of the Data Protection Act 1998 (the ‘Act’),” she wrote in the letter.

DeepMind was given access to the patient data to develop a kidney monitoring app called Streams. DeepMind

The violated Data Protection Act Principles

Principle 1 of the DPA requires that data controllers inform individuals how their data will be processed and for what purposes. On this particular Principle, the ICO found that the Royal Free “did not provide an appropriate level of transparency to patients about the use of their personal data”.

Principle 3 of the DPA states that “personal data shall be adequate, relevant and not excessive in relation to the purpose or purposes for which they are processed,” according to Rose’s blog. DeepMind was given access to 1.6 million partial patient records, and the ICO ruled that this was neither necessary nor proportionate.

Principle 6 requires that personal data be processed in accordance with the rights of data subjects under the Act. The ICO found that the Royal Free failed to meet this requirement.

Under Principle 7, the Royal Free should have put in place an adequate contract with third-party processors. A contract was signed in September 2015, but Rose writes that the ICO concluded it did not go far enough to ensure that “only the minimal possible data would be processed by DeepMind and that the processing would only be conducted for limited purposes”.

Rose added: “The use of machine learning and artificial intelligence in assessing medical risk and process will only continue to expand, and how the health sector interacts with that, in the light of the DPA, and the General Data Protection Regulation, will play a crucial role in shaping healthcare and care outcomes for the foreseeable future.”
