By Marcelo Corrales Compagnucci, Janos Meszaros & Timo Minssen
This post is the second part in a two-part series about nudge theory, health data, and the U.K.’s National Data Opt-out System. You can read the first part here.
Governments are continually trying to improve their health care systems, and the secondary use of health data is one effective way of reaching this goal. Secondary use means taking health care data collected for one purpose and using it for a new one, such as research or policy planning. This data usually comes from hospitals and health care systems, whose large databases contain administrative, medical, health care, and personal data from patients.
In the age of smart health care, it is crucial to maintain communication between care providers, patients, and medical devices. Consent requirements and opt-out default systems preserve individuals’ freedom of choice. However, some scholars have suggested that opt-outs may hamper the integrity of data sets, which might affect the safety and efficacy of smart health care services and devices. The EU General Data Protection Regulation (GDPR) provides several exemptions to address this issue. If the data processing is necessary for reasons of public interest in the area of public health, such as ensuring high standards of quality and safety of health care and of medicinal products and devices, Article 9(2)(i) of the GDPR allows EU Member States to enact laws permitting the use of this data.
In our previous blog post, we mentioned the tension between two different policy-making approaches:
- Behavioral insights: where governments may legitimately use various nudging techniques to influence and improve decision-making;
- Direct mandates: where coercive government intervention is considered more desirable, particularly where people are likely to err and it is necessary to provide the means for improved decision-making.
The GDPR generally requires consent for the processing of sensitive data. However, it also allows the processing of sensitive data for public health and scientific research purposes without consent, and Member State law may add further exemptions for scientific research. In the following, we briefly outline two different regulatory approaches to the secondary use of health data under the research exemption in Articles 9(2)(j), 5(e), and 89 and Recitals 157, 158, 159, 160, and 162 of the GDPR: (1) the “National Data Opt-out” (ND opt-out) system in England; and (2) the new Federal Data Protection Act (FDPA) in Germany.
With regard to the secondary use of health data in England, the U.K. government adopted the behavioral approach. In May 2018, it launched the “National Data Opt-out” (ND opt-out) system in the hope of regaining public trust after a series of scandals concerning access to health data by private companies. In the Google DeepMind patient data deal, for instance, Google’s artificial intelligence firm was allowed to access health data from over 1.6 million patients to develop “Streams,” an app that monitors kidney disease. The U.K. Information Commissioner’s investigation and subsequent research studies have suggested that DeepMind had access to other kinds of sensitive data and that the deal failed to comply with data protection law.
However, despite the ongoing U.S. litigation reported here, our research suggests that there is so far no hard evidence of significant changes between the ND opt-out and the previous opt-out system, either in the secondary use of data or in the choices that patients can make. The only significant difference seems to be in the way these options are communicated and framed to patients. Most importantly, under the new ND opt-out, the type-1 opt-out, the only option that truly stops data from being shared outside of direct care, will be removed in 2020.
In Germany, by contrast, citizens are not offered an opt-out system; the government has not adopted default rules as a policy tool. The secondary use of sensitive data was a highly debated issue during the drafting of the German Federal Data Protection Act (FDPA). The FDPA takes advantage of the GDPR’s broad opening clauses by implementing several provisions on the secondary use of personal data. Under Section 24(2) FDPA, for instance, data controllers may, in special cases, process data for a purpose other than the one for which it was originally collected. Section 27 FDPA likewise draws on the exemptions under Article 9 GDPR by permitting the processing of data for scientific or historical research purposes, or statistical purposes, without consent. These permissions are crucial in the case of health data, since organizations may process it for a new purpose, e.g., scientific research, without the consent of data subjects.
In conclusion, the ND opt-out in England attempts to preserve freedom of choice for patients. However, the opt-out choices will become more limited in 2020, when the type-1 opt-out disappears. In Germany, citizens have no choice regarding the secondary use of their health data, since the government has taken full control of this issue. However, the lack of choice might be balanced by stricter rules and ethical review of research activities conducted by both public and private organizations.
It will be interesting to explore how these two regulatory approaches unfold in the future. We will examine these approaches and compare them with other legislation in our forthcoming research at the Center for Advanced Studies in Biomedical Innovation Law (CeBIL).