Published on 24.02.17 in Vol 19, No 2 (2017): February
The Effect of the General Data Protection Regulation on Medical Research
Background: The enactment of the General Data Protection Regulation (GDPR) will have an impact on European data science. Particular concerns have been raised about consent requirements that could severely restrict medical data research.
Objective: Our objective is to explain the changes in data protection laws that apply to medical research and to discuss their potential impact.
Methods: Analysis of ethicolegal requirements imposed by the GDPR.
Results: The GDPR makes the classification of pseudonymized data as personal data clearer, although the issue has not been entirely resolved. Biomedical research on personal data for which consent has not been obtained must be of substantial public interest.
Conclusions: The GDPR introduces protections for data subjects that aim for consistency across the EU. The changes will have little impact on biomedical data research.
J Med Internet Res 2017;19(2):e47
There have been significant developments in European Union (EU) data protection law recently that will have an impact on health care professionals, particularly those engaged in research and audit. The General Data Protection Regulation (GDPR) has replaced the current legislation and comes into full effect in 2018. The implications of the GDPR for the handling of health care data will be discussed in this paper. Despite the recent referendum vote in the United Kingdom to leave the EU, the GDPR will continue to be relevant to the United Kingdom, whether through cooperation in European projects or because the United Kingdom continues to be a member of the European Economic Area (EEA).
The Data Protection Directive
Currently the relevant law in the United Kingdom is the Data Protection Act 1998, which is the United Kingdom’s transposition of the Data Protection Directive (DPD). European directives are not directly enforceable; they require member states to pass legislation to comply with their requirements. There are derogations (legal exemptions) for research, which in the case of the United Kingdom have been criticized for being too broad. The LRDP Kantor report for the European Commission criticizes the United Kingdom for disregarding these limitations, stating that the Data Protection Act blatantly violates the Directive by adding “medical research” to the list of medical purposes. The DPD requires a “substantial public interest” for member states to add to the derogations for processing of sensitive personal data (Article 8.4).
Differences between EU member states can result in research ethics committees in the United Kingdom denying permission for National Health Service (NHS) data to be transferred to other EU countries (the opposite may also occur in some circumstances). These differences have also contributed to the passage of the GDPR as part of the Digital Single Market strategy.
The Law as It Will Be From 2018: The General Data Protection Regulation
The text of the GDPR has recently been agreed after a prolonged trilogue between the European Commission, Parliament, and the Council of Ministers. This legislation will replace the national transpositions of the DPD. Regulations are directly enforceable across the EU. The GDPR comes into full effect on May 25, 2018, although member states are permitted minor differences in interpretation (the European Court of Justice is the ultimate arbiter). This legislation has the potential to affect projects using research data banks and Big Data. There had been concerns that a clause inserted by the European Parliament requiring specific consent would prevent significant long-term epidemiological research from taking place in the future, but this was rejected and the agreed text permits broad consent to “certain areas of research when in keeping with recognized ethical standards” (Recital 33). Broad consent is not blanket or open consent, although some commentators argue that blanket or open consent is acceptable for biobank and databank research because the risks are minimal and do not vary between projects. Another possibility is consent to a form of governance. Open consent without any ongoing regulation or communication about proposed projects would be potentially problematic. Dynamic consent offers advantages for an engaged community of participants but might not be considered beneficial by some individuals.
The derogations for research without consent have been expanded to specifically include medical research that is “in the public interest” (Recital 51). How public interest will be defined has not been elaborated, but European jurisprudence demands that member states satisfy a high threshold where human rights are involved (eg, a “pressing social need”). This standard would not be required for the conduct of medical research using databanks, but the public interest test might exclude all commercial research for “me too” drug development (drugs that offer no advantages over drugs already on the market), exclude arrangements that have no evidence of benefit sharing, or simply require that projects address issues of public importance regardless of the profits made. This requirement reflects public attitudes in the United Kingdom to the use of health care data, where there is resistance to the use of public data for commercial ventures unless the research could not happen without commercial involvement.
Data protection law only applies to personal data—that is, data that directly identifies or can indirectly identify an individual. The simple deletion of name and address is usually insufficient to constitute anonymization (it has been demonstrated that the combination of 3 pieces of data could identify 87% of US residents: 5-digit zip code, birth date, and sex). The United Kingdom Information Commissioner’s Office currently treats pseudonymized data as anonymous where it is used by a third party who does not possess the requisite key code. Truly anonymized data cannot be linked back to an individual (which means that verification of the data is not possible by any means). Pseudonymized data typically has identifiers removed and replaced with a unique key code (there is also 2-way cryptography; 1-way cryptography is considered anonymized). This key code can be used to trace the data back to an individual, enabling any safety concerns to be acted upon and the data to be verified. This is the approach that the United Kingdom Care.data project on the use of NHS electronic health records for data research had been taking. The GDPR will require changes in practice, as it confirms in Recital 26 that pseudonymized data must be treated as personal data (in line with the previous Article 29 Working Party opinion). That position reflects the increased vulnerability of data subjects who could potentially be identified, compared with the protection afforded by true anonymization—if the key code is hacked, then all the data can be linked back to individuals once more.
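The distinction between 1-way and 2-way key coding can be sketched in code. This is purely an illustrative sketch, not the implementation used by any NHS system or by Care.data; the secret key, the sample identifier, and all names below are hypothetical.

```python
import hmac
import hashlib

# Hypothetical secret held only by the data controller.
SECRET_KEY = b"controller-only-secret"


def pseudonymize_one_way(identifier: str) -> str:
    """1-way coding: a keyed hash (HMAC) replaces the identifier.

    The same identifier always yields the same pseudonym, so records can
    still be linked, but the original value cannot be recovered from the
    pseudonym, even by the key holder.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()


class TwoWayPseudonymizer:
    """2-way coding: a key-code table maps pseudonyms back to identifiers.

    The controller can re-identify a record (eg, to act on a safety
    concern or verify data), but if the table is disclosed, every record
    can be linked back to an individual--the vulnerability that underpins
    the GDPR's treatment of pseudonymized data as personal data.
    """

    def __init__(self) -> None:
        self._table: dict[str, str] = {}  # pseudonym -> identifier
        self._counter = 0

    def pseudonymize(self, identifier: str) -> str:
        self._counter += 1
        code = f"P{self._counter:06d}"
        self._table[code] = identifier
        return code

    def reidentify(self, code: str) -> str:
        return self._table[code]
```

Under the ICO's current approach, releasing only the 1-way pseudonyms (or the 2-way codes without the table) to a third party has been treated as anonymization; under the GDPR, both forms remain personal data while the key exists.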
Consent presumed from a failure to opt out, or from preticked boxes, will no longer be permitted (unless covered by the derogations)—consent will need to be given by a “clear, affirmative action” (Article 4.11). These changes would arguably have made the abandoned Care.data project illegal, despite the passage of enabling legislation that exempted general practitioners from the common law duty of confidentiality when fulfilling their contractual duties to pass on health care data. Care.data relied on an opt-out for legitimacy. The exercise of this opt-out was not straightforward. The numbers opting out far exceeded the estimates and the capacity of the Health and Social Care Information Centre (now NHS Digital) to process them in a timely manner. The problems included the omission of those who opted out from calls for NHS screening programs, even though this was not the intention of those exercising this right. NHS Digital currently relies on pseudonymization, which the GDPR categorizes as a matter of law as personal data. It is not entirely clear whether third parties without access to the key code could treat pseudonymized data as anonymized (as is currently the case in the United Kingdom). Key codes are a potential vulnerability due to accidental or malicious disclosure, which is one of the justifications for classifying pseudonymized data as personal data. It has not been clearly indicated whether there are future plans to use NHS patient data for research.
Dame Fiona Caldicott reviewed the arrangements because of the widespread concerns related to consent, and her report led to the cancellation of the Care.data project. The particular issues identified included the lack of information about Care.data, which made exercising an opt-out an opaque process; the inadequate mechanisms for opting out; and the failure to protect the rights and NHS access of those who opted out.
The risk of re-identification in the future is impossible to quantify precisely because it cannot be predicted what information will become public. However, as with biobanks, the risks to individuals are lesser than those of studies of medical interventions. Therefore, authorization by research ethics committees is acceptable practice, with the requirement that opt-outs be respected unless there are exceptional circumstances.
Although the GDPR comes into force in mid-2018, researchers need to prepare now for the changes it will bring to long-term epidemiological studies. In particular, the categorization of pseudonymized data as personal data will require action in some jurisdictions, such as the United Kingdom and Greece. The necessary accommodations will require an investment of resources, but this will hopefully ensure that subjects continue to have trust in the integrity of their health care data and the medical research community. The GDPR may still apply should the United Kingdom cease to be a member state of the EU, either because the United Kingdom is a member of the EEA or because the United Kingdom retains these instruments as law, at least for the short term.
Although audit and research are treated differently in law, the boundaries between the 2 activities are blurred. Audit is directly relevant to the monitoring and improvement of the quality of health care; therefore, it is included as a primary use of data—Recitals 52-54 and Article 9.2 (h) and (i) of the GDPR make this clear. Audit and health care management are primary uses of health care data, whereas research is a secondary use—that is, a use different from the originally declared purpose (although it is designated a compatible purpose within the GDPR, but only for nonsensitive data). If an audit compares health care systems to discover which is most effective, it can also be categorized as research, because the practices are not compared to a gold standard and a hypothesis is being generated, or even tested, by finding associations. The recent furor surrounding the Royal Free Trust project in conjunction with Google DeepMind illustrates the debate over the distinction between audit and research.
Dame Fiona Caldicott affirmed in her 2013 report on information governance that “The duty to share can be as important as the duty to protect patient confidentiality”.
Data sharing within the EU should not be obstructed by differences in data protection law, under the principles of the Digital Single Market and Article 1(2) of the Data Protection Directive. Data portability and data sharing are an issue with health care data, which the European Patients Smart Open Services (epSOS) project attempted to address. The GDPR addresses data portability under Article 20, stating that the data subject has the right to receive their data in an appropriate format without hindrance and for the data to be transferred between data controllers where technically feasible. The Bundestag is currently considering an eHealth bill with the same aim of improving the portability of data. This will facilitate the ability of patients to move between health care providers without unnecessary duplication of tests.
The Digital Single Market aims for improved data sharing across the EU, which will facilitate cross-border health care and research. Harmonization will be improved under the GDPR, with a concomitant raising of standards for some countries, although there is still room for national differences according to the reasonable expectations of different publics. This advance makes cross-border projects more easily ethically justifiable and more feasible. The requirements for anonymization have not been changed, except to clarify that pseudonymized data must still be considered personal data. The GDPR will facilitate medical research, except for research that is not considered in the public interest. In that case, the more demanding requirements will entail either true anonymization or consent. It is likely there will be more projects that require either consent or authorization, since many projects currently rely on pseudonymization. There is still an unresolved issue over third parties with access to pseudonymized data.
This work was funded by the AEGLE project, Horizon 2020 ICT/2014/1 grant.
Both authors contributed to the analysis of legal issues and the writing of the manuscript.
Conflicts of Interest
- General Data Protection Regulation 2016/679.: European Union; 2016. URL: http://eur-lex.europa.eu/eli/reg/2016/679/oj [accessed 2017-02-04] [WebCite Cache]
- LRDP Kantor, Centre for Public Reform. Comparative study on different approaches to new privacy challenges in particular in the light of technology developments. 2010. URL: http://ec.europa.eu/justice/policies/privacy/docs/studies/new_privacy_challenges/final_report_en.pdf [accessed 2017-02-04] [WebCite Cache]
- Veerus P, Lexchin J, Hemminki E. Legislative regulation and ethical governance of medical research in different European Union countries. J Med Ethics 2013 May 10;10:1. [CrossRef]
- Reform of EU data protection rules.: European Commission URL: http://ec.europa.eu/justice/data-protection/reform/index_en.htm [accessed 2017-02-04] [WebCite Cache]
- Statement by Vice-President Andrus Ansip at the press conference on the adoption of the Digital Single Market Strategy.: European Commission URL: http://europa.eu/rapid/press-release_SPEECH-15-4926_en.htm [accessed 2017-02-04] [WebCite Cache]
- Marr B. The 5 V's of Big Data.: Data Science Central; 2015. URL: http://www.datasciencecentral.com/profiles/blogs/the-5-v-s-of-big-data-by-bernard-marr [accessed 2017-02-04] [WebCite Cache]
- Analysis: Research and the General Data Protection Regulation - 2012/0011(COD).: Wellcome Trust; 2016. URL: https://wellcome.ac.uk/sites/default/files/new-data-protection-regulation-key-clauses-wellcome-jul16.pdf [accessed 2017-02-04] [WebCite Cache]
- Stevens L. The proposed Data Protection Regulation and its potential impact on social sciences research in the UK. European Data Protection Law Review 2015;1(2):97-112.
- Simon C, L'heureux J, Murray J, Winokur P, Weiner G, Newbury E, et al. Active choice but not too active: public perspectives on biobank consent models. Genet Med 2011 Sep;13(9):821-831 [FREE Full text] [CrossRef] [Medline]
- Hofmann B. Broadening consent—and diluting ethics? J Med Ethics 2009;35:129. [CrossRef]
- Sheehan M. Can broad consent be informed consent? Public Health Ethics 2011;4(3):226-235. [CrossRef]
- Laurie G. Governing the spaces in-between: law and legitimacy in new health technologies. In: Flear ML, Farrell A, Hervey TK, Murphy T, editors. European Law and New Health Technologies. Oxford: Oxford University Press; 2013:193.
- Steinbekk K, Myskja B, Solberg B. Broad consent versus dynamic consent in biobank research: Is passive participation an ethical problem? Eur J Hum Genet 2013;21:897-902. [CrossRef]
- Handyside v United Kingdom.: European Court of Human Rights; 1976. URL: http://hudoc.echr.coe.int/eng [accessed 2017-02-15] [WebCite Cache]
- Haddow G, Laurie G, Cunningham-Burley S, Hunter K. Tackling community concerns about commercialisation and genetic research: A modest interdisciplinary proposal. Soc Sci Med 2007;64:272-282.
- Aitken M. SHIP public engagement: summary of focus groups.: Scottish Health Informatics Programme; 2011. URL: http://www.academia.edu/2702142/SHIP_Public_Engagement_Summary_of_Focus_Groups [accessed 2017-02-04] [WebCite Cache]
- Ipsos MORI. The one-way mirror: Public attitudes to commercial access to health data. London: Wellcome Trust; 2016. URL: https://www.ipsos-mori.com/researchpublications/publications/1803/Commercial-access-to-health-data.aspx [accessed 2017-02-04] [WebCite Cache]
- Opinion 4/2007 on the concept of personal data. Brussels: European Commission; 2007. URL: http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2007/wp136_en.pdf [accessed 2017-02-04] [WebCite Cache]
- Grubb A. Breach of confidence: anonymised information. R. v. Department of Health ex parte Source Informatics Ltd. Med Law Rev 2000;8(1):115-120. [Medline]
- Common Services Agency v Scottish Information Commissioner. 2008. URL: https://www.publications.parliament.uk/pa/ld200708/ldjudgmt/jd080709/comm-1.htm [accessed 2017-02-04] [WebCite Cache]
- Sweeney L. Simple demographics often identify people uniquely. Pittsburgh: Carnegie Mellon University; 2000. URL: http://dataprivacylab.org/projects/identifiability/paper1.pdf [accessed 2017-02-04] [WebCite Cache]
- care.data hangs on engagement. 2015. URL: http://www.digitalhealth.net/2015/10/caldicott-care-data-hangs-on-engagement/ [accessed 2017-02-04] [WebCite Cache]
- NHS England to close care.data programme following Caldicott Review.: National Health Executive; 2016. URL: http://www.nationalhealthexecutive.com/Health-Care-News/nhs-england-to-close-caredata-programme-following-caldicott-review [accessed 2017-02-04] [WebCite Cache]
- Anonymisation: managing data protection risk code of practice. London: Information Commissioner's Office; 2012. URL: https://ico.org.uk/for-organisations/guide-to-data-protection/anonymisation/ [accessed 2017-02-05] [WebCite Cache]
- Laurie G, Jones K, Stevens L, Dobbs C. A review of evidence relating to harm resulting from uses of health and biomedical data.: Nuffield Council on Bioethics; 2014. URL: http://nuffieldbioethics.org/wp-content/uploads/A-Review-of-Evidence-Relating-to-Harms-Resulting-from-Uses-of-Health-and-Biomedical-Data-FINAL.pdf [accessed 2017-02-05] [WebCite Cache]
- Data protection and research in the European Union.: European Forum for Good Clinical Practice; 2015. URL: http://www.efgcp.eu/downloads/DP%20and%20Research%20in%20EU_HD_Final_06%2010%2015.pdf [accessed 2017-02-05] [WebCite Cache]
- Carter P, Laurie G, Dixon-Woods M. The social licence for research: why care.data ran into trouble. J Med Ethics 2015:1. [CrossRef]
- Mason R. Theresa May's “great repeal bill”: what's going to happen and when?.: The Guardian; 2016 Oct 02. URL: https://www.theguardian.com/politics/2016/oct/02/theresa-may-great-repeal-bill-eu-british-law [accessed 2017-02-05] [WebCite Cache]
- Wade D. Ethics, audit, and research: all shades of grey. BMJ 2005 Feb 26;330(7489):468-471 [FREE Full text] [CrossRef] [Medline]
- Hodson H. Google knows your ills. New Scientist 2016 May;230(3072):22-23. [CrossRef]
- Shah NR, Seger AC, Seger DL, Fiskio JM, Kuperman GJ, Blumenfeld B, et al. Improving acceptance of computerized prescribing alerts in ambulatory care. J Am Med Inform Assoc 2006;13(1):5-11 [FREE Full text] [CrossRef] [Medline]
- ICO probes Google DeepMind patient data-sharing deal with NHS Hospital Trust.: ComputerWeekly.com; 2016. URL: http://www.computerweekly.com/news/450296175/ICO-probes-Google-DeepMind-patient-data-sharing-deal-with-NHS-Hospital-Trust [accessed 2017-02-05] [WebCite Cache]
- Caldicott F. Information: To share or not to share? The Information Governance Review. London: Department of Health; 2013. URL: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/192572/2900774_InfoGovernance_accv2.pdf [accessed 2017-02-05] [WebCite Cache]
- Kish L, Topol E. Unpatients—why patients should own their medical data. Nat Biotechnol 2015 Sep;33(9):921-924. [CrossRef] [Medline]
- Cross-border health project epSOS: what has it achieved?.: European Commission; 2014. URL: https://ec.europa.eu/digital-single-market/en/news/cross-border-health-project-epsos-what-has-it-achieved [accessed 2017-02-05] [WebCite Cache]
- Act on secure digital communication and applications in the health care system (E-Health Act).: Federal Ministry of Health; 2015. URL: http://www.bundesgesundheitsministerium.de/en/health/e-health-act.html [accessed 2017-02-05] [WebCite Cache]
- Dove ES, Townend D, Meslin EM, Bobrow M, Littler K, Nicol D, et al. Ethics review for international data-intensive research. Science 2016 Mar 25;351(6280):1399-1400 [FREE Full text] [CrossRef] [Medline]
DPD: Data Protection Directive
EEA: European Economic Area
epSOS: European Patients Smart Open Services
EU: European Union
GDPR: General Data Protection Regulation
NHS: National Health Service
Edited by G Eysenbach; submitted 07.12.16; peer-reviewed by E Afari-kumah, B Knoppers; comments to author 04.01.17; revised version received 19.01.17; accepted 21.01.17; published 24.02.17
©John Mark Michael Rumbold, Barbara Pierscionek. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 24.02.2017.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.