This article is part of “Health Care 2024,” a survey-driven series of online debates in which POLITICO will explore how the European Union can best tackle health policy.
Big data has the potential to provide personalized treatment, help researchers tackle the latest diseases and better anticipate epidemics. But it also requires patients to entrust governments and companies with vast troves of personal data.
In this installment of Health Care 2024 — a series of symposiums asking leading experts to weigh in on the health care priorities for the next European Commission — POLITICO asks: Can the EU gain benefits from sharing health data without weakening privacy — and if so, how?
Always seek consent
Jennifer Goldsack is executive director of the Digital Medicine Society (DiMe).
The digital era of medicine offers huge benefits to patients and society. Big data can accelerate research, improve health and lower costs. However, its arrival in the field of health care comes with new risks and challenges, in particular for privacy, that our industry is poorly equipped to manage.
As vast amounts of data are becoming increasingly available, guaranteeing that health data can be stripped of all information indicating the identity of the contributor — “data de-identification” in the parlance of the field — will be nearly impossible. To ensure privacy, individuals must maintain control over the extent, timing and circumstances of how their information is shared and used. We must improve how we seek consent from the individual citizens who are sources of health data and how we inform them about the potential consequences.
We must recognize that the sharing of health data is no longer restricted to the clinic. New industries and actors — including app developers, wearable technology manufacturers and social media platforms — are collecting and storing health data. This sometimes occurs without an individual’s knowledge, and often without anything resembling informed consent.
To protect privacy, we must make sure our safeguards keep pace with technology. Appropriate regulation is certainly one strategy for achieving this goal. Equally importantly, we must engage new experts — technology manufacturers, data scientists, security researchers, educators and cyber- and medical ethicists — in our efforts.
Earn patient trust
Phil Booth is coordinator of medConfidential.
Strong privacy rules are a prerequisite for research and medical advances; they are not a barrier. If patients don’t trust that their data will be used responsibly, they won’t share it with anyone.
Patients routinely trust researchers with data on the basis that research in the public interest will be conducted safely, published openly and advance the public good. No legitimate researcher would want to use data from patients who do not wish their medical records to be used. That would not only be unethical and unprofessional, it would be utterly wrong — though, for those who simply don’t care about such things, it might still be highly profitable.
We must guard against pressure from vested interests for weaker protections to generate private gains. If patients cannot trust that what they say to their doctor will remain confidential, they will stop sharing data. Should that come to pass, individual patients who choose not to share information will put their own health at risk, and the harm to public health will far outweigh whatever benefits these privacy violations will offer the few.
Make data available
Duane Schulthess is managing director of Vital Transformation.
The problem is actually very simple: Big data is useless without access to data. Period. And in the EU that’s an especially big challenge.
In the U.S., data can be shared or commercialized if it is de-identified and anonymized. This has created a robust market for data-driven research. Even small amounts of data, such as for the CAR-T gene therapy used to treat cancer, can help governments make decisions on pricing and access. Meanwhile, China, a country of 1 billion people, is moving toward a national registry. Even if a condition is extremely rare — say, one in 1 million — that means a Chinese database could yield 1,000 potential research subjects.
Europe must realize that it is possible for privacy concerns to hamper research and cripple industry. R&D will move where data access is available.
And while patient concerns are at the center of the EU debate, they’re not the only factor holding back access to data. Europe should have at its disposal the most robust longitudinal health datasets in the world. It doesn’t, often due to localized politics and the Balkanization of data ownership for petty political gains. As a result, we don’t have patient-level datasets large enough to do the type of research that is increasingly needed.
Standardize and regulate data access
Provided the right safeguards are in place, it is clearly possible to reap the benefits of sharing health data without weakening privacy. Cross-border sharing within the EU has the potential to accelerate medical research, deliver on the promise of personalized medicine, unlock European innovation in AI-based health care, prevent the spread of disease, better manage population health, and enable health care system reform. The Commission’s Recommendation on a European Electronic Health Record (EHR) exchange format is clearly a major step in the right direction.
While some reluctance remains among EU member countries to share EHRs, consumers are already attuned to the possibilities. Many already seek help from specialists outside their geographical health care system, and for informed decision-making those specialists need to have the full picture.
The interoperability and openness of data through standardized interfaces and APIs (application programming interfaces) form an irreversible trend that Philips recognizes and fully endorses. The ability to innovate must not be restricted to those with the deepest pockets. However, the devil will be in the details. To boost innovation in health care and AI, the EU needs to follow through with concrete measures not only to enable and standardize access to the health data contained in its public health care systems, but also to regulate it — regulation that needs to be set at EU level and solidly grounded in the ethical and societal values that lie at the heart of Europe.
Protect hospitals from cyberattacks
Stefaan Van Hoornick is a senior sales engineer for Benelux at Trend Micro.
Efficient data sharing is one of the key reasons for rising life expectancy across the world. However, as hospitals become increasingly connected to the internet, they also become more vulnerable to cyberattacks — putting critical patient data and medical records at risk of exposure.
The 2017 WannaCry cyberattack, which endangered lives in hospitals as doctors were forced to cancel appointments and surgeries, was a wake-up call. It highlighted the risks posed by medical devices that are connected to the internet without proper protection. Stolen data could be monetized in multiple ways, including blackmail, industrial espionage and identity theft.
Hospitals should be extremely sensitive to these risks. Our research shows that a large number of hospitals are inadequately protected. Hospital IT teams must be well trained and made aware of the risks. They should consider technical solutions such as network segmentation, breach detection systems and anti-phishing and anti-malware software.
Give patients control
Virginie Bros-Facer is scientific director of EURORDIS — Rare Diseases Europe.
Patients with rare diseases know better than anyone the value of sharing data in the context of research. Data sharing can provide a greater evidence base for improving clinical outcomes, supporting the development of drugs and devices. It speeds up the diagnostic process, improves its accuracy and consequently reduces health costs.
But that doesn’t mean patients don’t want to control who has access to their data. We recently conducted the first multi-country survey on rare disease data sharing and protection, soliciting the views of more than 2,000 respondents from 66 countries representing 600 diseases. While practically all patients with rare diseases and patient representatives are supportive of data-sharing initiatives to foster research and improve health care, 80 percent also want full or near-to-full control over the data they share.
Those in charge of data-sharing initiatives must develop and implement robust standards to ensure secure, ethical and responsible data sharing. They must also put in place safeguards to make sure the data they collect is adequately protected. Involving actors trusted by patients, such as advocacy organizations, patient organizations and non-profit organizations, as well as health care professionals, would also be a step in the right direction.
Measures such as these will help reassure patients with rare diseases that they will not put their personal information at risk if they participate in data-sharing initiatives. By ensuring that patients can express their preferences and have access to research outcomes, we can boost innovation and provide hope for the whole community.
Keep an eye on big tech
Estelle Massé is senior policy analyst at Access Now.
Tech companies are moving into health care across the EU, often in public-private partnerships with national health services. The U.K.’s National Health Service has just announced it will team up with Amazon to allow elderly people and persons with disabilities to access diagnosis information through the tech company’s home assistant Alexa. The announcement comes days after it was confirmed that Alexa constantly listens in on its owners’ conversations and makes them available to Amazon employees. The privacy consequences of turning a home spying device into a health assistant are vast, as Alexa knows nothing of the concept of patient confidentiality.
Past experience shows that companies are failing to protect data. In 2015, the U.K. Information Commissioner’s Office, the country’s privacy watchdog, found that Google’s DeepMind and hospitals made an illegal deal to process the health data of 1.6 million patients. Vulnerable patients are nudged to entrust companies with their most private information, but these companies have done very little to be worthy of that trust.
Public health services are saturated and under pressure across the EU. But the EU must not hastily accept whatever solutions companies offer without guarantees for people’s rights. Privacy is not a luxury, and these programs may also have a discriminatory effect on access to public health. So why should the most vulnerable patients give away their privacy rights?
Facilitate data exchanges
Loubna Bouarfa is founder and CEO of OKRA.
Big data — and artificial intelligence in particular — has the potential to deliver big improvements in health care. For example, at OKRA we develop AI systems that learn from the patterns of similar patients to predict the right treatment for individual patients at scale. AI can also help physicians identify early signs of specific diseases, triage patients for screening, refer them to the right specialist and identify the right treatment — saving costs and lives.
But AI is limited by the data made available to it. And although it’s important to regulate health data exchange, such regulation must also enable and provide incentives for the digitalization of health data across Europe. There is, unfortunately, no progress without risk. However, by minimizing risks through a sandboxing approach — for instance, testing AI diagnoses on patients who are also being evaluated independently by doctors — we can increase the pace of innovation and save lives.
Exchanging health data should be made compulsory for all EU member states, and not just for the reimbursement of providers as is the case today. Data must be made available in a format that can be easily used by AI systems and should include information from electronic health records, imaging, claims, academic studies and clinical registries. The exchangeable format should also include outcome-related information that can be used to train AI systems.
To ensure privacy, patient data should always be anonymized and de-identified prior to any exchange.