Summary
The digitisation of health data creates opportunities for more personalised healthcare and prevention. When combined, different digital services make it possible to access, share and use electronic health data, including outside the healthcare domain. The public and political discussion no longer centres on the mere digitisation of patient records. A further aim is to activate people to work on improving their health using their own data. The expectation is that by controlling their data, people will be able to take charge of their healthcare.
At the same time, it remains to be seen whether this will lead to better healthcare advice, whether people will actually manage their health better, and whether it is even desirable for them to control more of their data. This report shows that responsible and secure data sharing is best achieved by remaining small in scale and by focusing on what is truly necessary. It gives government, the healthcare sector and policymakers the tools they need to ensure that digital health data services are used for the benefit of a ‘socially responsible digital society’. Central here are the quality of the data and good, appropriate healthcare, with people being protected against the unwanted use of their data.
Previous research on personal health management
In May 2018, the Rathenau Instituut published the report Responsible digital health management. More data, more control? in response to the Processing of Personal Data in Healthcare (Additional Provisions) Act (WABVPZ) and the development and implementation of a system for accessing and sharing health data (Niezen & Verhoef, 2018). The WABVPZ is the follow-up to the Electronic Health Record (EHR) Act, whose introduction was blocked by the Dutch Senate in 2011. Particularly controversial was the mandatory connection to the National Health Data Switchboard (Landelijk Schakelpunt, LSP), as the Senate considered that it did not guarantee secure and protected data or data transfer. The new Act and the development of the necessary technical infrastructure are meant to encourage people to take charge of their digital health data and allow them to take decisions about data sharing.
One of the main conclusions of the 2018 report is that stakeholders involved in developing the system of digital access and sharing of health data did not look closely enough at the changes that will be necessary in healthcare practice, at the different types of patients and their needs, at the role of healthcare practitioners in implementation, or at the use of digital data by third parties outside the healthcare system. The report also concludes that public values, such as autonomy and solidarity, are under pressure. People’s ability to take independent decisions about their health by accessing and sharing their health data depends on their being capable of interpreting what these health data actually say about their health. Not everyone can interpret such data correctly, and not everyone wants to make decisions on their own. One consequence of the speed with which digital personal health management is being introduced is a failure to consider whether and how it supports or, conversely, curtails people’s autonomy, and what should be done to ensure that autonomy.
Electronic data sharing services under scrutiny
The Rathenau Instituut’s report Health at the centre. Responsible data sharing in the digital society investigates digital services that make it possible for healthcare professionals, individuals and, potentially, third parties (insurers, companies and researchers) to share health data. Sharing in this way blurs the dividing line between the medical and non-medical domains. What does this mean for healthcare practice, and for the degree to which people can take and maintain control over their data and, consequently, manage their health?
We report on four digital services that are emerging in parallel: online portals in the mental health and addiction care sector, lifestyle and medical apps (health apps), personal health environments (PHEs, or personal data vaults), and public platforms (collective databases).
We studied eleven cases in all, divided across the four categories of digital services. The case studies are based on a study of the literature (both academic and grey literature publications, including policy documents), forty semi-structured interviews, and conversations with experts and stakeholders involved in developing or facilitating the various digital health data services. Research pertaining to the case studies took place from October 2017 to October 2018. The first steps towards formulating possible actions were taken during a workshop in May 2018 attended by nineteen professionals in the field.
Findings
Our research has turned up good practice examples in the development and use of digital health data services. In the mental health and addiction care sector, online portals are used to support shared decision-making by healthcare professionals, patients and, in some cases, their families. Online and face-to-face care are ‘blended’, and developers increasingly offer healthcare professionals and healthcare providers feedback on how the services are being used. Specific medical apps, such as MS Sherpa (which is still being trialled), help users to better manage their disease. The intention is to use self-learning software to make predictions about the course of a user’s illness or state of wellbeing in the near future, so that patients and healthcare professionals can anticipate accordingly.
Personal health management improves when healthcare professionals and patients consult about how to use the app. At present, there are no examples of how PHEs bearing the MedMij (‘MedicalMe’) quality mark are being used, as the first PHEs have yet to be certified. We have noted, however, that the two active PHEs in our study (neither of which is MedMij-certified) are both being offered within a healthcare setting and focus on regional cooperation.
In these examples, we see that the responsible parties are cooperating successfully in living labs on using data to improve healthcare and health and to develop evidence-based interventions. In this context, the point is not (only) to collect as much data as possible but to ensure good data quality and meaningful analysis leading to better healthcare.
Our study also shows that the parallel emergence of the various services and their networked nature, combined with more data sharing and data linkage, is accelerating the effects of the digital transformation on healthcare, on individuals, and on society as a whole. People not only gain more control over their data but in fact also relinquish control. The monopolisation of health data by large companies is of particular concern because it further skews the already unequal power relationship between commercial parties and patients.
It is also more difficult to monitor the quality of the data being shared and used and the transparency of the analyses, raising questions about the accuracy of advice and about who is liable if something goes wrong. In addition, most of the services currently available are being used by only part of the population, i.e. chronically ill patients and people in good health. This is a particularly sensitive issue in the complex field of healthcare because there is a risk of people being excluded.
Conclusions and possible actions
We have reached three conclusions. In each case, we discuss which actions can be taken in addition to existing initiatives in policymaking, research and healthcare practice to ensure that the digital transition in healthcare is based on responsible data sharing. This means that everyone has equal access to the services and that we consider the impact of the services on healthcare practice, society and public values. Only if the quality of the data is good, data transfer is protected and secure, and there is no pressure to share data, can digital sharing contribute to such social aims as good quality healthcare, personal health and sickness prevention.
1. There is a lack of frameworks governing the use of digital health data services and no coordination of such use, either in the medical domain itself or in its interaction with the non-medical domain.
Data sharing extends beyond the healthcare domain. With the various health data services becoming increasingly interlinked, health data will circulate outside the familiar doctor-patient relationship on an ever-widening scale within a network of public and private partners. So far, rules applicable within the medical domain (e.g. medical ethical reviews) are not being informed by rules outside that domain (e.g. the GDPR) or vice versa. It would be advantageous if they did inform each other, however, since non-medical data can also tell us something about our health.
- Establish ownership of the various responsibilities, including liability in medical interventions, more explicitly in agreements
It is clear that no one ‘owns’ many of the constituent problems (interoperability, organisational obstacles, privacy, liability if something goes wrong), especially in the less regulated non-medical domain. Clarify existing agreements and allocate responsibilities, for example healthcare professionals’ liability when using data originating from their patients’ digital services, and the responsibility of individuals when sharing their data with third parties, including an explanation of what could happen if they are not careful about sharing.
- Establish broad codes of conduct for the development of services, including services that lie outside the medical domain
Ensure that common (action-ethical) frameworks and forms of oversight within the medical domain can also be used in or adapted for the less regulated non-medical domain. For example, a code of conduct for developers and service providers, including those that make use of artificial intelligence, would extend the scope of responsibility and awareness beyond data security and privacy alone. Examples include the codes of conduct that the European Commission has already initiated with regard to disinformation and privacy in mHealth, and the Artificial Intelligence Impact Assessment (AIIA) recently launched by Electronic Commerce Platform Nederland and TNO.
- Maximise learning from best practices in healthcare
Governance of healthcare digitalisation, a government task, should no longer focus on encouraging the sharing of as much data as possible, but on recognising and implementing excellent initiatives. Organise a platform or other mechanism to identify best practice solutions both for the technology itself and for its practical implementation and evaluation.
2. There are not enough safeguards in the data chain, i.e. the processes of generating, accessing, sharing and using health data.
Trust mechanisms are underdeveloped in the data chain. This is about trusting ourselves; trusting our capacity to think and act when accessing, checking, interpreting and sharing (or consenting to share) our digital health data. We must trust that we are not alone in this, but can make the right decisions in cooperation with healthcare professionals and/or our loved ones. We must also be able to trust the quality and reliability of the services and the data that are shared.
- Build on the concept of patient confidentiality and supplement it with technological citizenship
People must trust that they are in fact capable of taking charge of their own data. Develop the concept of ‘patient confidentiality’ such that it protects data not currently protected under the aegis of medical confidentiality, and promote technological citizenship by continuing to invest in digital skills, by involving the public in digital innovations and, more specifically, by establishing an authority or a fund that provides guidance.
- Define precisely what shared decision-making entails
It is important to clarify who is responsible for initiating shared decision-making between healthcare professionals and patients about data components: the healthcare professional (and if so, which one?), the individual, and/or an independent third party. The combination may differ depending on the healthcare context and service involved. In addition to specified consent[1], we should be investigating dynamic forms of consent such as those used in MIDATA.
- Make safeguards ensuring the quality and reliability of data and data sharing transparent and put appropriate oversight mechanisms into place
Developers of services should be required to explain how they guarantee the quality and reliability of data and data sharing. This not only means that they should, for example, have the necessary CE Mark but also that they should provide explanations that are comprehensible to the user, for example about the medical standards that they have applied. There should be independent quality marks for every type of service. The Dutch Data Protection Authority (AP) and the Health and Youth Care Inspectorate (IGJ) should cooperate as ‘watchdogs’, for example to exclude providers that do not have the MedMij label (or other quality mark for services other than PHEs).
3. There are limits to personal health management; equal access to healthcare and health is not sufficiently guaranteed.
There are threats to the voluntary nature of people’s control over their health data. To persuade people to share their data in support of healthcare for themselves and others (and to make it more affordable), we need more comprehensive safeguards addressing the voluntary nature of data sharing and the real benefits for personal health management.
- A governance system must be established that will strike the right balance between the individual and the collective interest
Data solidarity may well erode the voluntary nature of public participation in digital health data services. Developers of services, healthcare providers, patient representatives, government and companies will have to work together on protecting and promoting autonomy, data sharing for the public benefit, and a solidarity-based healthcare system. The fund mentioned under conclusion 2 above can also play an important role here, encouraging people to share data but also ensuring that they are compensated if something goes wrong.
- Never lose sight of the right not to be measured, analysed or coached, and the right to meaningful human contact
People who are uninterested in digital healthcare services must also be able to depend on receiving good quality healthcare and on having equal access to healthcare. Healthcare providers and patient representatives must continue to stand up for these people, even if health insurers and government insist on more efficient and cost-effective healthcare.
Final remarks
Concerns about privacy and confusion about responsibilities prevented the introduction of a national EHR in the Netherlands. A new law and additional measures are bringing secure digital data sharing a step closer. This study shows that further steps are needed to ensure responsible digital sharing of people’s most sensitive data. This is particularly important in the light of recent revelations concerning the major commercial interests involved in medical data. In the past few years, we have seen personal data being used in a manner that erodes democracy and the rule of law.
The Netherlands is at the cutting edge worldwide in digital healthcare applications, with best practice examples being developed in cooperation with users, healthcare practitioners and researchers. Government should no longer focus on sharing as much data as possible but on encouraging and continuing to implement these best practice examples. The quality of the data, healthcare that respects human dignity, and health itself are at the centre here, with people being protected against the unwanted use of their data.