We have reached three conclusions. In each case, we discuss which actions can be taken in addition to existing initiatives in policymaking, research and healthcare practice to ensure that the digital transition in healthcare is based on responsible data sharing. This means that everyone has equal access to the services and that we consider the impact of the services on healthcare practice, society and public values. Only if the quality of the data is good, data transfer is protected and secure, and there is no pressure to share data, can digital sharing contribute to such social aims as good quality healthcare, personal health and sickness prevention.
1. There is a lack of frameworks governing the use of digital health data services and no coordination of such use, either within the medical domain itself or in its interaction with the non-medical domain.
Data sharing extends beyond the healthcare domain. With the various health data services becoming increasingly interlinked, health data will circulate outside the familiar doctor-patient relationship on an ever-widening scale within a network of public and private partners. So far, rules applicable within the medical domain (e.g. medical ethical reviews) are not being informed by rules outside that domain (e.g. the GDPR) or vice versa. It would be advantageous if they did inform each other, however, since non-medical data can also tell us something about our health.
- Establish ownership of the various responsibilities, including liability in medical interventions, more explicitly in agreements
Many of the constituent problems (interoperability, organisational obstacles, privacy, liability if something goes wrong) clearly have no 'owner', especially in the less regulated non-medical domain. Clarify existing agreements and allocate responsibilities, for example healthcare professionals' liability when using data originating from their patients' digital services, and the responsibility of individuals when sharing their data with third parties, including an explanation of what could happen if they are not careful about sharing.
- Establish broad codes of conduct for the development of services, including services that lie outside the medical domain
Ensure that common (action-ethical) frameworks and forms of oversight within the medical domain can also be used in or adapted for the less regulated non-medical domain. For example, a code of conduct for developers and service providers, including those that make use of artificial intelligence, would extend the scope of responsibility and awareness beyond data security and privacy alone. Examples include the codes of conduct that the European Commission has already initiated with regard to disinformation and privacy in mHealth, and the Artificial Intelligence Impact Assessment (AIIA) recently launched by Electronic Commerce Platform Nederland and TNO.
- Maximise learning from best practices in healthcare
Governance of healthcare digitalisation, a government task, should no longer focus on encouraging the sharing of as much data as possible, but on recognising and implementing excellent initiatives. Organise a platform or other mechanism to identify best practice solutions, covering both the technology itself and its practical implementation and evaluation.
2. There are not enough safeguards in the data chain, i.e. the processes of generating, accessing, sharing and using health data.
Trust mechanisms are underdeveloped in the data chain. This is about trusting ourselves; trusting our capacity to think and act when accessing, checking, interpreting and sharing (or consenting to share) our digital health data. We must trust that we are not alone in this, but can make the right decisions in cooperation with healthcare professionals and/or our loved ones. We must also be able to trust the quality and reliability of the services and the data that are shared.
- Build on the concept of patient confidentiality and supplement it with technological citizenship
People must trust that they are in fact capable of taking charge of their own data. Develop the concept of ‘patient confidentiality’ such that it protects data not currently protected under the aegis of medical confidentiality, and promote technological citizenship by continuing to invest in digital skills, by involving the public in digital innovations and, more specifically, by establishing an authority or a fund that provides guidance.
- Define precisely what shared decision-making entails
It is important to clarify who is responsible for initiating shared decision-making between healthcare professionals and patients about data components: the healthcare professional (and which one?), the individual, and/or an independent third party? The combination may differ depending on the healthcare context and service involved. In addition to specified consent, we should be investigating dynamic forms of consent such as those used in MIDATA.
- Make safeguards ensuring the quality and reliability of data and data sharing transparent and put appropriate oversight mechanisms into place
Developers of services should be required to explain how they guarantee the quality and reliability of data and data sharing. This not only means that they should, for example, have the necessary CE Mark but also that they should provide explanations that are comprehensible to the user, for example about the medical standards that they have applied. There should be independent quality marks for every type of service. The AP (Dutch Data Protection Authority) and IGJ (Health and Youth Care Inspectorate) 'watchdogs' should cooperate, for example to exclude providers that do not have the MedMij label (or another quality mark for services other than PHEs).
3. There are limits to personal health management; equal access to healthcare and health are not sufficiently guaranteed.
There are threats to the voluntary nature of people’s control over their health data. To persuade people to share their data in support of healthcare for themselves and others (and to make it more affordable), we need more comprehensive safeguards addressing the voluntary nature of data sharing and the real benefits for personal health management.
- A governance system must be established that will strike the right balance between the individual and the collective interest
Data solidarity may well erode the voluntary nature of public participation in digital health data services. Developers of services, healthcare providers, patient representatives, government and companies will have to work together on protecting and promoting autonomy, data sharing for the public benefit, and a solidarity-based healthcare system. The fund mentioned under conclusion 2 above can also play an important role here, encouraging people to share data but also seeing that they are compensated if something goes wrong.
- Never lose sight of the right not to be measured, analysed or coached, and the right to meaningful human contact
People who are uninterested in digital healthcare services must also be able to depend on receiving good quality healthcare and on having equal access to healthcare. Healthcare providers and patient representatives must continue to stand up for these people, even if health insurers and government insist on more efficient and cost-effective healthcare.
Concerns about privacy and confusion about responsibilities prevented the introduction of a national EHR in the Netherlands. A new law and additional measures are bringing secure digital data sharing a step closer. This study shows that further steps are needed to ensure responsible digital sharing of people’s most sensitive data. This is particularly important in the light of recent revelations concerning the major commercial interests involved in medical data. In the past few years, we have seen personal data being used in a manner that erodes democracy and the rule of law.
The Netherlands is at the cutting edge worldwide in digital healthcare applications, with best practice examples being developed in cooperation with users, healthcare practitioners and researchers. Government should no longer focus on sharing as much data as possible but on encouraging and continuing to implement these best practice examples. The quality of the data, healthcare that respects human dignity and health itself are at the centre here, with people being protected against the unwanted use of their data.