Health at the centre

Responsible data sharing in the digital society


For this report, the Rathenau Instituut has investigated examples of digital services that enable patients and healthy people to share data with a view to improving their health. We show that responsible and secure data sharing is best achieved by remaining small in scale and by focusing on what is truly necessary. This report gives government, the healthcare sector and politicians the tools they need to ensure that digital health data services are used in a way that is compatible with a ‘socially responsible digital society’.

The digitisation of health data creates opportunities for more personalised healthcare and prevention. When combined, different digital services make it possible to access, share and use electronic health data, including outside the healthcare domain. The public and political discussion no longer centres on the mere digitisation of patient records. A further aim is to encourage people to actively improve their health using their own data. The expectation is that by controlling their data, people will be able to take charge of their healthcare.

At the same time, it remains to be seen whether this will lead to better healthcare advice, whether people will actually manage their health better, and whether it is even desirable for them to control more of their data. This report shows that responsible and secure data sharing is best achieved by remaining small in scale and by focusing on what is truly necessary. It gives government, the healthcare sector and policymakers the tools they need to ensure that digital health data services are used for the benefit of a ‘socially responsible digital society’. Good-quality data and good, appropriate healthcare are at the centre here, with people being protected against the unwanted use of their data.

Previous research on personal health management

In May 2018, the Rathenau Instituut published the report Responsible digital health management. More data, more control? in response to the Processing of Personal Data in Healthcare (Additional Provisions) Act (WABVPZ) and the development and implementation of a system for accessing and sharing health data (Niezen & Verhoef, 2018). The WABVPZ is the follow-up to the Electronic Health Record (EHR) Act, whose introduction was blocked by the Dutch Senate in 2011. Particularly controversial was the mandatory connection to the National Health Data Switchboard (Landelijk Schakelpunt, LSP), as the Senate considered that it did not guarantee secure and protected data or data transfer. The new Act and the development of the necessary technical infrastructure are meant to encourage people to take charge of their digital health data and allow them to take decisions about data sharing.

One of the main conclusions of the 2018 report is that stakeholders involved in developing the system of digital access and sharing of health data did not look closely enough at the changes that will be necessary in healthcare practice, at the different types of patients and their needs, at the role of healthcare practitioners in implementation, or at the use of digital data by third parties outside the healthcare system. The report also concludes that public values, such as autonomy and solidarity, are under pressure. People’s ability to take independent decisions about their health by accessing and sharing their health data depends on their being capable of interpreting what these health data actually say about their health. Not everyone can interpret such data correctly, and not everyone wants to make decisions on their own. One consequence of the speed with which digital personal health management is being introduced is a failure to consider whether and how it supports or, conversely, curtails people’s autonomy, and what should be done to ensure that autonomy.

Electronic data sharing services under scrutiny

The Rathenau Instituut’s report Health at the centre. Responsible data sharing in the digital society investigates digital services that make it possible for healthcare professionals, individuals and, potentially, third parties (insurers, companies and researchers) to share health data. Sharing in this way blurs the dividing line between the medical and non-medical domains. What does this mean for healthcare practice, and for the degree to which people can take and maintain control over their data and, consequently, manage their health?

We report on four digital services that are emerging in parallel: online portals in the mental health and addiction care sector, lifestyle and medical apps (health apps), personal health environments (PHEs, personal data vaults), and public platforms (collective databases).

We studied eleven cases in all, divided across the four categories of digital services. The case studies are based on a study of the literature (both academic and grey literature publications, including policy documents), forty semi-structured interviews, and conversations with experts and stakeholders involved in developing or facilitating the various digital health data services. Research pertaining to the case studies took place from October 2017 to October 2018. The first steps towards formulating possible actions were taken during a workshop in May 2018 attended by 19 professionals in the field.

Findings

Our research has turned up good practice examples in the development and use of digital health data services. In the mental health and addiction care sector, online portals are used to support shared decision-making by healthcare professionals, patients and, in some cases, their families. Online and face-to-face care are ‘blended’, and developers increasingly offer healthcare professionals and healthcare providers feedback on how the services are being used. Specific medical apps, such as MS Sherpa (which is still being trialled), help users to better manage their disease. The intention is to use self-learning software to make predictions about the course of a user’s illness or state of wellbeing in the near future, so that patients and healthcare professionals can anticipate accordingly.

Personal health management improves when healthcare professionals and patients consult about how to use the app. At present, there are no examples of how PHEs bearing the ‘MedicalMe’, or MedMij in Dutch, quality mark are being used, as the first PHEs have yet to be certified. We have noted, however, that the two active PHEs in our study (neither of which is MedMij-certified) are both being offered within a healthcare setting and focus on regional cooperation.

In these examples, we see that the responsible parties are cooperating successfully in living labs on using data to improve healthcare and health and to develop evidence-based interventions. In this context, the point is not (only) to collect as much data as possible but to ensure good data quality and meaningful analysis leading to better healthcare.

Our study also shows that the parallel emergence of the various services and their networked nature, combined with more data sharing and data linkage, is accelerating the effects of the digital transformation on healthcare, on individuals, and on society as a whole. People not only gain more control over their data but in fact also relinquish control. The monopolisation of health data by large companies is of particular concern because it skews the already unequal power relationship between commercial parties and patients even further.

It is also more difficult to monitor the quality of the data being shared and used and the transparency of the analyses, raising questions about the accuracy of advice and about who is liable if something goes wrong. In addition, most of the services currently available are being used by only part of the population, i.e. chronic patients and people in good health. This is a particularly sensitive issue in the complex field of healthcare because there is a risk of people being excluded.

Preferred citation:
Niezen, M.G.H., Edelenbosch, R., Van Bodegom, L. & Verhoef, P. (2019). Gezondheid centraal – Zorgvuldig data delen in de digitale samenleving. Den Haag: Rathenau Instituut


We have reached three conclusions. In each case, we discuss which actions can be taken in addition to existing initiatives in policymaking, research and healthcare practice to ensure that the digital transition in healthcare is based on responsible data sharing. This means that everyone has equal access to the services and that we consider the impact of the services on healthcare practice, society and public values. Only if the quality of the data is good, data transfer is protected and secure, and there is no pressure to share data, can digital sharing contribute to such social aims as good quality healthcare, personal health and sickness prevention.

1. There is a lack of frameworks governing the use of digital health data services and no coordination of such use, either in the medical domain itself or in its interaction with the non-medical domain.

Data sharing extends beyond the healthcare domain. With the various health data services becoming increasingly interlinked, health data will circulate outside the familiar doctor-patient relationship on an ever-widening scale within a network of public and private partners. So far, rules applicable within the medical domain (e.g. medical ethical reviews) are not being informed by rules outside that domain (e.g. the GDPR) or vice versa. It would be advantageous if they did inform each other, however, since non-medical data can also tell us something about our health.

  • Assign ownership of the various responsibilities, including liability in medical interventions, more explicitly in agreements

It is clear that no one ‘owns’ many of the constituent problems (interoperability, organisational obstacles, privacy, liability if something goes wrong), especially in the less regulated non-medical domain. Clarify existing agreements and allocate responsibilities, for example healthcare professionals’ liability when using data originating from their patients’ digital services, and the responsibility of individuals when sharing their data with third parties, including an explanation of what could happen if they are not careful about sharing.

  • Establish broad codes of conduct for the development of services, including services that lie outside the medical domain

Ensure that common (action-ethical) frameworks and forms of oversight within the medical domain can also be used in or adapted for the less regulated non-medical domain. For example, a code of conduct for developers and service providers, including those that make use of artificial intelligence, would extend the scope of responsibility and awareness beyond data security and privacy alone. Examples include the codes of conduct that the European Commission has already initiated with regard to disinformation and privacy in mhealth, and the Artificial Intelligence Impact Assessment (AIIA) recently launched by Electronic Commerce Platform Nederland and TNO.

  • Maximise learning from best practices in healthcare

Governance of healthcare digitalisation, a government task, should no longer focus on encouraging as much sharing of as much data as possible, but on recognising and implementing excellent initiatives. Organise a platform or other mechanism to identify best practice solutions both for the technology itself and for its practical implementation and evaluation.

2. There are not enough safeguards in the data chain, i.e. the processes of generating, accessing, sharing and using health data.

Trust mechanisms are underdeveloped in the data chain. This is about trusting ourselves; trusting our capacity to think and act when accessing, checking, interpreting and sharing (or consenting to share) our digital health data. We must trust that we are not alone in this, but can make the right decisions in cooperation with healthcare professionals and/or our loved ones. We must also be able to trust the quality and reliability of the services and the data that are shared.

  • Build on the concept of patient confidentiality and supplement it with technological citizenship

People must trust that they are in fact capable of taking charge of their own data. Develop the concept of ‘patient confidentiality’ such that it protects data not currently protected under the aegis of medical confidentiality, and promote technological citizenship by continuing to invest in digital skills, by involving the public in digital innovations and, more specifically, by establishing an authority or a fund that provides guidance.

  • Define precisely what shared decision-making entails

It is important to clarify who is responsible for initiating shared decision-making between healthcare professionals and patients about data components: the healthcare professional (and which one?), the individual, and/or an independent third party? The combination may differ depending on the healthcare context and service involved. In addition to specified consent, we should be investigating dynamic forms of consent such as those used in MIDATA.

  • Make safeguards ensuring the quality and reliability of data and data sharing transparent and put appropriate oversight mechanisms into place

Developers of services should be required to explain how they guarantee the quality and reliability of data and data sharing. This not only means that they should, for example, have the necessary CE Mark but also that they should provide explanations that are comprehensible to the user, for example about the medical standards that they have applied. There should be independent quality marks for every type of service. The Dutch Data Protection Authority (AP) and the Health and Youth Care Inspectorate (IGJ) should cooperate as ‘watchdogs’, for example to exclude providers that do not have the MedMij label (or another quality mark for services other than PHEs).

3. There are limits to personal health management; equal access to healthcare and health is not sufficiently guaranteed.

There are threats to the voluntary nature of people’s control over their health data. To persuade people to share their data in support of healthcare for themselves and others (and to make it more affordable), we need more comprehensive safeguards addressing the voluntary nature of data sharing and the real benefits for personal health management.

  • A governance system must be established that will strike the right balance between the individual and the collective interest

Data solidarity may well erode the voluntary nature of public participation in digital health data services. Developers of services, healthcare providers, patient representatives, government and companies will have to work together on protecting and promoting autonomy, data sharing for the public benefit, and a solidarity-based healthcare system. The fund mentioned under 2a above can also play an important role here, encouraging people to share data but also seeing that they are compensated if something goes wrong.

  • Never lose sight of the right not to be measured, analysed or coached, and the right to meaningful human contact

People who are uninterested in digital healthcare services must also be able to depend on receiving good quality healthcare and on having equal access to healthcare. Healthcare providers and patient representatives must continue to stand up for these people, even if health insurers and government insist on more efficient and cost-effective healthcare.

Final remarks

Concerns about privacy and confusion about responsibilities prevented the introduction of a national EHR in the Netherlands. A new law and additional measures are bringing secure digital data sharing a step closer. This study shows that further steps are needed to ensure responsible digital sharing of people’s most sensitive data. This is particularly important in the light of recent revelations concerning the major commercial interests involved in medical data. In the past few years, we have seen personal data being used in a manner that erodes democracy and the rule of law.

The Netherlands is at the cutting edge worldwide in digital healthcare applications, with best practice examples being developed in cooperation with users, healthcare practitioners and researchers. Government should no longer focus on sharing as much data as possible but on encouraging and continuing to implement these best practice examples. The quality of the data, healthcare that respects human dignity and health itself are at the centre here, with people being protected against the unwanted use of their data.

Frequently Asked Questions

The digitisation of health data creates opportunities for more personalised healthcare and prevention. That is because doctors and other professionals can more easily access medical data, with the consent of the person in question. People can also collect their own health and lifestyle data and, for example, share it with their healthcare practitioner. When combined, different digital services make it possible to access, share and use electronic health data, including outside the healthcare domain.

In May 2018, the Rathenau Instituut published a report entitled Responsible digital health management. More data, more control? (Niezen & Verhoef, 2018) in which it concluded that giving people online access to their medical data does not automatically mean that everyone is willing and able to shoulder the associated responsibility. There must be a greater focus on such core values as autonomy and solidarity, and on developing frameworks and safety nets to protect people against being pressured by third parties to give them access to their health data.

In our report Health at the centre. Responsible data sharing in the digital society, we observe that the public and political debate has moved beyond the digitisation of and access to medical records. People are being encouraged to use digital health data services because doing so gives them more control over their health data and subsequently allows them to take charge of their health. The Rathenau Instituut questions whether it is actually advisable for people to control their digital health data, whether they receive better health advice based on those data, and whether they really do manage their health better as a result.

The Health at the centre report examines the parallel emergence of digital services that allow individuals to access, share and use their health data. We analyse four different categories of services, both separately and in combination: online portals operated by healthcare institutions (that allow us to ‘view’ our medical records and give us access to supporting digital programmes), health apps (‘digital coaches’), personal health environments (PHEs, personal data vaults in which we can manage our own health data digitally from a single, comprehensive overview), and public platforms (collective online databases, in which we can share information and our own health data with others). Our investigation has given us a better understanding of how using digital health data services impacts healthcare, individuals and society as a whole.

Personal data management and data sharing were also possible in the past, for example when people obtained copies of their medical records. However, with the digitisation of health data the scale and scope of data sharing are increasing, with both positive and negative consequences for the relationship between people and healthcare practitioners.

We see best practice examples of shared decision-making between healthcare professionals, patients and, where necessary, loved ones and informal carers that support people in taking decisions about their treatment process and in sharing their health data. In blended care, digital and face-to-face healthcare are attuned to each other, so that patients who wish to do so can participate digitally in their healthcare. However, the integration of shared decision-making and blended care into healthcare practice must certainly not be taken for granted.

At the same time, we see health data circulating outside the familiar doctor-patient relationship on an ever-widening scale within a network of public and private partners. Digital health data services and data sharing have shifted the responsibilities of healthcare professionals, patients and developers, but these changes have not yet been surveyed or identified, causing worry among patients, healthcare practitioners and developers:

  • We are seeing a proliferation of digital tools designed to help policyholders, patients and consumers monitor their health and adjust their lifestyle behaviour. The apps differ considerably in terms of the quality of their tracking and e-coaching. How do patients know that they are using services and data of good quality? In addition, patients are more than just a data source, and they too want to benefit from sharing their data.
  • Healthcare professionals want more clarity about the confidentiality of data in medical records and the threat to that confidentiality when digital copies are removed from their ‘control’.
  • Developers and providers of digital services are conscious that they now bear more responsibility for interoperability, security and data protection, but they are not yet sure how to structure their services in a way that helps people to share their data responsibly.

The idea is that the different digital services help people to understand their health status. In particular, Personal Health Environments (PHEs) – to which people can add copies of their healthcare practitioner’s medical records and data that they track themselves with an app or wearable – will allow people to share data with research institutes or app publishers that provide personalised advice. Once individuals are in control of their health data, they may well become the point of contact not only for healthcare practitioners, but also for third parties.

Commercial parties as well as municipal authorities, the Employee Insurance Agency (UWV) and the Care Needs Assessment Centre (CIZ) may have an interest in the data accumulated in a PHE, a public platform or an app. Not everyone will be sufficiently capable of resisting ‘urgent’ requests for this information. A further risk is that people will not make informed decisions (or be unable to do so) about disclosing their data because they are afraid that a healthcare practitioner or authority will not be able to assist them properly otherwise.

We may also question whether everyone understands the real or potential implications of sharing their data with third parties. For example, what if an employer learns ‘prematurely’ that an employee is pregnant, or if someone is diagnosed (correctly or not) with an illness by a commercial screening service without being offered professional coaching? Another risk of data ending up outside the healthcare sector is that profiling will be used for unwanted advertising or to manipulate people’s behaviour.

The Rathenau Instituut has carried out many studies into the social, economic and ethical effects of digitalisation in recent years. Key findings of these studies that are consistent with the current findings on health data digitisation are:

  • Digitalisation leads to a looping effect in which the virtual environment steers the real world. Digitisation of patient data leads to changes in the healthcare process and alters the roles of healthcare professionals and patients.
  • The large-scale digitalisation of public services gets bogged down when attempts are made to standardise too many different services for the various users and members of the public at the same time. Experts who participated in the current study warn against designing a standardised record to document all the health data of the entire population.
  • It is wrong to assume that digitalisation gives users across the board more control over processes. Some users will actively use and manage their data, others will make passive use of their data, and still others will in fact lose control. This issue is even more sensitive in the complex field of healthcare because there is a risk of exclusion.
  • When digitalisation is aimed at linking as much data on as many people as possible, large-scale platforms emerge that weaken the position of individual users. That is also true in the healthcare domain.
  • Security risks are a growing threat in digitalisation. Not only is privacy under threat, but also system robustness, application continuity and public values. Cybersecurity is therefore an increasingly important criterion for responsible digitalisation, certainly in the healthcare sector.
  • The need to underpin sound professional care with research can all too easily lead to the unchecked transfer of patient data and biological material. The difference between diagnostic research, curiosity-driven research, clinical trials and other forms of health research is unclear to patients.

The report offers several best practice examples of responsible data sharing and the use of digital health data services. But the case studies also teach us three lessons:

Lesson one: We need to clarify or redefine which party is responsible for what when it comes to data sharing, access to health data and quality of care. Only then will we be able to oversee the consequences of using digital data services in the healthcare sector.

The various health data services will be increasingly interlinked, causing health data to circulate outside the familiar doctor-patient relationship on an ever-widening scale within a network of public and private partners. Because non-medical data can also tell us something about our health, it would be advantageous for the rules that apply within and outside the medical domain to inform each other, for example with GDPR implementation informing medical-ethical reviews, and vice versa.

Lesson two: To inspire and maintain trust in responsible data sharing, we need to build on the safeguards within the health data chain, i.e. the processes of generating, accessing, sharing and using health data. There are not enough of these safeguards at the moment.

Possible safeguards:

  • Further elaboration of the concept of patient confidentiality, whereby copies of data from a medical record stored in an online portal or a PHE would be protected automatically, reducing pressure on individuals to share their data.
  • Continuing to support technological citizenship, for example by investing in the digital skills of the public and by involving the public in digital innovations.
  • Establishing an authority or fund to which people can turn for help in deciding about data sharing matters (for example by means of a dashboard that provides an overview and helps them understand what they control) and where they can seek redress if data are shared or used unlawfully or result in an incorrect diagnosis or change of behaviour.
  • Establishing a transparent overview of existing and new quality marks that offer an indication of the quality and reliability of the services.
  • Close collaboration on oversight between the Dutch Data Protection Authority and the Health and Youth Care Inspectorate so that unwelcome services that put the quality and reliability of data and data transfer at risk can also expect to be penalised.

Lesson three: To continue guaranteeing access to healthcare and health, it is important to realise that there are limits to personal health management.

Our ideal is to manage, share and combine digital health data, but we want too much and we want it too soon. The existing services are only evidence-based for part of the population, i.e. for chronic patients, for people in good health and for a healthcare context that supports the integration and improvement of digital services in the work and healthcare process. It is not data sharing but rather good healthcare that should be at the centre of the digital society. That means, for example, that services are used purposefully and in accordance with best practices.

We must also consider the effects of sharing health data on society as a whole. It is in any case clear that sharing data with third parties changes the balance of power within and outside the healthcare sector. Not only do we and our doctors know more and more about our bodies, but other parties – including commercial parties and local authorities – also know more and more about our health. A governance system must be established that will strike the right balance between the individual and the collective interest and continue making it possible for people to receive good quality healthcare, also in a non-digital form.