The policies surrounding immersive technologies are in a state of flux. We discuss a selection of European laws aimed at managing the risks associated with immersive technologies: the General Data Protection Regulation, the Artificial Intelligence Act, and the Digital Services Act. In combination, these laws can limit the opportunities for influence and manipulation based on physical and behavioral data collected in XR. At the same time, there are a number of policy gaps and ambiguities. For example, while XR providers may collect all kinds of physical and behavioral data if users consent, potentially very sensitive information can be derived from that data. These policies also do not fully cover the risk of purpose creep ('doelverschuiving'): information collected in one context can be used in another context, against the interests of users. There is also uncertainty about the protection of neurodata.
Incentives are in place to create opportunities for Dutch and European businesses in the XR market. We discuss the investment from the Groeifonds (the Dutch National Growth Fund) in the Creative Industries Immersive Impact Coalition (CIIIC), the European Initiative on Virtual Worlds, and the Digital Markets Act. However, as investment in immersive technologies grows and these technologies are more widely adopted in society, the risks also become more plausible.
We formulate a number of options for action that politicians and policymakers can take to mitigate the risks of immersive technologies. However, some risks are inherent to these technologies and will remain once they are widely adopted. This has to do with the intimate data being collected: once this data is available, it may be used for other purposes, against the public interest.
Politicians will have to make choices on some fundamental issues. Where can immersive technologies help uphold and realize public values (e.g., in therapeutic applications with demonstrable health benefits), and where should these technologies not be applied at all because they undermine public values too much (e.g., large-scale adoption of data-collecting XR devices in schools)? Are there certain types of data, such as neurodata and pupillary reflexes, that should not be collected at all because they reveal so much about us and abuse is a realistic scenario? And to what extent is further hyper-personalization desirable in public spaces, or should certain domains remain XR-free?
Because immersive technologies have not yet broken through on a large scale, policymakers and politicians still have the opportunity to steer the development and adoption of these technologies. The challenge for policymakers and politicians is thus to determine how the government wants to steer the innovation dynamics surrounding immersive technologies, based on its duty to protect citizens' rights and public values.