
Do not only apply precaution afterwards

11 July 2023

Photo by Ben Kitching - Unsplash

A woman climbs a mountain with precautionary measures

With some new technological applications, scientific certainty about possible risks to humans, animals and the planet is difficult or impossible to obtain, either because insufficient research has been done, or because the methods and theoretical models to interpret those risks are lacking. A research consortium that ran for four years concludes that the precautionary principle should be used more proactively.

In short

  • Within the EU, the precautionary principle now mainly functions as a safety net for uncertain risks.
  • The high societal impact of technology today, however, makes it irresponsible to only think about risks and uncertainties after products have already appeared on the market.
  • This article argues for a more proactive use of the precautionary principle and shows how it can guide innovation development like a compass.

The essence of the precautionary principle is that governments should intervene if they have reasonable grounds for concern about the safety of new products. Based on these reasonable grounds for concern, they can for example provisionally ban a new substance in consumer products or medicines if, after its introduction, they see significantly more cases of illness that are most likely related to the properties of the substance. After all, citizens need to be confident that their safety, health and rights are adequately guaranteed.

Within the EU, the precautionary principle is often used by policymakers and legislators as a last resort, a kind of safety net. If new risk research on an existing technology raises legitimate concerns, governments can for example ban that technology or impose stricter rules on its use. 

Safety net alone is insufficient

As a safety net, using the precautionary principle is essentially reactive. But those who only react to reports of unsafety or environmental impacts after the technology has been introduced into practice are often too late. Adjusting the impact of a new technology is difficult when it has already been widely rolled out in society. The application is then already embedded in technical, economic and social structures.

A good example is plastic. Plastic is light, cheaper than metal, can be moulded into anything, but also creates a huge waste problem. Phasing out plastic use is difficult because so many production chains have become dependent on it. It might have been better if governments had acted here earlier. Based on the uncertainties that also existed years ago, they could, for instance, have encouraged emerging plastics companies to do more research on biodegradable plastics.

Biotechnology and geo-engineering

There is a growing urgency to adjust the design of new technology early, rather than mitigating the negative impacts of its use only after the fact. After all, human impact on the world through technology is also growing and creating new uncertain risks.

Nanotechnology allows small particles to be made for a variety of applications, but it is often unclear what effects they have when they enter our cells. Biotechnology makes it possible to modify the building blocks of life (DNA), but it is uncertain how modified organisms interact with other organisms. Developments in information science, neuroscience and behavioural science are also making humans more like controllable objects, which may have implications for their autonomy and privacy. Especially since digital technologies, such as artificial intelligence, are unpredictable. Geoengineering promises to counteract the effects of climate change, but its consequences for us and future generations are difficult to foresee.

The compass in practice

Examples of how to apply precaution as a compass are described in the final report of the RECIPES project. Briefly, these can be boiled down to two habits: looking around and looking ahead.

Looking around means more actively considering what effects technological inventions will have on different groups in society. Looking ahead means actively thinking about and anticipating the potential impact of a technology over time. These habits do not automatically become part of an innovation system. Governments such as the European Union can strengthen the focus on them, for example by including related practices as hard conditions for research funding.

The easiest way to do this is to make solving societal challenges the primary goal of research programmes, and to assess them accordingly. The chances that the outcome of such a programme entails unforeseen risks are lower if its purpose is explicitly aimed at solving societal problems from the outset. This approach is already partly reflected in the Horizon programme of the European Union.

A second way to do this is to weigh the expected benefits and drawbacks of innovations explicitly and integrally within different research programmes, including the uncertainties involved. For instance, to what extent does it make sense to put a lot of money into new forms of nuclear energy, given its risks, when solar energy technology is becoming increasingly efficient? To gain insight into this, policymakers will have to consult experts and use scenarios that clarify which innovation paths are desirable and feasible, and which are not.

The government can encourage this by requiring stakeholders to be involved in identifying potential risks and ethical issues when funding a research programme. Future users of the technology can also play a role in this.

Room for criticism

A third way to do this is by organising more space for criticism within the innovation system. Researchers now often have no interest in being open about the uncertainties and uncertain risks of their own research. Encouraging open science, open access, science journalism and risk research, among other measures, can create more space within the innovation system for uncovering uncertain risks. Funders can require that a research programme also organise critical counter-research. This happened in the NanoNext research programme in the Netherlands.¹

Fourth, it is important for the innovation system to accommodate a diversity of knowledge. In the past, uncertain risks were often revealed late in the process because they did not fit within the accepted risk models. For example, a ban on three types of neonicotinoids came only after scientists and civil society organisations had demonstrated the flaws of previous risk assessment procedures.² Being open to new insights from ecology and risk science, for example, and analyzing the interests of those providing the risk data are also important here.

Civil society organisations and citizens

Finally, stronger and more flexible cooperation is needed between civil society organisations, policymakers, citizens and the research domain. Input from all these actors is needed to quickly identify new uncertain risks, devise and support innovative alternatives to them, and avoid unnecessary regulation.

Organising round tables on key themes - such as climate change - could help in this respect, provided all the above perspectives are represented. Examples in the Netherlands, such as the Committee of Wise People on Knowledge and Innovation (commissie van wijzen voor kennis en innovatie), established by the government some ten years ago, and, more recently, the National Growth Fund Committee (Nationaal Groeifonds), are a good start.³ ⁴ However, it is important that these committees include not only people with an optimistic economic view of technological innovation, but also people who raise broader societal aspects and uncertainties.

If the European Commission and other governments want to take the role of technological innovation in solving societal challenges seriously, they will have to make targeted adjustments to current innovation systems through new laws and regulations. Taking into account the uncertainties surrounding the benefits and risks of new technology, and the inherent limitations of science, is crucial here. This will not happen automatically, however. It requires more active coordination from governments and targeted adjustments within the innovation system, where the wisdom of the precautionary principle can act as a compass.
