Since AI touches on so many public values, the challenge is to shape innovation in a targeted way on the basis of shared public values. In recent years the Rathenau Instituut has carried out a great deal of research into various aspects of values-driven innovation (see the reports Valuable Digitalisation, Industry seeking University, Living Labs in the Netherlands, and Gezondheid Centraal [Focus on Health]). In those reports, we make it clear that innovation policy is not only about developing new technological applications but also about the purpose those applications serve, namely addressing the challenges facing society.
That is why values-driven innovation includes a focus on the development of suitable revenue models, appropriate legislation and regulations, and the social embedding of new applications. This approach acknowledges the complexity of innovation from the outset. Below, we discuss three ways in which innovation can be shaped on the basis of values.
Take legislation as the starting point
The avalanche of ethics codes may give the impression that the development of AI is taking place in a legal vacuum. That is not the case, of course. There is a lot of existing legislation with which the development and use of AI must also comply. This involves not only fundamental rights, including constitutional rights, but also specific legislation, such as the EU’s General Data Protection Regulation, or sector-specific legislation in fields such as healthcare or the transport market.
The recent history of digitalisation reveals that various IT platforms show little respect for the law. By dismissing existing legislation as obsolete, technology companies are attempting to evade various statutory responsibilities. This creates uncertainty regarding rights, obligations, and responsibilities (see also our report Eerlijk delen [A Fair Share]).
Courts are now providing clarity in various legal cases; consider, for example, the ruling by the Court of Justice of the European Union on Uber's responsibilities. The Court ruled that Uber offers a transport service within the meaning of EU law, which means that Member States are free to determine, at national level, the conditions under which that service may be provided. Regulatory bodies also play an important role in resolving legal uncertainties.
Another factor is that the platforms often do not fit precisely into the existing legal categories. This leads to conceptual and policy uncertainty: is Facebook a social media platform, or a news company with the associated responsibilities? That is why it is often necessary to update existing legislation. The European Commission is currently preparing for revision of a large number of legal frameworks, including consumer law, copyright, audio-visual media, privacy, digital security, and competition law.
Innovation policy: make public values central
Greater attention has been paid in recent years to ethics in innovation policy. In June 2018, for example, the Dutch government published its national digitalisation strategy, which addresses numerous issues, including privacy, cybersecurity, and a fair data economy. The final section of the document concerns constitutional rights and ethics. The government is currently developing two policy documents on AI: a strategic AI action plan and a vision on constitutional rights and AI. The latter is a key component of the action plan.
It is important to see ethics not as a separate or final element of innovation programmes but as an integral part of them. The challenge is to make shared public values the basic principle. Examples of this can be found in other European countries. In the area of mobility, for example, the United Kingdom drew up cybersecurity standards for self-driving cars at an early stage, as a basis for their development, and Germany has drawn up ethical guidelines for their development. The Netherlands, too, can shape and direct innovation by imposing preconditions in the fields of privacy, cybersecurity, transparency, and other basic principles, for example in experimental zones or when regulators grant temporary permits ("regulatory sandboxes").
Actual practice: commit to technological and legal innovation at the same time
In practice, it turns out that technological innovation cannot be viewed separately from revenue models and regulations; these develop in tandem. For example, innovative cities such as Eindhoven and Amsterdam found themselves confronted with issues regarding the collection and use of sensor data in public space. Who has control of that data? What purposes can it be used for? How can a data monopoly be prevented? Amsterdam and Eindhoven therefore called for the development of national ground rules. A guide has since been produced.
In the healthcare context, too, a development can be identified in which innovation is embedded in a local care context involving doctors, patients, researchers, and developers. This benefits the quality of the new applications. The focus is no longer on the quantity of data but on its quality, and the higher purpose, namely improved health (see also our report Gezondheid Centraal [Focus on Health]).