The GDPR is a stepping stone for ethical discussions
Even so, this new EU law deserves more praise. The GDPR has already raised the bar for ethical discussions about privacy. We notice that organisations are using the GDPR as a stepping stone for a serious dialogue about data. They are more likely now to ask themselves who their clients are, what they expect, and how their products and services can meet those expectations.
No more ‘Computer says no’
The GDPR will extend the rights of citizens and consumers. For example, someone who is significantly disadvantaged by automated decision-making can now question and fight that decision. ‘Computer says no’ is no longer an acceptable outcome, in other words.
This means that organisations must be more transparent in their use of artificial intelligence; they must be able to explain in clear and comprehensible terms how their algorithms work and how decisions are reached. They must also introduce procedures that allow the decision-making process to be repeated with human intervention instead of the algorithm. So if a government agency refuses to pay out a benefit based on an algorithm, the citizen concerned can now require the agency to repeat the review without resorting to the algorithm.
A mouthful: data protection impact assessments
When an organisation begins processing a new set of personal data, it will be obliged in some cases to conduct a – wait for it – ‘data protection impact assessment’. That will be the case if crime data are being processed, for example, or if public spaces are being monitored. The procedure consists of a risk assessment and a list of risk mitigation measures.
The risk assessment itself is extremely valuable. In our consulting practice, we often see organisations assessing ethical factors and societal risks alongside the legally required privacy risks.
‘Privacy by design’ will become standard
The GDPR will not only change procedures, then, but also, and most importantly, the mindset of organisations. They are already taking on board the principle of ‘privacy by design’: designing products and services in a way that anticipates privacy-related problems. For example, they can minimise data collection from the very start, choose to anonymise or ‘pseudonymise’ personal data, and invest in encryption and other data security measures.
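To make the idea of pseudonymisation concrete, here is a minimal sketch in Python. It is an illustration, not a compliance recipe: it assumes a secret key stored separately from the data, and the function and record names are invented for this example. The direct identifier is replaced by a keyed hash, so records remain linkable for analysis while the original value cannot be recovered without the key.

```python
import hmac
import hashlib

# Hypothetical secret key; in practice it would be stored separately
# from the data, since holding both together defeats pseudonymisation.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymise(value: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    The same input always yields the same pseudonym, so records can
    still be linked across datasets, but the original value is not
    recoverable without the key.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Example record with a direct identifier replaced by a pseudonym.
record = {"email": "alice@example.com", "purchases": 3}
record["email"] = pseudonymise(record["email"])
```

Note that under the GDPR, pseudonymised data still count as personal data as long as the key exists; only full anonymisation takes data outside the regulation's scope.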
Privacy by design could become an all-encompassing design philosophy, with programmers learning to consider ethical frameworks and with public values underpinning design choices.
The commercial market will help
All this will lead not only to greater corporate social responsibility; the market will also reward products and services that reflect public values. People are increasingly insisting on privacy-friendly products and services. For example, chat apps that use end-to-end encryption, such as Signal, are growing in popularity, and more and more people are turning their backs on Google and using alternative search engines. A concern for public values also gives companies a competitive advantage, in other words.
The arrival of the GDPR gives us every reason to grasp these opportunities. The successful 21st-century organisation is an organisation that prioritises the interests of society and public values.
By Iris Huis in ‘t Veld and Arnold Roosendaal of Privacy Company, a team of consultants who help businesses and governments comply with privacy rules.
Be sure to read the other articles in the Decent Digitisation series, and the related reports: