By Jurriën Hamer and Linda Kool, researchers at the Rathenau Instituut. Be sure to read the other articles in the Decent Digitisation series.
We started this blog series last year in September. We called in experts from a wide variety of disciplines to show how we can bring decency to our digital society and keep it there. We threw down a gauntlet and saw that, in the main, government institutions, civil society organisations and researchers rose to the challenge. They shared their solutions in 17 separate blogs. In this concluding blog, we look at where we now stand.
Over the course of the year, we divided the blog series into various topics, for example ‘Ethical IT professionals’ and ‘Digital child-rearing’. The insights generated by the series transcend their specific context, however. Those insights can be described in terms of four virtues that can help us deal decently with digital technology:
- personalisation,
- modesty,
- transparency and
- responsibility.
Together, these virtues reveal how politicians, policymakers and IT professionals can uphold public values such as fairness and autonomy in a digitising world. We examine each one below.
Personalisation: not everyone wants the same thing
The blogs repeatedly raise the point that people can’t all be lumped into the same category. For example, the National Ombudsman of the Netherlands comments that many people lack digital skills and lose their way in the government’s digital infrastructure. At the same time, there are large groups of people who have no trouble filing their tax returns or booking their wedding at city hall online. The message is that service providers must be sensitive to the differences between people and develop services that will allow for those differences. Or, as ICTU writes: ‘Maybe there’s no such thing as digital illiteracy. At most, there are “people-illiterate” systems.’
Digital personalisation is therefore a must, but it will require adaptability within digital systems themselves. Computer scientist Birna van Riemsdijk, for example, advocates digital pills that can share data in various ways. After all, sometimes only the patient needs to know the data; in other cases, for example when a patient has dementia, it makes more sense to share the data with informal care-givers and doctors too. The next step is to talk to users about how they would like to use digital technology. That will also help build support for digital innovation. Nictiz’s blog, for example, shows that patients can only take charge of their own health when they receive personalised information and advice, and when companies involve them in developing e-health applications.
Personalisation also requires closer coordination between digital systems. In his blog, researcher Jaap-Henk Hoepman argues in favour of open systems, so that users are not locked into the technology of a specific provider. We should, for example, be able to send an iMessage to a WhatsApp account. Hoepman criticises the status quo, which is the very opposite: companies isolate their products from those of other providers and resist the introduction of uniform standards. That is bad for users’ freedom of choice and thus for their ability to find a service that best suits their needs.
Modesty: know the limits of digital technology
To personalise their services, organisations must let go of the notion of digital absolutism. Not everything should or can be arranged digitally. Sometimes it makes more sense to reverse a decision to operate a wholly digital environment and leave an analogue channel open, as the UWV Employee Insurance Agency has done. Whether the issue is benefits, municipal bylaws or a child’s living environment, the human world is rich and complex and not easily captured in algorithms.
That is why e-Law professor Simone van der Hof warns us that we should not lose sight of important human values when we consider the impact of digitisation on children. It is not in children’s best interests to protect them at all costs, but neither should they be given complete and utter freedom. Only a nuanced approach can offer children a safe, private environment in which they are allowed to make mistakes too.
Or as Sheila Jasanoff puts it: humility is essential. Because let’s be honest: the digitisation projects of the past ten years haven’t all been winners. Of course there have been successes in medical diagnostics and the digitisation of cars, but for each and every success we can also point out a crisis – and successes often give rise to new risks.
Just think of the many major IT projects that ended up being more expensive than anticipated. Digitisation is seldom easy and involves much more than a simple, straightforward efficiency operation. That is something that organisations should realise from the very start. It is therefore laudable that the Netherlands Police insists on linking experiments with algorithms that advise on policing to the scrupulous protection of democratic rights.
Transparency: watch out for an algorithmic ‘black box’
Transparency is a key requirement for personalisation and the targeted use of digitisation: people must know enough about technology to say what they actually want, and policymakers need the right information to recognise opportunities and threats. Unfortunately, it is precisely transparency that sometimes goes missing.
For example, Marlies van Eck writes that automated chain decisions by government are anything but transparent for the public or even civil servants. They do not understand how decisions about benefits or tax assessments come about and are powerless to set things straight in individual cases. That undermines the public’s legal protections.
Medialab SETUP and Amnesty International Netherlands are also worried about the algorithmic ‘black box’ that takes in data and spits out decisions. Algorithms are coded by fallible human beings, who can make programming mistakes or input substandard datasets. Algorithms that have not proved themselves beyond a doubt are nevertheless being used in all sorts of ways, for example in employee recruitment procedures. The lack of transparency is also a concern for democracy. When all is said and done, the Netherlands is a country where politics is meant to respond to the worries and wishes of citizens – but without proper information, how can we adequately express those wishes in the first place?
The challenge, then, is to educate the public and policymakers and offer them a better understanding of how algorithms work. That is precisely where technology itself can play a constructive role, by creating applications that explain things like automated chain decisions in an entertaining and comprehensible manner. But once again, we must respect the limits of the possible. Digital technology is growing increasingly complex, certainly when it comes to deep learning systems that use a huge number of variables in their calculations. Even programmers do not always understand these systems. That’s why it isn’t enough to call on the public, politicians and civil servants to learn the skills needed to control technological innovation; we also need to ensure that the right parties are charged with the right responsibilities.
Responsibility: dare to take the plunge
The blogs offer numerous suggestions for how to do this. To begin with, Linda Kool and Frans Stafleu argue that as a group, programmers – like doctors and lawyers – practise a profession that impacts society, one that requires sound ethical judgement. Programmers need to acknowledge their special responsibility and understand how their products and services are changing the way people live.
The argument put forward by the Royal Netherlands Association of Information Professionals (KNVI) is entirely in line with this idea. They too believe that information professionals need to bear in mind fundamental rights and freedoms and public values; perhaps even more importantly, their workplace should give them the opportunity to ask ethical questions and to alter plans where necessary.
Other parties also have certain special responsibilities. In the educational context, Justine Pardoen zeroes in on teachers: they must show leadership in guiding their pupils through the digital world and teaching them how to respect others and navigate safely online. Kennisnet offers a broader view; it recommends supporting schools with extracurricular instruction, auxiliary programmes and research.
The way ahead is, in any event, clear: decide who is responsible for what. Only then will society move forward. The recent introduction of the EU’s General Data Protection Regulation makes this clear. Privacy Company writes enthusiastically that many companies and other organisations are using the GDPR as a stepping stone for a serious dialogue about data collection and processing. This offers proof that a growing number of people are aware of the challenges of digitisation and courageous enough to acknowledge their responsibilities in that regard.
Decent Digitisation? We’ve only just begun
The Rathenau Instituut is pleased with this intriguing blog series. Our authors have shared inspiring insights with us and helped us to see the practical side of vital ideals. There is no doubt that we need their inspiration. Because although the Netherlands is more aware of the implications of digitisation than it was five years ago, has a better grasp of the ethical issues involved, and has shown more willingness to take responsibility, many questions remain unanswered. The virtues discussed above offer us all a good place to start, but they need to be incorporated into real solutions, such as technical applications, codes of conduct or statutory frameworks.
We will be keeping a very close eye on developments as they unfold. And while this particular blog series has come to an end, the Rathenau Instituut will continue publishing on this topic and asking stakeholders in society what they think. Decent digitisation? We’ve only just begun.