Our studies show that our most intimate rights, the rights the European Union declared fundamental and the rights adopted by the UN, are at stake. These new technologies affect our autonomy when they take decisions for us, for instance in determining which news items we see. When they take over decisions in our workplace, they can erode our professional decision-making skills. They can lead to exclusion and discrimination in the marketplace when services are offered to some and not to others, and they undermine our rights as consumers to buy what is good for us and to access services without paying with our data. In this way they affect our individual rights, but also our collective rights as citizens: the right to be heard, to voice our opinions and to take part in public life, for instance when police use data for predictive policing. In fact, these technologies fundamentally change many relationships, such as those between workers and employers, between patients and doctors, and between citizens and government, and in doing so affect the existing rights we have defined in those domains.
So what do we need to do?
First of all, we need to look at these effects for what they are. Digital technologies and data are not magically going to fulfil our basic needs, even though this is what we often hear; on the contrary, a lot of misuse is possible. So we need conceptual clarity: the new players who collect, combine and use our data, and those who design and own these new technologies, are also very much responsible for our wellbeing and are not allowed to violate our rights. This conceptual clarity also points to which laws or rules apply and which agencies are to supervise their behaviour, just as the ruling of the Court of Justice of the European Union helped to show that Uber does have responsibilities towards its drivers and passengers and towards road safety. This clarification is needed first of all; it does not always mean we need to design new rules.
New rights and rules
In analysing our use cases we found that two additional rights are needed to satisfy our basic needs in the data society, alongside our existing rights: the right not to be measured in certain situations, and the right to meaningful human contact in certain important situations. To give an example, a care robot can help a person stay at home for longer, but in some situations using one can also be dehumanising. A person should have the right to speak to a doctor or nurse to discuss certain decisions, for instance. In some cases this calls for political decision making; in others it is a matter of personal choice. Incorporating these new technologies in a way that works for the good of all, for society as a whole and in our personal lives, takes considerable effort and rethinking. We will need additional rules in some domains. But it starts with all actors taking responsibility and taking their duty of care seriously.