University of Groningen: Mobile phones continue to cause privacy issues

Spurred by popular smartphone games like Pokémon GO, companies and government bodies have begun ‘crowdsourcing’ data on public spaces. Using location data, camera images or short surveys, they collect information from large numbers of participants on topics such as traffic jams and areas where people feel safe or unsafe. Tech companies and the police then pool this information and use it for various purposes, but individuals may struggle to protect themselves against the potential negative consequences of these practices.

The role of smartphones in society is only getting bigger. Anyone can use their camera, or be caught on camera, at any given time. ‘This makes smartphone cameras more invasive than shop cameras or police cameras’, explains Ritsema van Eck. ‘Those are in fixed places and you can avoid them to a certain degree. But you cannot get away from smartphones; they are everywhere. Even though we may not intend it, our smartphones transmit lots of data, even from the most remote places on Earth.’

Rights now mainly focus on the individual
The insights gained from smartphone data usually apply to groups of people, such as a neighbourhood. And that’s where things get tricky, because the right to privacy protects only the individual. Ritsema van Eck: ‘Applying traditional human rights can be very difficult, as they focus on the individual and cannot be applied in the same way to privacy violations of entire streets or neighbourhoods. As a result, groups may be negatively impacted. Or a person may be stigmatized for belonging to such a group, even though no single individual is singled out. On paper, then, there appear to be no legal issues. It’s a kind of no-man’s-land.’

Nothing to hide?
What’s wrong with filming private citizens if they have nothing to hide? This is a frequently heard counter-argument. ‘First of all, everyone has biases’, explains Ritsema van Eck. ‘People decide to film certain things but not others, which means the data is not objective. What’s more: in India, for example, neighbourhoods are classified as unsafe for women based on reports from citizens. Sounds good, you might say. But it also puts a stigma on the residents of that specific area. Caution is therefore required, because profiling and discrimination are just around the corner.’

Profiling, or identifying possible suspects or places by applying group attributes using algorithms, is a hot topic. Ritsema van Eck explains that this is, in effect, already happening through the collection of smartphone data. ‘People supply data on locations, and all of that information combined produces a profile of a location that may or may not be accurate. But as long as individuals are not singled out, nobody appears to care. That does not mean these actions are right, or even harmless. I advocate privacy rights for groups or streets to limit the disastrous impact of this type of crowdsourced data.’