Facebook is in court yet again over the Cambridge Analytica scandal, and one of its lawyers made a most revealing assertion:

 There is no invasion of privacy at all, because there is no privacy

Now, on one level, this is literally true. Facebook’s lawyer went on to say:

 Facebook was nothing more than a “digital town square” where users voluntarily give up their private information

[Image: a man staring through binoculars that have the Facebook logo on the lenses]

The issue is a mismatch in expectations. Users have the option to disclose information as fully public or variously restricted: visible only to their friends, or only to members of certain groups. The fact that something is said in the public street does not mean the speaker would be comfortable having it published in a newspaper, especially if they were whispering into a friend’s ear at the time.

Legally, Facebook may well be in the right (IANAL, nor do I play one on the Internet), but in terms of users’ expectations, they are undoubtedly in the wrong. However, for once I do not lay all the blame on Facebook.

Mechanisation and automation are rapidly subverting common-sense expectations in a number of fields, and the consequences can be far-reaching. Privacy is one obvious example, whether it is Facebook’s or Google’s analysis of our supposedly private conversations, or facial recognition in public places.

For an example of the reaction to the deployment of these technologies, consider San Francisco. The city, generally expected to be an early adopter of technological solutions, recently banned the use of facial recognition technology by city agencies. While the benefits of ubiquitous automated facial recognition for law enforcement are obvious, the technology also subverts long-standing expectations of privacy – even in undoubtedly public spaces. It is true that I can be seen, and possibly recognised, by anyone who is in the street at the same time as me; the human expectation, however, is that my presence there does not become a permanent, searchable record, let alone one that is widely available.

To make the example concrete, let’s talk for a moment about number plate recognition. Cars and other motor vehicles carry number plates to make them identifiable, including for law enforcement purposes. As technology developed, automated reading of number plates became possible, and it is now widely used for speed limit enforcement. Around here, things have gone a step further, with average speeds measured over long distances between camera points.
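As an illustration of just how simple the underlying check is, here is a minimal sketch of average speed enforcement in Python. Everything in it – the distance, the limit, the timestamps – is invented for the example; real systems obviously add plate matching, tolerances and certified measurements.

```python
from datetime import datetime

# Invented figures: a 12 km stretch between two camera points,
# with a 130 km/h limit.
GANTRY_DISTANCE_KM = 12.0
SPEED_LIMIT_KMH = 130.0

def average_speed_kmh(entry_time: datetime, exit_time: datetime) -> float:
    """Average speed over the stretch: distance divided by elapsed time."""
    hours = (exit_time - entry_time).total_seconds() / 3600
    return GANTRY_DISTANCE_KM / hours

# The same plate is read at both ends, five minutes apart.
entry = datetime(2019, 6, 1, 4, 0, 0)
exit_ = datetime(2019, 6, 1, 4, 5, 0)

speed = average_speed_kmh(entry, exit_)
if speed > SPEED_LIMIT_KMH:
    print(f"Violation: {speed:.0f} km/h average over {GANTRY_DISTANCE_KM} km")
```

Note that the entire “judgment” is a single comparison against a single number. Nothing in it can represent weather, traffic, or the fact that it is 4am on an empty road – which is exactly the problem discussed below.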

[Image: an old-timey police car that looks very much like Doc Hudson from Cars]

Who could object to enforcing the law?

The problem with automated enforcement is that it is only as good as it is programmed to be. It is true that hardly anybody breaks the speed limit on the monitored stretches of motorway any more – or at least, not more than once. However, there are also a number of negative consequences. Lane discipline has fallen entirely by the wayside since the automated systems were introduced: slow vehicles now cruise in the middle or even the outside lane while the inside lanes sit empty. Automated enforcement has also removed any pressure to consider what speed is appropriate for the conditions, with many drivers continuing at or near the speed limit even in weather or traffic where that speed is totally unsafe. Finally, there is no recognition that at 4am, with nobody on the roads, there is no need to enforce the same speed limit that applies at rush hour.

Human-powered, on-the-spot enforcement – the traffic cop flagging down individual motorists – could modulate the law, turning a blind eye to technically excessive but safe speed, and punishing driving that might be within the speed limit but unsafe in other ways. Automated enforcement, by contrast, is dumb (it is, after all, binary) and considers only the single metric it was designed to measure.

There are of course any number of problems with the human-powered approach as well; members of ethnic or social minorities all have stories of the police looking for something – anything – to book them for. I’m a white cis-het guy, and I still once managed to fall foul of the proverbial bored cops, who took my entire car apart looking for drugs (that weren’t there) and then left me by the side of the road to put everything back together. Automated enforcement, however, makes all of these problems worse.

Facial recognition has documented accuracy problems with ethnic minorities and women – basically anyone but the white male programmers who created the systems. If police start relying on such systems, people are going to have serious difficulty proving that they are not the person in the WANTED poster, because the computer says they are a match. And that’s if they don’t just get gunned down, of course.

It is notoriously hard to opt out of these systems when they are used for advertising; when they are used for law enforcement, opting out becomes entirely impossible, as a London man found when he was arrested for covering his face during a facial recognition trial on public streets. A faulty system is even worse than a functional one, as its failure modes are unpredictable.

Systems rely on data, and data storage is also problematic. I recently had to get a government-issued electronic ID. Normally this would be a simple online application, but I kept getting weird errors, so I went to the office with my (physical) ID instead. There, we realised that the problem was my place of birth. I was born in what was then Strathclyde, but that is no longer an option in up-to-date systems, since the region was abolished in 1996. Different databases disagreed, and we were unable to move forward. In the end, the official effectively helped me lie to the computer, picking an acceptable jurisdiction in order to move the process along – and thereby, of course, creating even more inaccuracy and inconsistency. So much for “the computer is always right”… Remember, kids: Garbage In, Garbage Out!
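A minimal sketch of that failure mode, with invented lists and names: when the online form validates input against the current list of jurisdictions while the identity records still hold the historical one, no truthful answer can satisfy both systems.

```python
# Invented data: one system validates against current jurisdictions,
# another still stores the historical record.
CURRENT_JURISDICTIONS = {"Glasgow City", "North Lanarkshire", "Renfrewshire"}
RECORDED_PLACE_OF_BIRTH = "Strathclyde"  # region abolished in 1996

def validate_application(place_of_birth: str) -> bool:
    # The online form only accepts values from the up-to-date list...
    return place_of_birth in CURRENT_JURISDICTIONS

# ...so the truthful, historically accurate value is rejected:
print(validate_application(RECORDED_PLACE_OF_BIRTH))   # False

# The only way forward is to "lie to the computer" with an accepted
# value, which now contradicts the original record: garbage in, garbage out.
print(validate_application("Glasgow City"))            # True, but inconsistent
```

The workaround applied at the office resolves the immediate blockage while quietly degrading the data for every future system that consumes it.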

What, Me Worry?

The final argument comes down, as it always does with privacy, to the objection that “there’s nothing to fear if you haven’t done anything wrong”. Leaving aside the problems just discussed, which can hit you even when you really haven’t done anything wrong, the issue is with the definition of “wrong”. Social change is often driven by movement in the grey areas of the law, as well as by selective enforcement of those laws. First gay sex is criminalised, so underground gay communities spring up. Then attitudes change, but the laws are still on the books; they just aren’t enforced. Finally, the law catches up. If algorithms really are watching all of our activity and can infer when we might be doing something that’s frowned upon by some[1], that changes the dynamic very significantly, in ways which we have not properly considered as a society.

And that’s without even considering where else these technologies might be applied, beyond our pleasant Western bubble. What about China, busy turning Xinjiang into an open-air prison for the Uyghur minority? Or “Saudi” Arabia, distributing smartphone apps to enable husbands to deny their wives permission to travel? 

Expectations of privacy are being subverted by scale and automation, without any real conversation about what that means. Advertisers and governments stick to the letter of the law, but there is no recognition of the material difference between human-powered surveillance and the same surveillance automated at scale.

*** 

Photos by Glen Carrie and Bryan Hanson via Unsplash

  [1] And remember, the algorithms may not even be analysing your own data, which you carefully secured and locked down. They may have access to data from one of your friends or acquaintances; the algorithm spots a correlation in patterns of communication and associates you with them. Congratulations, you now have a shadow profile. And what if you are just really unlucky in your choice of local boozer, and the government now thinks you are affiliated with the IRA offshoot du jour, when all you were after was a decent pint of Guinness?
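To make the shadow profile mechanism concrete, here is a toy sketch – every name, contact list and threshold in it is invented – of how a naive guilt-by-association score could flag someone purely from who their contacts are:

```python
# Invented toy example of guilt-by-association "shadow profiling".
FLAGGED = {"alice", "bob"}  # people already on some watchlist

CONTACTS = {
    "you": {"alice", "bob", "carol", "dave"},  # your pub acquaintances
    "carol": {"dave"},
}

def association_score(person: str) -> float:
    """Fraction of a person's contacts that are already flagged."""
    contacts = CONTACTS.get(person, set())
    return len(contacts & FLAGGED) / len(contacts) if contacts else 0.0

# With a naive threshold, two pub acquaintances are enough:
if association_score("you") >= 0.5:
    print("you: flagged by association (shadow profile)")
```

You never shared anything yourself; the inference runs entirely on other people’s data.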