Private Big Brothers are arriving


For many decades I've had an ongoing debate with my friend David Brin over the ideas in his book The Transparent Society, in which he ponders what happens when cameras and surveillance technology become so cheap that it's impossible to stop them from being everywhere.

While I and my colleagues at the EFF have worked to reduce government and corporate surveillance of our lives, at the back of my mind I have feared what happens when groups of private citizens create surveillance systems. While we can debate whether the government can put up cameras on every corner, we can't stop private homeowners from mounting cameras on their own land that record the street in front of their houses, which is a public space.

I noticed the launch last month of a company called Flock, which wants to provide automatic licence plate readers to neighbourhoods. The cameras will track every car going in and out of a neighbourhood. The system will learn (and promptly forget) the cars of residents, and will also learn the cars of regular visitors to the neighbourhood. If you've paid the fee and you suffer a break-in, you can get a list of all unusual cars that were in the area during the crime, and you can hand it to the police.
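The logic described here -- remember resident plates so they can be ignored, log everything else, and report the "unusual" cars seen around the time of a crime -- is simple enough to sketch. This is purely my own illustration of that idea; the names and behaviour are my assumptions, not Flock's actual system:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a neighbourhood plate-reader log.
# resident_plates would be registered by homeowners; any other
# plate is treated as an "unusual" car and retained.
resident_plates = {"ABC123", "XYZ789"}

sightings = []  # (plate, timestamp) pairs for non-resident cars only

def record_sighting(plate, when):
    """Log a plate unless it belongs to a known resident."""
    if plate in resident_plates:
        return  # residents are forgotten immediately
    sightings.append((plate, when))

def unusual_cars_during(start, end):
    """Plates of unknown cars seen in the window around a reported crime."""
    return sorted({p for p, t in sightings if start <= t <= end})

# Example: a break-in reported between 02:00 and 03:00
t0 = datetime(2024, 1, 5, 2, 0)
record_sighting("ABC123", t0)                       # resident, ignored
record_sighting("QQQ111", t0 + timedelta(minutes=10))
record_sighting("QQQ111", t0 + timedelta(minutes=40))
record_sighting("ZZZ999", t0 + timedelta(hours=5))  # outside the window
print(unusual_cars_during(t0, t0 + timedelta(hours=1)))  # ['QQQ111']
```

Note that even this toy version makes the privacy trade-off visible: the residents' promise of being "forgotten" is only as good as the contents of one set in someone else's database.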

Certainly that seems legal, and it's not hard to see why neighbours would like it. They keep their privacy (presuming the promise not to record known resident cars is kept) and only "outsiders" are tracked. I can even see wanting this info myself after a theft from my car last year. While it might not solve crimes, it would certainly add to the evidence needed to convict a suspect.

Instead, the question is: what happens if everybody does this, and things like it? We're not far from adding face recognition to these camera systems, so video would be kept of all unknown people. The result is a world where you're a bit more secure at your own house, but under massive surveillance everywhere else. It is better that the data are provided only when a crime is reported, rather than having the police operate the system, but there are many countries where this technology will be run by police and spies. And it's hard to say the police will never get access to the data when no homeowner wants to share it; in reality, police will almost always be able to convince some homeowner to want it.

We then get a surveillance tragedy of the commons. What seems good for every group that does it sneaks Orwell's world in through the back door.

Vigilant Solutions

I should not mention Flock without pointing out that there is a much more developed threat in licence plate recognition from companies such as Vigilant Solutions, which has put up a large network of cameras and sells the data to police. I write about Flock not because it's as big a threat to privacy as Vigilant is today, but because the alternate business model of selling to private individuals changes things so much. On the one hand, it's better that the data aren't going directly to the police. On the other hand, it creates the tragedy of the commons I described -- it makes sense for any one neighbourhood to deploy something like this, but it creates Big Brother if we all do. It's harder to frame legal challenges to this model, while the legal challenges to Vigilant are more obvious, though not necessarily easy.


Seems to me all those self-driving test cars already form a rolling surveillance network. When I read in Google's blog that their cars track and record vehicles out to 100 yards, my first thought was that insurance companies would love to have that data. If the cars don't already have license plate and facial recognition, it would seem trivial to add. How many of those cars would be needed to watch everyone on the road all the time? Ten percent? The same goes for your delivery robots; they will see a lot of stuff even at their slow pace. Data that can be monetized will be monetized.

Cameras are showing up everywhere. Fortunately, LIDAR does not really have the resolution to identify individuals (though it can possibly do gait recognition). Cameras are another story. Private robocars probably won't be sending data up to databases, but fleet robocars could.

Yes, after robocars get up to decent penetration, they will observe most road accidents with human drivers, and the police might want that.

The main way to stop that will be for the public, or legislators, to make rules that stop the data from being collected. The cars will keep their logs for testing purposes for many years to come, but there can be rules about blurring faces and people, about not remembering licence plates for very long, and definitely about not uploading them to police as a matter of course.
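A retention rule of the kind described above is easy to sketch: keep the logs, but expunge the identifying fields after a short window. Again, this is my own hypothetical illustration -- the 72-hour window and the record layout are assumptions, not any real regulation or fleet's practice:

```python
from datetime import datetime, timedelta

# Assumed policy for illustration: licence plates are redacted from
# log entries older than 72 hours; the rest of each record is kept
# for testing and engineering purposes.
PLATE_RETENTION = timedelta(hours=72)

def scrub_log(entries, now):
    """Return the log with plate data dropped from expired entries.

    Each entry is a dict like {"time": datetime, "plate": str, ...};
    old entries keep their non-identifying fields but lose the plate.
    """
    scrubbed = []
    for e in entries:
        if now - e["time"] > PLATE_RETENTION:
            e = {**e, "plate": None}  # redact, keep the rest of the record
        scrubbed.append(e)
    return scrubbed

now = datetime(2024, 1, 10)
log = [
    {"time": now - timedelta(hours=100), "plate": "QQQ111"},  # expired
    {"time": now - timedelta(hours=10), "plate": "ZZZ999"},   # still fresh
]
print([e["plate"] for e in scrub_log(log, now)])  # [None, 'ZZZ999']
```

The point of a rule like this is that it works by construction rather than by promise: once the plates are gone, a later "software upgrade" or subpoena can't bring them back.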

Does it really help to prevent the government, or anyone else, from collecting such data? The standard line is "more data can lead to more abuse". On the other hand, a government evil enough to use such data against its own citizens would have no qualms about harming them without that data. Presumably no-one can reasonably object if such data prevent crimes, or at least aid in convicting criminals.

As long as one is doing nothing illegal, is there harm in the government, or indeed anyone, knowing one's whereabouts? Apart from (where prostitution is legal) "I was in the brothel and I don't want my wife to know" and other similar situations, is there really a case for banning information?

The Scandinavian countries have public information by default. Some things which are required by law to be public there could lead to punishment if made public elsewhere. Yet these countries are not exactly bad in terms of human rights, quality of life, etc. -- quite the opposite; they usually top the lists.

Is this a case, like mandatory vaccinations, where the "default" libertarian position is not the right position? As Asimov had a character say, never let your morals prevent you from doing what is right.

The privacy community regularly succeeds (and of course also regularly fails) in putting checks and balances on abuses of data about the people. It helps.

The standard line is actually "more data will lead to more abuse" in that it's very rare to find data that does not get repurposed later.

Did you seriously suggest that if one is innocent, why should one have something to hide?

Well, I did give one example where one might want to hide information even though it is legal. There are others, of course.

However, we are talking here about things which anyone standing on a street corner can see.

It's just that "if you're innocent you have nothing to hide" is the most common false argument of those against privacy. It even has a Wikipedia page. Every time you've seen the government or society punish, shame, ostracize or discourage legal behaviour, you learn how wrong that phrase is.

And yes, it applies to what is on the street, because the issue of computers and privacy is about what scales today that didn't scale before. Sure, people could always see you, but nobody could watch everybody all the time. Now they can.

The question is whether the same governments or societies would shame, punish, and ostracize just as much or even more if they had fewer data.

Another question is whether one can prevent this at all. Cameras are small enough to be unnoticed, storage is cheap, etc.

There was recently a court decision in Germany saying that dashcams could be used as evidence, even if the data were initially captured by chance. (Previously, this was possible only if one observed a crime first (which, of course, no-one can prove) and then turned on the dashcam.)

If much more data is collected, society can adapt to it. Perhaps it must, because there is probably no way to prevent it.

Again, comparing countries can be interesting. The USA is relatively restrictive as far as data collection goes, yet the government does all kinds of things we probably both dislike. Countries where the state has many cameras in public places, and/or where "private" data such as tax returns are public, are not totalitarian states.

Indeed, many places are not totalitarian today. The problem with deploying massive surveillance technology in good societies is that sometimes they don't stay that way. Societies change, and then the question of whether they have a powerful police-state apparatus becomes a matter of policy, not physicality. Just load new firmware and you have a police state. Or worse, your country gets invaded and your own records and surveillance data get used against your people, as happened to the Dutch Jews. How many holocausts is too many holocausts? Or the tech is re-exported to countries which are already police states.

I see what you are getting at, but there is probably no way to prevent the export of technology which might be abused; that has been happening for decades already.

As for a state turning into a police state via a "software upgrade", the question is whether a state so evil that it would do that would say "Oh, we don't have enough data, so we can't kill as many people as we want to". I just don't see lack of data as an effective defense against dictatorships.

A police state isn't a binary thing. The point is that if we build the infrastructure for one, and trust the government not to use it, we have now made it a question of policy rather than implementation. A scary terrorist (or whatever) is caught, and people say, "let's add surveillance technique X" and it's just a matter of configuring already installed hardware. That's different from "We would have to install billions of dollars in hardware over the course of 10 years."

In government, important rights get protection via constitutional rules rather than laws because changing constitutional rules is hard and takes a long time.

Snowden and the warrantless wiretap leaks show us that once the physicality is in place, it's easy for the policy to change, sometimes in secret.
