Surveillance Cameras Will Soon Be Unrecognisable – Time for an Urgent Public Conversation

It is often argued that the UK is the most surveilled country on the planet. Whether or not that has ever been strictly true, there are certainly now millions of surveillance cameras in public spaces, not to mention in private buildings and homes. Behind those lenses, the cameras are changing in ways that people are often barely aware of, and the privacy implications, especially of facial recognition capabilities, should be widely discussed as a matter of urgency.

Automatic facial recognition is currently the hot ticket in this industry, having been introduced in a number of cities around the world, including in the US, China, Germany and Singapore. The police argue that piloting such systems has allowed them to test whether the technology can help identify potential terrorists and other known offenders. Yet this has to be weighed against a range of concerns. The broadest is our expectation of privacy and anonymity in public places, and whether this is a step too far towards our every move being visible to the state.

Then there is the question of how well these systems actually work at present. Their success rate at matching faces has been shown to be as low as 2%. Linked to this is an inbuilt bias in the software that makes the technology far less accurate at identifying darker-skinned people and women. It therefore has the potential to exacerbate tensions between ethnic minorities and the police.

This could be compounded by another contentious issue: the police use of so-called “watch list” databases of faces against which live images are matched. Typically, these databases include custody images of people who may never have been convicted of a crime and are unlikely to have consented to their data being used in this way.

For these reasons the use of automatic facial recognition software has been highly controversial, and until the technology is more reliable we should probably be very cautious about how it is used. There have been two significant pilots in the UK in recent times, in south Wales and in London. Both are the subject of judicial review actions, brought respectively by the civil liberties organisations Liberty and Big Brother Watch, which are due to conclude in the coming months.

In the US, meanwhile, the city of San Francisco banned the use of facial recognition by its public agencies in May. Other American cities are expected to follow suit, even though the software is currently in use in the likes of Chicago, New York and Detroit. The technology has also generated much debate in Canada, where it is in use in Toronto and some other cities.

Tomorrow’s facial recognition world

Facial recognition highlights bigger questions around which types of surveillance cameras and systems are acceptable to society. The question is complicated by the fact that surveillance cameras are becoming more sophisticated and computerised without necessarily looking much different. There is no signage or information to tell us about their enhanced capabilities, which means the activities behind them become less transparent.

As the technology has been miniaturised and costs have fallen, new types of cameras have emerged, including body-worn video devices, drones, and dash and head cams. At the same time, imaging and recording techniques have become increasingly standardised. This has allowed for greater connectivity between systems and has raised image quality to the point where footage can serve as trustworthy evidence in legal proceedings.

Besides facial recognition, we are seeing the emergence of cameras capable of object tracking and recognition, as well as advances in noise and smell analysis. Police forces in the US and UK have been trialling systems that predict how likely individuals are to commit a crime. It is all a quantum leap away from the old CCTV cameras with which we are familiar.

Governance and regulation are having to evolve quickly to keep abreast of this environment. To this end, surveillance cameras in England and Wales are now regulated by the specialist office of the Surveillance Camera Commissioner, along with the Information Commissioner’s Office, which has responsibility for overseeing data protection in the UK. Also relevant to the use of facial recognition systems is the Office of the Biometrics Commissioner.

Surveillance Cameras Day

Most surveys suggest that the public are in favour of basic CCTV cameras, but the question for those who set the rules is whether citizens would still support these systems if they knew what they were becoming capable of. Judging by most reactions in the media to facial recognition, it seems not.

I suspect that most of these technological advances could be used to improve the system if they were properly regulated, but surveillance cameras must be seen to operate in the interests of society and with the support of voters. So where should policymakers draw a line in the sand?

To help with this, a world first is about to take place in the UK on June 20: Surveillance Camera Day. This is not intended to be a celebration of surveillance cameras but to allow people to influence how they develop by raising awareness about their capabilities, merits and consequences. It will include everything from open days at a number of CCTV control centres to public factsheets to discussions in the media. Everyone can contribute to the conversation through #cameraday2019.

The direction of travel for surveillance cameras does not need to be one of technological determinism, in which the technology inevitably becomes more and more intrusive. Surveillance Camera Day represents an opportunity for everyone to help shape the discussion. It will be interesting to observe how the general public and other players respond.

William Webster, Professor and Director, Centre for Research into Information, Surveillance and Privacy, University of Stirling

This article is republished from The Conversation under a Creative Commons license. Read the original article.

