Tech Cynic: I know your face

Police facial recognition cameras are to be rolled out across London in spite of legal and ethical objections.

On 24 January the Metropolitan Police announced that it would be rolling out a facial recognition camera network across London. The system will be used to identify known suspects and to alert nearby police officers.

Of course, to identify known suspects, everyone must be checked. Every person in the vicinity of one of the camera vans will be scanned and their facial features compared against a database of known suspects and wanted criminals. In the case of a match, officers will be alerted so that they can, well, nab the person and take them into custody for a little chat over a cup of tea and a biscuit... or something like that.
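The scan, match and alert flow described above can be sketched in a few lines of Python. Everything here is hypothetical and purely for illustration: the function names, the toy feature vectors and the similarity measure bear no relation to NEC's actual implementation, which uses learned face embeddings rather than anything this simple.

```python
# Toy sketch of a watchlist-matching pipeline (hypothetical, illustrative only).
from dataclasses import dataclass

@dataclass
class FaceTemplate:
    person_id: str
    features: tuple  # stand-in for a real face-embedding vector

def similarity(a, b):
    # Toy measure: fraction of identical features. Real systems compare
    # learned embeddings with something like cosine distance.
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

def scan(face, watchlist, threshold=0.8):
    """Return the best-matching watchlist ID, or None.

    A None result models the claim that non-matching scans are
    discarded immediately rather than retained."""
    best = max(watchlist, key=lambda t: similarity(face, t.features))
    if similarity(face, best.features) >= threshold:
        return best.person_id  # match: alert officers
    return None  # no match: discard the scan

watchlist = [FaceTemplate("suspect-1", (1, 0, 1, 1)),
             FaceTemplate("suspect-2", (0, 0, 1, 0))]

print(scan((1, 0, 1, 1), watchlist))  # prints "suspect-1"
print(scan((0, 1, 0, 1), watchlist))  # prints "None"
```

The key design point, and the crux of the privacy objection, is that `scan` must run on every passing face in order to find the rare match.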

The system was controversial even before the official rollout, with a London man being arrested and fined last year for concealing his face whilst walking past one of the AFR (Automated Facial Recognition) cameras during an unofficial trial of the technology.

As is always the case with new intrusions into privacy, we are told that non-matching data is instantly discarded, and that matches are made solely against local police databases of suspects. In other words, a wanted criminal from Scotland could walk past one of the London vans with impunity, and of course if you have nothing to hide then you have nothing to fear. However, that is only the situation today. Feature creep means that the temptation to roll out a nationwide AFR scheme is likely to be too strong to resist, and an easy sell on the grounds of public safety (just refer to said wanted Scotsman, for example).

From an ethical perspective it's easy to see the problems here. Western governments have, at least in the past, been highly critical of authoritarian states that excessively monitor and track their own citizens, yet the Met's AFR system works in fundamentally the same way as the one used in China, which routinely scans the faces of its citizens and highlights any wrongdoing, such as littering or jaywalking, on large public screens. It almost certainly records other behaviour too, but the Chinese authorities are understandably unwilling to divulge much more than the simplistic "happy public-spirited safety and cheerful tidiness" angles.

For those of us who understand the subtle interplay between privacy and freedom, this is frankly terrifying. CCTV was bad enough in this respect, and London already has some of the most pervasive surveillance in the world. Adding live facial recognition is a whole new ball game: it would give any future oppressive regime a powerful tool with which to suppress any and all dissent. Could the UK have such a regime at some point in the future? The prospect hardly seems unlikely.

Then there's the problem that facial recognition systems are rarely as accurate in the real world as their makers claim. In this case NEC, the supplier of the Met's AFR technology, claims a 70 percent accuracy rate based on internal trials, whereas past independent trials of this and similar systems have shown success rates in the low single digits.
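To see why these accuracy figures matter so much, consider some back-of-envelope arithmetic. The numbers below are invented for illustration, not official figures: when almost everyone scanned is innocent, even a modest false-positive rate swamps the genuine matches.

```python
# Hypothetical illustration of the base-rate problem in mass face scanning.
def expected_alerts(scanned, wanted, tpr, fpr):
    """Return (true alerts, false alerts) for a day's scanning.

    tpr: true-positive rate (chance a wanted face is flagged)
    fpr: false-positive rate (chance an innocent face is flagged)"""
    true_alerts = wanted * tpr
    false_alerts = (scanned - wanted) * fpr
    return true_alerts, false_alerts

# Say 10,000 faces pass a camera van in a day, 5 of them actually wanted,
# with the claimed 70% hit rate and an assumed 1% false-positive rate.
true_a, false_a = expected_alerts(10_000, 5, 0.70, 0.01)
print(true_a, false_a)  # roughly 3.5 genuine alerts against ~100 false ones
```

Under those assumed numbers, well over 90 percent of the people stopped would be innocent, which is broadly the pattern the independent trials reported.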
