The use of facial recognition technology in public spaces has long been an area of contention. The issue reared its head most recently when the UK Information Commissioner’s Office (ICO) raised privacy concerns about the use of live facial recognition (LFR) technology in such spaces.
Information commissioner Elizabeth Denham published a ‘Commissioner’s Opinion’ which revealed that none of the six LFR systems investigated by the ICO had been fully compliant with data protection law when they went live.
Denham wrote that trust in LFR technology is low, using the piece to highlight the need for data protection and privacy to be at the heart of any decisions to deploy LFR technology.
When is it right to use live facial recognition tech?
LFR technology is still in its infancy and, like most infants, is clearly proving itself to be challenging and disruptive. However, it has an opportunity to be a force for good if it's used correctly, gains public trust, and meets necessary societal, legal and regulatory requirements.
In her opinion piece, the information commissioner said that LFR is not necessarily a bad thing, and that we have to consider its purpose, what it is likely to be used for, and the benefits it could bring.
“There are many acceptable use cases of LFR,” notes Tony Porter, Chief Privacy Officer at Corsight AI. “[These include] access control with explicit consent of employees, airport security ecosystems, serious and organised crime requirements and national security to name but a few.
“One only has to foresee its use in helping identify a vulnerable missing young person, or even elderly people suffering from forms of dementia. It often depends on the context and manner of the surveillance as to its legitimacy.”
Porter goes on to mention that the benefits of LFR are numerous, from making public spaces safer and more secure to enabling smoother access to, and travel through, transport hubs. “When LFR is used within an ethical, moral framework that prioritises privacy and the rights of individuals, the benefits to this technology far outweigh the risks,” he says.
Public views on facial recognition technologies
Today the general public is familiar with facial recognition technology in a variety of contexts, but these experiences are likely drawn from situations where it was being used with our knowledge, consent and active participation.
The valid concern at the heart of the Commissioner’s Opinion piece is that LFR deployments in public spaces could lead to large scale scanning and data collection without the public necessarily knowing about it or having any choice in the matter. Denham says in the piece that we should be able to take our children to a leisure complex, visit a shopping centre or go sightseeing in a new city without having our biometric data collected and analysed with every step we take.
“The information commissioner’s post suggests that ‘building trust and confidence in the way people’s information is used is crucial so the benefits derived from the technology can be fully realised’ and this ought to be the case,” says Steven Furnell, professor of cyber security at the University of Nottingham.
“However, people have been adopting online services with potential privacy concerns for some time. Indeed, the public never questioned how their information might be used when surfing their favourite website or downloading their chosen app.
“Over the last few years of course, public opinion has shifted dramatically and social media companies are finding themselves under the spotlight. However, in most cases there’s at least a quid pro quo for the individual. A company may be harvesting their data, but one could argue that the end user is at least enjoying their desired service in return. With LFR there’s no consent and no quid pro quo for the individual, which seems like a rather uneven trade, particularly when it comes to personal data.”
Will regulations help LFR earn the public’s trust?
In her article, Denham notes that rules need to be clear – and adhered to – in order to gain and retain public trust. Where that trust is lacking, LFR deployments have already been paused and even banned in some US cities.
Perhaps the swathe of new regulations and laws being considered or rolled out will provide the assurances people need to trust LFR technology, and in turn enable society to reap its benefits.
These are being established all around the globe – in Europe, for example, the European Commission proposed new rules this April that will place strict obligations on the use of “high risk” AI, a category that covers public surveillance and, with it, LFR.
Over in the US, Porter is seeing a patchwork of legislation across the states, placing the technology under a range of provisions that vary from state to state without a single US-wide harmonising standard.
“Across the world we’re seeing these approaches replicated. While this patchwork of regulations doesn’t provide global certainty for the industry, it fairly reflects the challenges new biometric technology confronts when facing modern legislative systems – digital technology challenges an analogue legislature,” he notes.
What businesses should consider when looking into live facial recognition solutions
LFR does have a role to play, but only if less intrusive options aren’t up to the job.
Organisations considering implementing an LFR solution must be able to justify that “its use is fair, necessary and proportionate in each specific context in which it’s deployed,” says Denham, and “they will need to demonstrate high standards of governance and accountability from the outset”.
She also points out that businesses must understand and assess the related risks – for example, there are still issues around facial recognition accuracy and bias that could lead to misidentification.
If, after carrying out all due diligence, you still believe LFR is the right tool for the job, Porter leaves you with this final piece of advice.
“You can’t take an ‘emperor’s new clothes’ approach to LFR implementation. You must provide sufficiently detailed, robust and transparent laws and guidance to properly direct and constrain the use of technology to ensure that equality, ethics and the protection of citizens is at the heart of such endeavours,” he concludes.