Shropshire Star

Cressida Dick: Facial recognition critics should face up to victims of crime

The head of the Metropolitan Police told a conference that the only people who benefit from the force not using such technology are criminals.

Met police trial facial recognition cameras

Privacy concerns over live facial recognition are ‘much smaller’ than the need to protect the public from ‘a knife through the chest’, Britain’s most senior police officer has said.

Metropolitan Police Commissioner Dame Cressida Dick told a conference in Whitehall that critics of the use of such technology would need to justify to victims of crime why police should not be allowed to use these methods.

She told delegates at the Royal United Services Institute: “I and others have been making the case for the proportionate use of tech in policing, but right now the loudest voices in the debate seem to be the critics – sometimes highly inaccurate and/or highly ill-informed.

Live facial recognition: how it works (PA Graphics)

“It is for critics to justify to the victims of those crimes why police should not use tech lawfully and proportionally to catch criminals who caused the victims real harm.

“It is not for me and the police to decide where the boundary lies between security and privacy, though I do think it is right for us to contribute to the debate.

“But speaking as a member of public, I will be frank. In an age of Twitter and Instagram and Facebook, concern about my image and that of my fellow law-abiding citizens passing through LFR (live facial recognition) and not being stored, feels much, much, much smaller than my and the public’s vital expectation to be kept safe from a knife through the chest.”

She said that if artificial intelligence could help identify potential terrorists, rapists or killers, most members of the public would want the police to use it.

Dame Cressida said: “If, as seems likely, algorithms can assist in identifying patterns of behaviour by those under authorised surveillance, that would otherwise have been missed, patterns that indicate they are radicalising others or are likely to mount a terrorist attack; if an algorithm can help identify in our criminal systems material a potential serial rapist or killer that we could not have found by human endeavour alone; if a machine can improve our ability to disclose fairly then I think almost all citizens would want us to use it.

“The only people who benefit from us not using tech lawfully and proportionately are the criminals, the rapists, the terrorists and all those who want to harm you, your family and friends.”

Dame Cressida Dick said critics of such technology would need to justify to victims of crime why police should not be allowed to use it (PA)

Recent Metropolitan Police use of facial recognition led to the arrest of eight criminals who would not otherwise have been caught, delegates heard.

But use of the technology has been criticised as a violation of privacy.

Silkie Carlo, from civil liberties group Big Brother Watch, said: “It is purely magical thinking to suggest facial recognition can solve London’s problem with knife crime. It’s a highly controversial mass surveillance tool. It seriously risks eroding trust between the police and the public.

“The commissioner is right that the loudest voices in this debate are the critics, it’s just that she’s not willing to listen to them. Her attempt to dismiss serious human rights concerns with life or death equations and to depict critics as ill-informed without basis only cheapens the debate.

“These are the kind of tactics you might expect from politicians, but not a police chief. Indeed, it seems to be police, not parliament, who are now making policy decisions in this area.”

People scanned by the cameras are checked against “watchlists” – said to contain suspects wanted by police and the courts – and approached by officers if there is a match.

The Met claims the technology has a very low failure rate, with the system creating a false alert only once in every 1,000 cases.

However, using a different metric, research published last year by the University of Essex found the technology achieved only eight correct matches out of 42 alerts across the six trials it evaluated.

The latest algorithm used by the Met is said to show no bias on the basis of ethnicity, although it is less accurate for women than for men.
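The Met's figure and the Essex figure measure different things, which is why both can be cited at once: one counts false alerts as a share of all faces scanned, the other counts how many alerts turned out to be correct. A minimal sketch of the distinction, using the numbers reported above (the face-scan volume in the second calculation is an assumed illustrative figure, not from the article):

```python
def precision(correct_matches: int, total_alerts: int) -> float:
    """The Essex researchers' metric: the share of alerts that were correct."""
    return correct_matches / total_alerts

def false_alert_rate(false_alerts: int, faces_scanned: int) -> float:
    """The Met's metric: false alerts as a share of all faces scanned."""
    return false_alerts / faces_scanned

# Essex review: 8 correct matches out of 42 alerts across six trials.
print(f"Share of alerts that were correct: {precision(8, 42):.1%}")

# Met's claim: one false alert per 1,000 faces scanned
# (1,000 is the article's ratio; treating it as a scan volume is an assumption).
print(f"False alerts per face scanned: {false_alert_rate(1, 1000):.1%}")
```

Because vastly more innocent faces are scanned than suspects matched, a system can have a tiny false-alert rate per scan while most of its actual alerts are still wrong.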

Hannah Couchman, policy and campaigns officer at Liberty, said: “Anyone can be included on a facial recognition watch list – you do not have to be suspected of any wrongdoing, but could be put on a list for the ludicrously broad purpose of police ‘intelligence interest’.

“Even if you’re not on a watch list, your personal data is still being captured and processed without your consent – and often without you knowing it’s happening.

“Facial recognition is a mass surveillance tool that undermines our rights to privacy and free expression. Any one of us might wish to go about our business anonymously and maintaining the right to do so does not make you worthy of police suspicion.

“The Met started using facial recognition after ignoring its own review of a two-year trial which found that its use of this technology had failed to meet human-rights requirements.

“By scaremongering and deriding criticisms rather than engaging with these concerns, Cressida Dick reveals how flimsy the basis for the Met’s use of this oppressive tool really is.”
