Live facial recognition cameras are a form of mass surveillance, human rights campaigners have said, as the Met police said it would press ahead with its use of the “gamechanging” technology.
Britain’s largest force said the technology could be used to catch terrorists and find missing people after research published on Wednesday reported a “substantial improvement” in its accuracy.
The research, carried out by the National Physical Laboratory (NPL), found there were minimal discrepancies for race and sex when the technology was used at certain settings. It was commissioned by the Met and South Wales police in late 2021 after fierce public debate about police use of the technology.
But the human rights groups Liberty, Big Brother Watch and Amnesty have said the technology is oppressive and “turns us into walking ID cards”.
Lindsey Chiswick, the Met’s director of intelligence, said the technology would allow officers to be more focused in their approach to tackling crime, including robbery and violence against women and girls.
“It is going to be gamechanging in terms of being able to accurately identify people that we want to talk to,” she told reporters at a briefing at New Scotland Yard.
DCI Jamie Townsend, the Met’s operational lead for facial technology, said CCTV cameras on a police van could stream images to a facial recognition application. The faces captured were then checked against a watchlist of images compiled by officers, and images of people not on the list were pixellated and deleted, he said.
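In outline, a system of the kind Townsend describes compares each captured face against the watchlist and keeps only the matches. The sketch below is a minimal illustration of that pattern, not the Met's actual software: the 0.6 threshold, the 128-dimensional embeddings and the function names are all assumptions for the example.

```python
import numpy as np

# Assumed operating point: the "setting" referred to in the article is,
# in effect, a match threshold like this one. 0.6 is illustrative only.
SIMILARITY_THRESHOLD = 0.6

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_frame(face_embeddings: list[np.ndarray],
                watchlist: dict[str, np.ndarray]) -> list[str]:
    """Return watchlist IDs matching any face in the frame.

    Faces that match nothing would, per the Met's description,
    be pixellated and deleted rather than retained.
    """
    alerts = []
    for face in face_embeddings:
        for person_id, reference in watchlist.items():
            if cosine_similarity(face, reference) >= SIMILARITY_THRESHOLD:
                alerts.append(person_id)
                break
    return alerts

# Toy usage, with random vectors standing in for real face embeddings.
rng = np.random.default_rng(0)
watchlist = {"suspect-001": rng.normal(size=128)}
frame = [
    watchlist["suspect-001"] + rng.normal(scale=0.01, size=128),  # near-match
    rng.normal(size=128),                                         # passer-by
]
print(check_frame(frame, watchlist))  # ['suspect-001']; the passer-by is discarded
```

Raising or lowering the threshold trades false matches against missed matches, which is why the accuracy figures below are tied to the specific setting in use.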
Chiswick said the technology was deployed “if there is an intelligence case to do so”. She said it was used last summer in Oxford Street, in central London, after a spate of robberies, including of high-value Rolex watches.
Townsend hailed the technology’s successes, saying: “We’ve had people that we’ve stopped, who were wanted for conspiracy to supply large amounts of drugs, who had been outstanding to the police for two years.
“We’ve had another gentleman who was arrested who had absconded from prison. He was also wanted in a foreign country for grievous bodily harm and drugs importation and so he had been placed on our watchlist, [it was a] quite old image of him [but] he was walking along Oxford Street and it alerted against him.”
But human rights groups hit out at the “Orwellian” technology, which the same research found to be less accurate for black people at other settings, and less accurate for people under the age of 20.
The NPL research found that, for watchlists of 1,000 or 10,000 people, the “true positive identification rate” of live facial recognition was 89%. Previously, for a watchlist of 2,000 to 4,000 people, it was 72%.
The Met said that at the setting it was currently using, the chance of a false match was 1 in 6,000 people.
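The arithmetic behind that figure is straightforward. The back-of-the-envelope sketch below uses the Met's stated rate; the daily scan volume is an assumption for illustration, echoing campaigners' claim that tens of thousands of people can be scanned per day, not a Met figure.

```python
FALSE_MATCH_RATE = 1 / 6000  # the Met's stated rate at its current setting

def expected_false_matches(people_scanned: int) -> float:
    """Expected number of wrong flags, treating each scan as an independent trial."""
    return people_scanned * FALSE_MATCH_RATE

# Illustrative only: a busy deployment scanning 20,000 passers-by in a day
print(round(expected_false_matches(20_000), 1))  # 3.3 people wrongly flagged
```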
But Madeleine Stone, legal and policy officer at Big Brother Watch, said: “One in 6,000 people being wrongly flagged by facial recognition is nothing to boast about, particularly at deployments in large cities where tens of thousands of people are scanned per day.
“If rolled out across the UK, this could mean tens of thousands of us will be wrongly flagged as criminals and forced to prove our innocence.”
She added: “Live facial recognition is suspicionless mass surveillance that turns us into walking ID cards, subjecting innocent people to biometric police identity checks. This Orwellian technology may be used in China and Russia but has no place in British policing.”
Katy Watts, a lawyer at Liberty, said the technology “sows division” and would be disproportionately used on communities of colour.
“This report tells us nothing new – we know that this technology violates our rights and threatens our liberties, and we are deeply concerned to see the Met police ramp up its use of live facial recognition. The expansion of mass surveillance tools has no place on the streets of a rights-respecting democracy,” she said.
Oliver Feeley-Sprague, Amnesty International UK’s military, security and police director, referred to the recent review by Louise Casey, which found that the Met was institutionally racist, misogynistic and homophobic.
“Against the appalling backdrop of the Casey report and evidence of racist policing with stop and search, the strip-searching of children and the use of heavily biased databases like the gangs matrix, it’s virtually impossible to imagine that faulty facial recognition technology won’t amplify existing racial prejudices within policing,” he said.