UK Police Are Watching You: Facial Recognition and Racial Bias

New Scotland Yard
Photo: By the author

Ed Bridges, a father of two, was attending a peaceful anti-arms protest in Cardiff when he noticed CCTV cameras capturing him using automatic facial recognition (AFR). A year earlier, his image had also been captured while he was doing his Christmas shopping.

The Orwellian dystopia of Big Brother watching your every movement is no longer a fictional fantasy. That level of intrusion has become reality in the UK. British police forces are eager to roll out controversial AFR technology, echoing surveillance states like China that infringe on privacy under the banner of crime prevention and security.

A major legal challenge to facial recognition followed: Bridges sued South Wales Police, claiming his privacy was invaded when his image was captured. He was represented in the lawsuit by the campaign group Liberty, which argued that the force's use of facial recognition violated both the Human Rights Act and the Data Protection Act.

South Wales Police believe AFR can serve the public by helping to prevent crime. “For police it can help facilitate the identification process and it can reduce it to minutes and seconds,” the force told WIRED.

AFR systems capture images of people's faces, which are then checked against existing ‘watch list’ databases compiled from images of people previously taken into police custody. The police can flag a face and compare it against the more than 20 million images of faces already held in the system. Data from photos that produce no match is deleted immediately after processing.
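For readers curious about the mechanics, the flow described above can be sketched roughly as follows. This is an illustrative assumption only: the names, similarity measure, and match threshold are invented for the example and are not the actual NEC or South Wales Police implementation.

```python
from dataclasses import dataclass
from typing import List, Optional

# Illustrative sketch of a capture -> compare -> delete-on-no-match flow.
# All names and values here are hypothetical, not the real NeoFace system.

@dataclass
class WatchListEntry:
    name: str
    embedding: List[float]   # face template derived from a custody image

def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(y * y for y in b) ** 0.5
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def process_capture(face: List[float], watch_list: List[WatchListEntry],
                    threshold: float = 0.6) -> Optional[WatchListEntry]:
    """Compare one captured face template against the watch list."""
    best = max(watch_list, key=lambda e: cosine_similarity(face, e.embedding),
               default=None)
    if best is not None and cosine_similarity(face, best.embedding) >= threshold:
        return best   # alert raised for an officer to review
    return None       # no match: the captured template is simply discarded
```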

Photo: By Pixabay

“It’s hard to see how the police could possibly justify such a disproportionate use of such an intrusive surveillance tool,” Bridges proclaims on Liberty’s website. “We hope that the court will agree with us that unlawful use of facial recognition must end, and our rights must be respected.” 

The Divisional Court dismissed the claim for judicial review in September 2019. The Court concluded that the use of AFR did not violate any legislation because it was deployed only for protective purposes and compared biometric data solely against preexisting watch lists. While the case drew attention to AFR and challenged its implementation, there is still no clear law against it.

The Metropolitan Police stated that they conducted 10 trials of AFR in London between 2016 and 2017 using the Japanese company NEC’s NeoFace technology. The statement suggests that the police plan to use this system again in the future.

Petra Spence sells jewelry on Portobello Road in Notting Hill. She was celebrating her Guyanese roots at Notting Hill Carnival in 2017 amongst her friends and community. She was unaware that her face could have been scanned by AFR.

“I had no idea the police did that,” Spence exclaims, shocked that she could have been targeted. “That invades my privacy. Yes I get that they are doing this to protect from crime but it seems too specific that they chose an African and Caribbean celebration.”

With the growing ubiquity of facial recognition by UK authorities, concerns are rising over the inherent biases within these AI systems.

Between 2016 and 2017, the Met Police deployed facial recognition trials at Notting Hill Carnival. The civil liberties campaign group Big Brother Watch reported that over 98% of the matches produced in those trials were false.

Photo: By Big Brother Watch

Racially biased facial recognition is at the core of why the AFR trials at Notting Hill Carnival, an Afro-Caribbean celebration, went so wrong. The trials produced 102 false-positive matches, a majority of them black faces, because AI systems misidentify black faces at statistically higher rates.

“The causes of this algorithmic discrimination may vary, but are likely due to the fact that the datasets on which the algorithms are trained contain mostly white and male faces, as well as the fact that cameras are not configured to identify darker skin tones” states Big Brother Watch.

Benjamin Bowling, Professor of Criminology & Criminal Justice at King’s College London and founding member of policing watchdog StopWatch, gave me his insight into racial biases of AI used in policing.

“This is a black-data problem,” argues Bowling.  “People from black and ethnic minority communities are more likely to come into contact with the police, and therefore, like other crime data, images of people of color are more likely to be captured, stored and potentially used as evidence.”

Bowling believes the black community in London is already over-policed. If AFR is deployed in areas with mainly black and minority populations, it risks perpetuating that problem. Data-driven policing through facial recognition can thus entrench racial biases and amplify existing tensions between the police and the communities they serve.

The Met Police commissioned academics from the University of Essex to carry out an independent evaluation of their AFR trials, according to the MIT Technology Review. The study found that 81% of the matches flagged by the NeoFace system were inaccurate; in the large majority of cases, the software flagged people who were not on the watch list.
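It is worth being clear about what such a percentage measures: the share of alerts that turn out to be wrong, not the share of all passers-by who are misidentified. A minimal worked example, using illustrative numbers rather than the trials' actual counts, looks like this:

```python
# Illustrative numbers only, not the Met's actual counts.
alerts_raised = 100        # faces the system flagged as watch-list matches
confirmed_correct = 19     # flags that really were people on the watch list
false_alerts = alerts_raised - confirmed_correct

false_alert_rate = false_alerts / alerts_raised
print(f"{false_alert_rate:.0%} of alerts were wrong")   # -> 81% of alerts were wrong
```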

Big Brother Watch believes collusion between police and privately owned sites using facial recognition is rampant across the UK. “We now know that many millions of innocent people will have had their faces scanned with this surveillance without knowing about it, whether by police or by private companies” states the organization.

Their investigation found that surveillance nets have been cast across popular public spaces on privately owned UK land, including shopping centres, museums, and casinos. Among them was Sheffield’s Meadowhall shopping centre, where facial recognition scanned the faces of over 2 million visitors in 2018. The centre is owned by British Land, which also owns land in London’s Ealing Broadway, Paddington, and Canada Water.

Like a page out of George Orwell’s 1984, the UK police are watching us, and there is not much we can do to stop it.

“Anyone can refuse to be scanned,” state the Met Police. “It’s not an offense or considered obstruction to actively avoid being scanned,” they insist.

However, most people scanned, like Petra Spence, are entirely unaware of the cameras capturing their faces.