Lambeth Bridge. October 15th. Young protesters on their way to join the Extinction Rebellion action were stopped by a dozen police officers. ID was required to cross the bridge. This identification process, carried out on political grounds, has been denounced by several rights groups, including Netpol, an organisation that monitors police abuses. The Metropolitan Police also uses facial recognition on a daily basis, and many organisations are trying to raise awareness of the practice.
In a September joint statement posted by Big Brother Watch, more than 25 race equality, rights and technology organisations called for an end to facial recognition in public surveillance.
Samir Jeraj, the Policy and Practice Officer of the Race Equality Foundation, explained via email that Big Brother Watch, a civil liberties organisation, had ‘coordinated the statement, so they reached out to organisations that had either expressed concern before or had done work on similar issues. For example, we’d done some work on data-sharing by the Home Office.’
It is not surprising that rights groups and race equality organisations are worried about facial recognition. A BBC feature recently revealed that the system still fails to identify people with dark skin. NEC’s NeoFace technology, used by the police to scan images, calculates the distances between people’s eyes, nose, mouth and jaw. Yet, according to the report, the system often fails to map the features of people with darker skin correctly. Despite acknowledging that improvements had to be made, the Home Office told the BBC the technology would be used regardless.
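The landmark-distance approach described above can be sketched in a few lines. This is a deliberately simplified illustration, not NEC’s actual algorithm: the landmark coordinates are invented for the example, and a real system would detect them automatically from an image before comparing faces.

```python
import math

# Illustrative facial landmarks as (x, y) pixel coordinates.
# These values are invented for the example.
landmarks = {
    "left_eye": (120, 90),
    "right_eye": (180, 90),
    "nose": (150, 130),
    "mouth": (150, 170),
    "jaw": (150, 210),
}

def feature_vector(points):
    """Pairwise distances between landmarks, normalised by the
    inter-eye distance so the vector is scale-invariant."""
    names = sorted(points)
    eye_dist = math.dist(points["left_eye"], points["right_eye"])
    return [
        math.dist(points[a], points[b]) / eye_dist
        for i, a in enumerate(names)
        for b in names[i + 1:]
    ]

def dissimilarity(v1, v2):
    """Euclidean distance between two feature vectors: lower means
    more similar. A match is declared below some threshold."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(v1, v2)))

v = feature_vector(landmarks)
print(len(v))  # 5 landmarks -> 10 pairwise distances
```

If landmark detection is less accurate on darker skin, as the BBC report suggests, the error propagates directly into every distance in the vector, which is one plausible mechanism for the misidentification bias the rights groups describe.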
This is now a far-reaching concern: San Francisco even banned the use of facial recognition in May 2019. In an age of globalisation, the digital strategies and regulations adopted abroad matter, especially when it comes to human rights.
‘The data on BME/POC being more likely to be misidentified came from the US, and we understood it to be likely to be an issue for the tech being used in the UK. The lack of transparency and accountability on its use in the UK didn’t help alleviate these concerns,’ Samir Jeraj said.
For the Race Equality Foundation representative, deploying facial recognition despite these consequences marks a turning point: ‘The UK traditionally operates through a principle of “policing by consent”, which this type of tech violates without that open conversation’.
The UK police have thus been urged to address the problems with facial recognition and to open its biases up to wider debate, as surveillance is a growing concern. Samir Jeraj reported that Black and minority ethnic people were ‘more aware of the bias issues’ because they often experience a ‘higher level of policing and surveillance’.
London is already familiar with surveillance, as cameras are omnipresent, whether on TfL transport or in the streets. One of London’s main CCTV installers, Caught on Camera, has estimated that a Londoner is filmed 300 times a day.
While face scanning is an undeniable technological breakthrough, it is also exposing the many ambiguities of the law. Hence the ongoing legal challenges against police use of facial recognition at public events.
The Notting Hill Carnival and the 2015 Download Festival in Castle Donington are just two examples of this wide-ranging phenomenon. The former is known as a technological failure for the police, as wrongful arrests were made because of misidentification; the latter was the first use of the technology at an outdoor event, in 2015.
This is all the more important as the police database now holds more than 20 million faces. With multiple legal challenges under way, police legitimacy may be at stake. In September, however, the UK High Court ruled that South Wales Police’s use of the technology to identify suspects in the street was lawful. Welsh officers can now use a facial-scanning app on their phones, paving the way for other forces. Police in Scotland are now using facial recognition retrospectively and could adopt live facial-scanning technology.
In July 2019, once the trials had come to an end, the Metropolitan Police updated a statement on their official website. Security is the argument put forward: the facial recognition system feeds a database that is supposed to both deter and prevent offences. Captured images are compared by police officers against a ‘watch list’ of people who have been in custody before. Depending on the outcome, a suspect can be arrested.
Privacy is also addressed in the statement. The police particularly highlight the monthly erasure of data.
‘The system will only keep faces matching the watch list, these are kept for 30 days, all others are deleted immediately. We delete all other data on the watch list and the footage we record. […] Anyone can refuse to be scanned; it’s not an offence or considered ‘obstruction’ to actively avoid being scanned.’
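The retention policy the Met describes can be modelled in a short sketch. This is a hypothetical toy model written to make the stated rules concrete, not the force’s actual software: non-matching faces are never stored, and matches expire after 30 days.

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=30)

class ScanStore:
    """Toy model of the stated retention policy: non-matches are
    deleted immediately, watch-list matches are kept for 30 days."""

    def __init__(self):
        self.records = []  # list of (captured_at, face_id) tuples

    def ingest(self, face_id, matched_watch_list, now):
        # Only faces matching the watch list are ever stored.
        if matched_watch_list:
            self.records.append((now, face_id))

    def purge(self, now):
        # Drop any record older than the 30-day retention window.
        self.records = [(t, f) for (t, f) in self.records
                        if now - t < RETENTION]

store = ScanStore()
store.ingest("face-001", matched_watch_list=True,
             now=datetime(2019, 10, 1))
store.ingest("face-002", matched_watch_list=False,
             now=datetime(2019, 10, 1))

store.purge(now=datetime(2019, 10, 15))
print(len(store.records))  # 1: the match is still within 30 days

store.purge(now=datetime(2019, 11, 5))
print(len(store.records))  # 0: 35 days later, the match is purged
```

Even in this simplified form, the model shows that everything hinges on how the watch list itself is compiled, which is exactly where critics locate the problem.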
Yet although the police cite transparency, saying officers hand out leaflets wherever facial recognition is in operation, they tend to turn a blind eye to discrimination bias.
Rights groups are not the only ones insisting the issue can no longer be ignored. Several MPs are calling for a ban on facial recognition cameras on the grounds that the technology infringes human rights. A Scottish Parliament sub-committee is due to investigate the legal basis of the technology, and the inquiry intends to oversee the use of facial-scanning data through the creation of a Biometrics Commissioner.
While debated in Holyrood, the matter is also being raised by many organisations across the UK, such as Liberty, Race On The Agenda (ROTA) and Netpol. The MPs who signed the joint statement on Big Brother Watch’s website aim to open a broader discussion, whether about the privacy violations or the discrimination implied by facial recognition. With Extinction Rebellion protesters now exposed to facial recognition systems, the stakes are rising, especially as civil disobedience blurs the boundaries of the ‘watch list’ held by the police.