Tracking Faces and Targeting Races

Damilola Omotoso / Nov 5 / Human Rights


The world in which we live is in a constant state of change and development, and the new technologies birthed over time have made our lives easier and better. Yet, what do we do when this is not the case? What happens when technological advances hinder the very lives they are supposed to help? This question is the reality for people of colour (POC) and women, who have increasingly found themselves targeted by the very technology that should protect them.

The Debates

Live Facial Recognition (LFR) identifies and locates people on the police's watchlist. Through it, the police can access a wealth of data on those deemed to be a risk to themselves or the public, or who are wanted criminals. As such, LFR should theoretically keep the societies in which it is deployed safe. However, its use reignites the ongoing debate about security vs. privacy: should the government have access to our private lives (and to what extent) if doing so keeps us safe? Regardless of where you stand on the fallacious 'if you've got nothing to hide, you've got nothing to fear' argument, there are other, perhaps more pressing, issues with the use of this technology.

Researchers at MIT and Stanford University have found that commercial-grade facial recognition systems are biased on the basis of both race and gender. Essentially, this tech reinforces and reproduces the systemic racism and gender discrimination found in the outside world. When prejudicial technology categorises and sorts people by their skin colour and gender, it cements the notion that white (or even just lighter) skin is "better". Those it misreads do not feel the same level of safety as people who are less likely to be targeted by LFR. When surveillance becomes a security measure, the over-surveilled are branded as criminals because of the inaccuracies of the tech.

 

Defective Tech

Although LFR is intended to help manage a whole population, it does not do so effectively for everyone in a diverse society; heterogeneous communities consequently suffer.

This means that women and POC (and, because race and gender intersect, women of colour (WOC) most of all) are more likely to be misrecognised by LFR than their white male counterparts. A case of mistaken identity can lead to the wrong person being stopped, a wrongful arrest, and the actual culprit going free. Recognition and identity are intrinsically linked, and it is harmful to both the individual and society as a whole when a particular person or group of people is continuously misidentified.

The inaccuracies of LFR perpetuate racial stereotypes and antiquated ideologies that masculinise Black (namely dark-skinned) women and feminise Asian men; members of these groups are not only mistaken for other people but also assigned the wrong gender. Joy Buolamwini and Timnit Gebru found that facial recognition tech performs better on lighter skin and on men, with error rates 8.1-20.6% higher for women than for men, and reaching 20.8-34.7% for darker-skinned women. Statistics like these demonstrate that POC and women do not benefit from the implementation of LFR, as it fails at its core aim of identification and verification for people in these groups. It is also important to note that LFR fails to categorise androgynous or non-binary people at all, as the system classes everyone as either male or female.

 

What Does This Mean?

Home Office data has already shown that Black, Asian and Minority Ethnic (BAME) people were 4.3 times more likely than white people to be stopped and searched by the police in England and Wales in 2018-19. That figure rose to 9.7 times when comparing Black people alone with white people. Unsurprisingly, these statistics were among the problems raised and highlighted during this summer's Black Lives Matter (BLM) protests. An increase in police powers - used most heavily by the Metropolitan Police, who preside over the area with the largest BAME population - will surprise few people, but combined with the use of LFR it is detrimental to POC. If the London Met area is already more heavily targeted than other parts of England and Wales, then deploying LFR - which is more likely to misrecognise and misidentify those with darker skin - is not useful in the region where the largest proportion of these people live. This criticism was also voiced when LFR was used at Notting Hill Carnival in 2017, a predominantly Black event celebrating Caribbean (and now also wider Black British) culture. During the event, 5 people were wrongly identified by LFR and had to prove their own identity - which should have been the job of the technology.

 

The Future

Despite human rights groups such as Big Brother Watch and Liberty calling for LFR to be banned, for now it seems to be here to stay. King's Cross and Oxford Circus are amongst the latest London locations where LFR has been used. However, a review of its use found that only 6 out of 42 matches were correct - an accuracy of roughly 14%. If LFR is to be a part of our lives, there needs to be a vast improvement in its accuracy and deployment in order to avoid unfairly targeting POC and women.

