Technology is amazing and the more we have, the better — or at least, that’s often what we’re told.
What if it opens up some communities to increased policing and surveillance? Do we try to fix it, or do we cut our losses?
These are some of the questions defining conversations around facial recognition.
Face recognition is a “method of identifying or verifying the identity of an individual using their face,” according to the Electronic Frontier Foundation.
It’s a form of biometric identification (like fingerprinting) because it uses body measurements and physical characteristics to match a scan to a person.
Facial recognition is the technology Facebook previously used to tag people in photos, and the reason you can unlock your phone with just a look.
But its use goes deeper than that.
In the United States, at least 42 federal agencies use facial recognition technology, and real-time facial recognition has been deployed in cities like Detroit and Chicago.
If you’re not a cis white guy, chances are high that facial recognition won’t be able to do the one thing it’s meant to do: identify you.
This is a problem across companies and algorithms. In 2019, researchers testing popular services from Amazon, Clarifai, IBM, and Microsoft found they were unable to classify transgender or nonbinary people.
You could dedicate entire lectures to unpacking why facial recognition is so bad at “seeing” people. But to boil it down, remember:
The algorithms making up facial recognition have to be built by somebody. And those somebodies will build their own implicit biases into those algorithms.
Think of how rampant misogynoir is in the U.S. Now, consider that a 2019 study found Amazon's Rekognition often classified dark-skinned women as men.
As many as 1 in 4 police departments across the U.S. can access facial recognition — and its use is largely unregulated.
Facial recognition has already led to false accusations against Black men, and law enforcement officers have used it to target protesters.
Often, these databases are built without people’s knowledge. In 2019, IBM released its “Diversity in Faces” dataset to reduce bias in facial recognition. Nearly a million photos were pulled from Flickr — but most of the people photographed had no idea.
That same year, The Washington Post reported that state driver’s license photos are also a “gold mine” for facial recognition.
The companies making facial recognition technology will always argue that it should exist because, well, they’re making money.
Similarly, law enforcement will argue in favor of facial recognition because it gets them more money in their budgets.
Facial recognition is ultimately a surveillance technology.
What happens when law enforcement is able to perfectly identify everybody at the next protest against police brutality?
Many advocates say there’s no need to try “improving” facial recognition when we could instead just acknowledge that it doesn’t need to exist.
As one critic put it: “In a country where crime prevention already associates Blackness with inherent criminality, why would we fight to make our faces more legible to a system designed to police us? … It is not social progress to make Black people equally visible to software that will inevitably be further weaponized against us.”
Several cities, including San Francisco, Boston, and Somerville, Massachusetts, have banned the technology’s use in police investigations and municipal surveillance programs.
And multiple states, including Maine, have also banned most government use of facial recognition.
These bans have their limitations, though. Most of them focus on government use of the technology, while allowing private entities to continue playing with it largely unchecked.
Plus, city- or state-level bans don’t have any bearing on the federal government.
Companies should not be able to profit off of surveilling people. And individuals shouldn’t have to constantly worry about being watched.
A number of organizations, individuals, and communities have taken up the fight against facial recognition — as well as an overall reimagining of how technology can better serve marginalized communities.
If you want to learn more, check out Detroit Community Tech, the Surveillance Technology Oversight Project, and the Lucy Parsons Labs.