Tre’Andre Valentine is the executive director of the Massachusetts Transgender Political Coalition.
In May, San Francisco became the first city in the world to ban the municipal government’s use of face surveillance technology. Days later, Somerville, Mass., became the first city on the East Coast to do the same. Now — thanks to a movement led by dozens of civil rights organizations nationwide — municipalities and states across the country are debating the government’s use of a technology that poses unprecedented threats to our civil rights and civil liberties.
Face recognition systems use computer algorithms paired with databases to analyze and classify images of human faces in order to identify or track people. The technology is currently entirely unregulated in the United States, yet police departments and other government agencies are using it anyway, too often in secret. And it's not like what you've seen on cop shows such as CSI: face recognition doesn't always work. The inaccuracies are particularly damaging for certain groups of people, namely Black women and trans and nonbinary people.
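To make that identification step concrete, here is a minimal sketch in Python, assuming faces have already been converted into numeric "embedding" vectors by some upstream model; the function name, the similarity threshold, and the toy database are hypothetical illustrations, not any agency's actual system.

```python
import numpy as np

def identify(probe_embedding, database, threshold=0.6):
    """Return the name of the closest database entry, or None.

    probe_embedding: vector for the face being searched.
    database: dict mapping a person's name to their stored embedding.
    threshold: minimum cosine similarity required to declare a match.
    """
    best_name, best_score = None, -1.0
    for name, stored in database.items():
        # Cosine similarity between the probe face and a stored face.
        score = np.dot(probe_embedding, stored) / (
            np.linalg.norm(probe_embedding) * np.linalg.norm(stored)
        )
        if score > best_score:
            best_name, best_score = name, score
    # Below the threshold the system reports no match; the threshold
    # itself is a policy choice, and misidentifications still occur.
    return best_name if best_score >= threshold else None

# Toy example with random vectors standing in for real embeddings.
rng = np.random.default_rng(0)
db = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
probe = db["person_a"] + rng.normal(scale=0.1, size=128)  # noisy re-capture
print(identify(probe, db))  # likely "person_a"
```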
A study by Massachusetts Institute of Technology researcher Joy Buolamwini, for example, found significant racial and gender bias in face recognition algorithms: they can misclassify Black women's faces nearly 35 percent of the time, yet almost always get it right when classifying white men's faces.
Moreover, this technology too often fails to account for transgender and nonbinary people. Studies show face recognition products sold by Amazon, IBM, and Microsoft consistently misclassify people in our community.
A critical shortcoming of this technology is that it has been programmed to read people as either male or female — a technological assertion that the gender binary is immovable, fixed, and here to stay. Even within the confines of this rigid binary, the tech has an extremely retrograde view of what “male” and “female” look like. For example, systems can be programmed to recognize short hair as a “male” trait or makeup as a “female” characteristic. These outcomes reflect choices made by computer programmers about which images they will use to train algorithms as well as how those training data are classified.
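To illustrate how those programmer choices bake the binary in, here is a minimal sketch of such a classifier; the two features (a hair-length measurement and a makeup score) and the tiny labeled dataset are hypothetical stand-ins for the kinds of cues described above, not any vendor's real model.

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical training data labeled by the programmers themselves.
# Features: [hair_length_cm, makeup_score]; labels: only two allowed values.
X = [[3, 0.0], [5, 0.1], [4, 0.0],     # examples labeled "male"
     [30, 0.8], [25, 0.9], [35, 0.7]]  # examples labeled "female"
y = ["male", "male", "male", "female", "female", "female"]

clf = LogisticRegression().fit(X, y)

# The model can only ever answer within the labels it was given,
# and it has learned that short hair implies "male".
print(clf.predict([[6, 0.0]]))   # a short-haired person -> ['male']
print(clf.classes_)              # ['female' 'male']: no other option exists
```

Scaling this up to millions of images does not change the structural point: the two-value label set and the cues the model is allowed to learn from are decided by people before training ever begins.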