For International Women’s Day last month in Berlin, some ticket machines used automatic gender recognition, a form of facial recognition software, to give female riders discounted fares. As well-intentioned as that may seem, AI researcher Os Keyes is worried about how such systems will negatively affect the lives of transgender people and those who do not conform to strict binary definitions of male or female.
Keyes, a recipient of the Ada Lovelace Fellowship from Microsoft Research, served as an expert witness on facial recognition regulation being considered by lawmakers in Washington state. Earlier this month, they were cited by a group of more than two dozen AI researchers calling on Amazon to stop selling its facial recognition software, Rekognition, to law enforcement agencies.
Take, for example, the rent-stabilized apartment buildings in New York where facial recognition systems have been proposed for building entry: a poorly made system could leave transgender or gender-neutral residents unable to open their own front door. Keyes also fears such systems could increase trans people’s encounters with law enforcement, leading to them being discriminated against, over-monitored, or killed.
Keyes is especially concerned because their analysis of historical facial recognition research found that the industry has rarely considered transgender or gender-fluid people in its work. This has led them to believe that facial recognition is an inherently transphobic technology. And although the National Institute of Standards and Technology’s (NIST) facial recognition testing program has been called a gold standard, Keyes staunchly opposes the organization, which is part of the U.S. Department of Commerce.
They take issue with NIST’s mandate to establish federal AI standards, a task detailed in Trump’s executive order, the American AI Initiative. NIST is the wrong organization to lead the creation of those standards, they said, because it employs no ethicists and has a poor history with matters of gender and race, and because its Face Recognition Vendor Test (FRVT) uses photos of exploited children and others who did not provide their consent.
Research by Keyes and others due out later this year examines the history of facial recognition software and of organizations like NIST; some of that work was shared last month in a Slate article.
Keyes recently spoke with VentureBeat about the potential dangers of automatic gender recognition (AGR), facial recognition regulation being considered by the U.S. Senate, and why they believe all government use of facial recognition software should be banned. Read more via VentureBeat