By Siobhan Doyle
Emerging facial recognition services often mischaracterise transgender and non-binary individuals, according to new research from the University of Colorado (CU) Boulder, US. With a brief glance at a single face, such software can now categorise the gender of many men and women with high accuracy. However, the CU Boulder study found that if a face belongs to a transgender or non-binary person, these systems get their gender wrong more than one-third of the time.
“We found that facial analysis services performed consistently worse on transgender individuals and were universally unable to classify non-binary genders,” said Morgan Klaus Scheuerman, a PhD student in the university’s Information Science department.
“While there are many different types of people out there, these systems have an extremely limited view of what gender looks like.”
The study comes at a time when facial analysis technologies, which use hidden cameras to assess and characterise certain features about an individual, are becoming increasingly prevalent. They are used in applications ranging from smartphone dating apps to more controversial law enforcement surveillance systems.
Previous research has also suggested that such technology tends to be most accurate when assessing the gender of white men, but frequently misidentifies women of colour. An MIT study found that error rates in determining the gender of dark-skinned women were 20.8 per cent, 34.5 per cent and 34.7 per cent across three commercial systems.
“We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender,” said senior author Jed Brubaker, an assistant professor of Information Science. “We set out to test this in the real world.”