Artificial intelligence not necessarily beneficial for LGBTI community

One of the most watched events of the year got Cynthia Weber wondering: could Sky News’ use of artificial intelligence (AI) at the wedding of Prince Harry and Meghan Markle be a good thing?

For the first time, a news broadcaster used AI facial recognition technology during a live broadcast. Cynthia, a professor of international relations and gender studies at the University of Sussex, explained that using software to name wedding guests may be a nifty trick, but she worries about its implications.

“Some claim that this technology can identify a person’s sexual orientation,” Cynthia said, speaking at an event for the International Day against Homophobia, Transphobia and Biphobia held at UNAIDS headquarters in Geneva.

She referred to a Stanford University study that analysed more than 35 000 images of white, able-bodied 18–40-year-olds from a United States dating website. The researchers compared the sexual orientations generated by their AI against the sexual orientations stated in the dating profiles, and claimed that AI facial recognition technology could determine a person’s sexual orientation with up to 30% greater accuracy than people can.

Cynthia said that LGBTI advocacy organizations labelled the study junk science: its sample was skewed in terms of race and age, and it equated sexual orientation with sexual activity. “The result is that the study’s artificial intelligence algorithm only finds what it was programmed to find: stereotypes about straights, gays and lesbians,” said Cynthia.

Cynthia believes that AI may create opportunities in many fields, but she sees far more risks and dangers than advantages for LGBTI people.

When AI meets facial recognition technology and a sexual orientation algorithm, at least four issues arise. First, privacy. Neither national nor international law treats a person’s face as protected private information, which leaves faces free to be scanned and read by anyone, from governments to Sky News.

Second, accuracy. “In a world beyond the royal wedding, artificial intelligence facial recognition technology is far from perfect, even when it just tries to match names with faces, much less when it tries to match presumed sexual orientations with faces,” Cynthia said.

Third, knowledge, which for Cynthia is the key issue. How can a sexual orientation algorithm know a person’s sexuality better than that person does? Cynthia considers the binary logic of code and computer-readable data incompatible with the vast spectrum of gender and sexuality.

Finally, Cynthia worries about how the information generated by AI will be used. “Let Sky News use it for wedding commentary, but what if the police use it in countries where homosexuality is outlawed?” Cynthia asked.

For Cynthia, AI and sexual orientation are not necessarily a beneficial mix. She understands that AI captures the imagination and drives innovation, but believes that categorizing people usually brings more harm than benefit.

Cynthia concluded by saying, “People have to make sure that artificial intelligence is ethically driven, not just technologically driven.”

The event was organized with the Swiss LGBTI Pride@Work association and UN Globe, a United Nations-wide LGBTI organization, and was held on 16 May.
