Lately, social media giants have been doing a lot of apologizing over LGBT foul-ups.
Each individual incident has sparked outrage and generated national headlines, but taken together they paint a picture of an emerging crisis for these companies: When marginalized groups depend on your platform to build community, relying too heavily on algorithms can have unintended—and sometimes hurtful—consequences.
The roots of that crisis came into clearer focus last March when LGBT YouTubers noticed that many of their videos had been categorized as “restricted”—or “potentially inappropriate” for viewers who opt to turn on a content filter.
The company apologized on Twitter: “Sorry for all the confusion with Restricted Mode.” In a subsequent statement to The Daily Beast, YouTube explained that “some videos” had been “incorrectly labeled by our automated system.”
Sometimes the symptoms of this crisis look complicated, but the cause is simple. In June 2017, Tumblr said it was “deeply sorry” for categorizing some LGBT-related posts as not suitable for work, or NSFW, in its Safe Mode. But the underlying issue, Engadget explained, was that Tumblr had flagged as inappropriate every post from users who had marked their blogs as explicit, whether or not the posts themselves were NSFW.
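To see why that kind of blanket rule misfires, here is a minimal illustrative sketch of the behavior Engadget described. It is not Tumblr's actual code, and the field names are hypothetical; it only shows how keying the decision to a blog-level setting hides safe-for-work posts along with explicit ones.

```python
# Illustrative sketch only -- not Tumblr's code. Field names are hypothetical.
# The reported behavior: Safe Mode hid every post from a blog marked explicit,
# whether or not the individual post was NSFW.

def hidden_in_safe_mode(post: dict, blog: dict) -> bool:
    # The decision depends only on the blog-level flag, so a safe-for-work
    # post on an "explicit" blog is still hidden.
    return blog["marked_explicit"]
    # A per-post rule would instead inspect the post itself,
    # e.g. return post["is_nsfw"].

blog = {"marked_explicit": True}
posts = [
    {"id": 1, "is_nsfw": False},  # safe-for-work post
    {"id": 2, "is_nsfw": True},   # genuinely NSFW post
]
for post in posts:
    print(post["id"], hidden_in_safe_mode(post, blog))  # True for both posts
```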
But more often than not, these PR hiccups appear to have been due to technical systems behaving in unanticipated and convoluted ways. In November 2017, Twitter apologized after images posted under the hashtag #bisexual were filtered out of search results.
In a tweet, the company later blamed “a technical issue,” explaining that it tries to “identify sensitive media” by compiling “a list of terms that frequently appear alongside adult content.” In layman’s terms, because “bisexual” sometimes appears in pornographic posts, it was deemed guilty by association and put on the naughty list. Read more via Daily Beast
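A rough sketch of how that kind of guilt-by-association list can come together, assuming toy data, a made-up threshold, and hypothetical variable names rather than anything Twitter has disclosed: any term that co-occurs with adult content "often enough" gets flagged, so an identity term like "bisexual" lands on the list alongside genuinely explicit terms.

```python
# Illustrative sketch only (not Twitter's system): a naive "sensitive terms"
# list built from co-occurrence with adult content sweeps up identity terms.
# All data, thresholds, and names here are hypothetical.

from collections import Counter

labeled_posts = [
    ({"bisexual", "pride", "flag"}, False),     # ordinary post
    ({"bisexual", "support", "group"}, False),  # ordinary post
    ({"bisexual", "xxx"}, True),                # adult post
    ({"nsfw", "xxx"}, True),                    # adult post
]

term_total = Counter()
term_adult = Counter()
for terms, is_adult in labeled_posts:
    for term in terms:
        term_total[term] += 1
        if is_adult:
            term_adult[term] += 1

# Flag any term that appears alongside adult content "often enough".
THRESHOLD = 0.3
sensitive_terms = {term for term in term_total
                   if term_adult[term] / term_total[term] >= THRESHOLD}

print(sensitive_terms)
# "bisexual" ends up on the list because one of its three appearances was in
# an adult post -- guilt by association.
```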