
The Sexism in Your Favorite Algorithms

By Liesl Goecker

Apr 25, 2018


Technology has long been heralded as egalitarian. It’s the way technology is used, advocates argue, that makes it biased or wrong. But that argument breaks down when it comes to the kind of artificial intelligence increasingly powering the services we use on a daily basis: Google and social media. The algorithms that run these platforms are designed to be self-learning — to detect patterns in data, then reinforce and build upon them.

The only problem is, they’re picking up and amplifying patterns of gender bias.

It started last year, when Google Translate came under fire because its algorithm imbued genderless languages with gender stereotypes in the process of translation. For instance, the service translated the Finnish pronoun “hän,” which can mean either he or she, only into “he” in English; it translated the Turkish gender-neutral pronoun “o” into either “he” or “she” depending on the occupation in the sentence. (“He” was a doctor; “she” was a nurse.)

Now, two new examples of AI-fueled sexism are coming out.

In the first, a recent human-AI collaboration involved composing a fairy tale in the vein of the Brothers Grimm — complete with their outdated gender stereotypes. As Quartz recently reported, in “The Princess and the Fox”:

… the text constantly reminds the reader that the princess is kind, good, joyous, and, most importantly, beautiful. While she does, at one point, refuse to act according to her father’s wishes, it is the only mark of rebellion in a character whose actions otherwise seem limited to dancing, smiling, or crying.

The only other female character, the queen-mother, merely nods; the five male characters, however, get a much larger range of action.

Men’s actions are at the heart of the second example. Specifically, a recent study found men were 1.2 times more likely to ‘like’ or comment on other men’s Instagram photos than on women’s, while women were just 1.1 times more likely to engage with other women. It’s a small difference, but one with an outsized impact, thanks to the kind of recommendation algorithms that power Instagram and other social media sites. Examining data from 2014 (after Facebook bought the company, but before automated prompts made it easier to connect with friends-of-friends), researchers found that women whose photos were slightly less likely to be ‘liked’ or commented on became even less popular once recommendation algorithms were introduced.

In the sample of 550,000 users, women outnumbered men (54% to 46%), though men’s photos tended to be better received: 52% of men received at least 10 ‘likes’ or comments compared to 48% of women. When the researchers applied two widely used recommendation algorithms, the gender inequality worsened. The percentage of women connected to, or predicted to be recommended to, at least 10 other Instagram users fell further — to only 30 to 36%, depending on the algorithm applied.
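The amplification dynamic the researchers describe can be sketched as a toy “rich get richer” loop: a recommender that favors already-popular accounts compounds a small engagement gap round after round. The 52/48 starting split below echoes the raw engagement numbers above, but the update rule and exponent are purely illustrative assumptions, not the algorithms the study actually tested.

```python
# Toy illustration (NOT the study's model): a popularity-biased
# recommender widens a small initial engagement gap over time.
share_m, share_w = 0.52, 0.48  # initial engagement shares, men vs. women

for _ in range(5):
    # Each round, exposure grows slightly faster for whoever is already
    # more visible (exponent > 1 = popularity bias), then shares
    # renormalise so they still sum to 1.
    wm, ww = share_m ** 1.2, share_w ** 1.2
    share_m, share_w = wm / (wm + ww), ww / (wm + ww)

print(round(share_m, 2), round(share_w, 2))  # → 0.55 0.45
```

After only five rounds, the 52/48 split has widened to roughly 55/45; with more rounds, or a stronger popularity bias, the gap keeps growing — which is the qualitative pattern the study reports.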

Perhaps surprisingly, the gender inequality was greatest among Instagram’s superinfluencers: though women in the top 0.1 percent for engagement (those with at least 320 connections) outnumbered men (54% to 46%), the men were far more likely to be suggested to new users and to expand their networks rapidly.

“Algorithms pick up subtle patterns and amplify them,” said the study’s senior author, Augustin Chaintreau, a computer scientist at Columbia Engineering and a member of Columbia’s Data Science Institute. “We’re not asking that algorithms be blind to the data, just that they correct their own tendency to magnify the bias already there.”

The question is whether this is possible, given the patterns we’ve set.


Written By Liesl Goecker

Liesl Goecker is The Swaddle’s managing editor and has been living and writing in Mumbai since 2010. She is passionate about women’s rights, everyone’s health, and caffeine.

