A new, cross-linguistic study has found that languages predict people’s implicit biases — languages with greater gender biases tend to have speakers with greater gender biases.
Published in Nature Human Behaviour, the study examined 25 languages and 657,335 participants across 39 countries to understand how “linguistic associations shape people’s implicit judgments.” The 25 languages studied by the researchers were: English, Hindi, Indonesian, Filipino, Malay, Dutch, Danish, Turkish, Arabic, Persian, Norwegian, Swedish, Korean, Finnish, Mandarin, Japanese, Italian, Hebrew, Portuguese, German, Croatian, French, Polish, Romanian, and Spanish.
The researchers began their exploration on the premise that two major sources of information contribute to gender stereotypes: first, direct experience, e.g., observing that most nurses are women may lead someone to implicitly conclude that women are better suited for nursing; and second, language. This premise led them to conclude: “to the extent that language is a source of information for forming cultural stereotypes, two people with similar direct experiences, but different linguistic experiences, may develop different stereotypes.”
To determine the degree of gender bias implicit in specific languages, the researchers trained machine-learning models on the distributional statistics of texts in all 25 languages, examining whether, in comparison to “woman,” “man” was more likely to co-occur with words like “career,” “professional,” “job,” and “money.” In nearly all the languages studied, they found a strong relationship between words related to men and words related to career, on the one hand, and words related to women and words related to family, on the other, though the strength of these associations varied across languages. The researchers then ran implicit association tests to gauge whether these linguistic stereotypes influence speakers’ psychological attitudes, that is, whether speakers associate men with paid work and women with caring for the home and family. They concluded that languages in which gender stereotypes are more strongly embedded do indeed have speakers with stronger gender stereotypes and biases.
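The co-occurrence measure described above resembles word-embedding association tests used in computational linguistics: words that appear in similar contexts end up with similar vectors, so the gap in similarity between gendered words and career/family words can serve as a bias score. The sketch below is purely illustrative, using hand-made toy vectors and cosine similarity as an assumed association measure; the actual study trained models on large corpora in each language.

```python
import math

# Toy 3-dimensional word vectors, chosen only to illustrate the idea.
# Real embeddings would be learned from each language's text corpus.
vectors = {
    "man":    [0.9, 0.1, 0.3],
    "woman":  [0.1, 0.9, 0.3],
    "career": [0.8, 0.2, 0.1],
    "family": [0.2, 0.8, 0.1],
}

def cosine(u, v):
    """Cosine similarity: how strongly two words co-occur in context."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def association(word, attr_a, attr_b):
    """How much more strongly `word` associates with attr_a than attr_b."""
    return cosine(vectors[word], vectors[attr_a]) - cosine(vectors[word], vectors[attr_b])

# A positive gap means "man" leans toward "career" while "woman" leans
# toward "family" -- the bias pattern measured per language in the study.
bias = association("man", "career", "family") - association("woman", "career", "family")
print(f"career/family bias score: {bias:.3f}")
```

In this toy setup the score comes out positive; comparing such scores across languages is, roughly, how a language-level bias ranking can be built.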
“The consequences of these results are pretty profound. The results suggest that if you speak a language that is really biased then you are more likely to have a gender stereotype that associates men with career and women with family,” Molly Lewis, a cognitive scientist in the Department of Social and Decision Sciences at Carnegie Mellon University, who co-authored the study, told Women’s Agenda.
In the second part of the study, the researchers delved deeper into the structural aspects of languages as well. The first facet they analyzed was the presence of 20 gender-specific occupation terms, such as “waiter” and “waitress,” in each of the 25 languages. Employing the same machine-learning models used before, for each of these occupation terms, the researchers examined the linguistic gender association to males and females, i.e., the extent to which each occupation term was statistically associated with a specific gender within that language. “What we found was that languages that make more of those kind of gender distinctions in occupations were more likely to have speakers with a stronger gender stereotype,” Lewis explained to Scientific American in an interview.
However, regarding the first part of the study, the researchers pointed out that the stronger gender biases in some languages’ distributional statistics could simply reflect pre-existing gender-based associations entrenched in those countries’ cultures. At the same time, by reflecting these stereotypes, languages play a role in furthering the stereotypes they reflect, “thereby constituting a causal influence on the associations people learn,” creating a vicious cycle that ultimately propagates gender bias.
“…the language that you’re speaking could be shaping your psychological stereotypes. I think this work tells us one mechanism whereby stereotypes are formed. And I think this gives us a hint of how we could possibly intervene and, ultimately, change people’s stereotypes. One promising future direction is changing which books are being read to children — or which digital media are being given to children. And that might alter the stereotypes developed,” Lewis suggested.