An analysis of 630 billion words published online suggests that people tend to think of men when using gender-neutral terms, a sexist bias that can be learned by AI models.
April 1, 2022
According to an analysis of billions of words published online, when people use gender-neutral words such as "people" and "humanity," they tend to think of men rather than women, reflecting the gender discrimination that exists in many societies. The researchers behind the work warn that this sexist bias is inherited by artificial intelligence models trained on the same text.
April Bailey at New York University and her colleagues used statistical algorithms to analyse a collection of 630 billion words contained in 2.96 billion web pages gathered in 2017. These included informal text from blogs and discussion forums as well as more formal text written by media organisations, corporations and governments, mostly in English. They used an approach called word embedding, which derives the intended meaning of a word from how often it occurs alongside other words.
They found that words such as "person," "people," and "humanity" were used in contexts that more closely matched words such as "man," "he," and "male" than words such as "woman," "she," and "female." Because these gender-neutral words are used in the same way as words that refer to men, the team says, people may think of them as male in a conceptual sense, reflecting a male-dominated society. The team accounted for the possibility that men are over-represented as authors in the dataset and found that this did not affect the results.
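The word-embedding idea behind this kind of finding can be illustrated with a toy sketch: represent each word by its co-occurrence counts with neighbouring words, then compare words using cosine similarity. This is only a minimal illustration of the general technique, not the study's actual method or data; the corpus and word choices below are invented for the example.

```python
import math
from collections import Counter, defaultdict

# Invented toy corpus: "person" happens to appear in the same contexts as "man".
corpus = [
    "the person spoke and he smiled",
    "the man spoke and he smiled",
    "the woman sang and she smiled",
]

# Build simple co-occurrence vectors: for each word, count the words
# appearing within a +/-2 token window around it.
window = 2
vectors = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if i != j:
                vectors[word][tokens[j]] += 1

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b)

# In this toy corpus, "person" shares more contexts with "man" than "woman".
sim_man = cosine(vectors["person"], vectors["man"])
sim_woman = cosine(vectors["person"], vectors["woman"])
```

Real embedding models such as word2vec learn dense vectors from billions of tokens rather than raw counts, but the underlying signal is the same: words used in similar contexts end up with similar vectors.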
One unanswered question, the team says, is how much this depends on English. Other languages, such as Spanish, encode explicit gender information that could change the outcome. The team also did not consider non-binary gender identities or distinguish between the biological and social aspects of sex and gender.
Bailey says finding evidence of sexist bias in English is not surprising, as previous studies have suggested that words such as "scientist" and "engineer" are more closely associated with words such as "man" and "male" than with "woman" and "female." But she warns that AI tools, from language-translation websites to conversational bots, are trained on the same kinds of text collections scrutinised in this study and so inherit this bias.
"It learns from us, and then we learn from it, and we're in this reciprocal loop, reflecting it back and forth," says Bailey. "If I could snap my fingers and magically remove every individual's cognitive bias to think of a person as a man rather than a woman, we would still have this bias in our society, because it's embedded in AI tools."
Journal reference: Science Advances, DOI: 10.1126/sciadv.abm2463
More on these topics: Sexism: When people say "people" online, they may be thinking primarily about men — https://www.newscientist.com/article/2313911-when-people-say-people-online-they-may-mostly-be-thinking-about-men/?utm_campaign=RSS%7CNSNS&utm_source=NSNS&utm_medium=RSS&utm_content=home