Author: Universitat Oberta de Catalunya (UOC)
Author Contact: uoc.edu/portal/en/index.html
Published: 24th Nov 2022 - Updated: 5th Jan 2023
Peer-Reviewed Publication: Yes
DOI: 10.3390/a15090303 (https://www.mdpi.com/1999-4893/15/9/303)
Additional References: Gender Equality Publications
Summary: Many internet algorithms are based on stereotypes, leading them to associate the sciences with masculinity and the arts with femininity.
Gender bias is the tendency to prefer one gender over another. It is often a form of unconscious or implicit bias, occurring when someone unintentionally attributes certain attitudes and stereotypes to another person.
Endless screeds have been penned on whether the internet algorithms with which we constantly interact suffer from gender bias, and all you need to do is carry out a simple search to see this for yourself. However, according to the researchers behind a new study that seeks to reach a conclusion on this matter, "until now, the debate has not included any scientific analysis." This recent article, by an interdisciplinary team, puts forward a new way of tackling the question and suggests some solutions for preventing these deviations in the data and the discrimination they entail.
Algorithms are increasingly being used to decide whether to grant a loan or accept an application. As the range of uses for artificial intelligence (AI) grows, along with its capabilities and importance, it becomes increasingly vital to assess any possible prejudices associated with these operations.
"Although it's not a new concept, there are many cases in which this problem has not been examined, thus ignoring the potential consequences," stated the researchers, whose study, published open-access in the Algorithms journal, focused mainly on gender bias in the different fields of AI.
"Biases affect everything that is discriminated against, excluded, or associated with a stereotype. For example, a gender or a race may be excluded in a decision-making process or, or certain behavior may be assumed because of one's gender or the color of one's skin," explained the principal investigator of the research, Juliana Castañeda Jiménez, an industrial doctorate student at the Universitat Oberta de Catalunya (UOC) under the supervision of Ángel A. Juan, of the Universitat Politècnica de València, and Javier Panadero, of the Universitat Politècnica de Catalunya.
According to Castañeda, "it is possible for algorithmic processes to discriminate because of gender, even when programmed to be 'blind' to this variable."
The research team - which also includes researchers Milagros Sáinz and Sergi Yanes, both of the Gender and ICT (GenTIC) research group of the Internet Interdisciplinary Institute (IN3), Laura Calvet, of the Salesian University School of Sarrià, Assumpta Jover, of the Universitat de València, and Ángel A. Juan - illustrate this with several examples: the case of a well-known recruitment tool that preferred male over female applicants, or that of some credit services that offered less favorable terms to women than to men.
"If old, unbalanced data are used, you're likely to see negative conditioning about black, gay, and even female demographics, depending upon when and where the data are from," explained Castañeda.
To understand how these patterns affect the different algorithms we deal with, the researchers analyzed previous works that identified gender biases in data processes in four fields of AI: natural language processing and generation, decision management, speech recognition, and facial recognition.
In general, they found that all the algorithms identified and classified white men better. They also found that the algorithms reproduced false beliefs about the physical attributes that should define a person depending on their biological sex, ethnic or cultural background, or sexual orientation, and that they made stereotypical associations linking men with the sciences and women with the arts.
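Associations of this kind are often quantified in word embeddings with similarity-based tests such as WEAT. The following toy sketch shows the idea; the vectors are invented for illustration, whereas a real audit would use trained embeddings such as word2vec or GloVe.

```python
# Toy sketch of a WEAT-style association measure. The 3-d "embeddings"
# below are made up to mimic the reported science/arts pattern; real
# tests use trained embeddings (word2vec, GloVe, etc.).
import numpy as np

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

emb = {
    "he":      np.array([1.0, 0.1, 0.0]),
    "she":     np.array([0.1, 1.0, 0.0]),
    "science": np.array([0.9, 0.2, 0.3]),
    "art":     np.array([0.2, 0.9, 0.3]),
}

for target in ("science", "art"):
    score = cosine(emb["he"], emb[target]) - cosine(emb["she"], emb[target])
    print(f"{target}: male-association score {score:+.2f}")
# A positive score means the word sits closer to "he" than to "she".
```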
Many of the procedures used in image and voice recognition are also based on these stereotypes: cameras find it easier to recognize white faces, and audio analysis has problems with higher-pitched voices, mainly affecting women.
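Disparities like these are usually surfaced by disaggregated evaluation: computing a system's accuracy per demographic group rather than as one overall figure. Below is a minimal sketch, again with synthetic stand-in data rather than results from the study.

```python
# Minimal per-group accuracy audit with synthetic data. A real audit
# would replace `pred` with the output of a face- or speech-recognition
# system on a labelled, demographically annotated test set.
import numpy as np

rng = np.random.default_rng(1)
n = 5_000
group = rng.choice(["group A", "group B"], n)
truth = rng.integers(0, 2, n)
# Simulate a recognizer that errs four times more often on group B.
err_rate = np.where(group == "group A", 0.05, 0.20)
pred = np.where(rng.random(n) < err_rate, 1 - truth, truth)

for g in ("group A", "group B"):
    mask = group == g
    acc = (pred[mask] == truth[mask]).mean()
    print(f"{g}: accuracy {acc:.1%}")
```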
The systems most likely to suffer from these issues are those whose algorithms are built by analyzing real-life data tied to a specific social context.
"Some of the main causes are the under-representation of women in the design and development of AI products and services, and the use of datasets with gender biases," noted the researcher, who argued that the problem stems from the cultural environment in which they are developed.
"An algorithm, when trained with biased data, can detect hidden patterns in society and, when operating, reproduce them. So if, in society, men and women have unequal representation, the design and development of AI products and services will show gender biases."
The many sources of gender bias and the peculiarities of each given type of algorithm and dataset mean that doing away with this deviation is a very tough - though not impossible - challenge.
"Designers and everyone else involved in their design need to be informed of the possibility of the existence of biases associated with an algorithm's logic. What's more, they need to understand the measures available for minimizing, as far as possible, potential biases, and implement them so that they don't occur, because if they are aware of the types of discrimination occurring in society, they will be able to identify when the solutions they develop reproduce them," suggested Castañeda.
This work is innovative because it has been carried out by specialists in different areas, including a sociologist, an anthropologist, and experts in gender and statistics.
"The team's members provided a perspective that went beyond the autonomous mathematics associated with algorithms, thereby helping us to view them as complex socio-technical systems," said the study's principal investigator.
"If you compare this work with others, I think it is one of only a few that present the issue of biases in algorithms from a neutral standpoint, highlighting both social and technical aspects to identify why an algorithm might make a biased decision," she concluded.
This UOC research supports the Sustainable Development Goals (SDGs), in particular SDG 5, Gender Equality, and SDG 10, Reduced Inequalities.
Ending Gender Bias in Internet Algorithms | Universitat Oberta de Catalunya (UOC) (uoc.edu/portal/en/index.html). SexualDiversity.org makes no warranties or representations in connection therewith. Content may have been edited for style, clarity or length.
Latest Gender Equality Publications

The above information is from our reference library of resources relating to Gender Equality, which includes:

• Gender Equality Progress Is Stalling (27th Jul 2023): Gender equality has recovered to pre-pandemic levels, but the pace of progress has slowed, with Iceland remaining the most gender-equal country, followed by Norway, Finland, New Zealand and Sweden.
• Gender Inequality Among Scientific Editors (16th Jan 2023): Women are consistently underrepresented among editors, and female editors are less likely to publish their research in the journals they edit.
• Men Deterred from HEED Careers Due to Male Gender Bias (24th Dec 2022): Bias against men in health care, early education, and domestic (HEED) fields has been documented, and the study sought to gauge the impact of that bias.
• Ending Gender Bias in Internet Algorithms (24th Nov 2022, updated 5th Jan 2023): Many internet algorithms are based on stereotypes, leading them to associate the sciences with masculinity and the arts with femininity.
• (APA): Universitat Oberta de Catalunya (UOC). (2022, November 24). Ending Gender Bias in Internet Algorithms. SexualDiversity.org. Retrieved September 23, 2023 from www.sexualdiversity.org/discrimination/equality/1100.php
• Permalink: https://www.sexualdiversity.org/discrimination/equality/1100.php (Ending Gender Bias in Internet Algorithms)