
Ending Gender Bias in Internet Algorithms

Author: Universitat Oberta de Catalunya (UOC)
Author Contact: uoc.edu/portal/en/index.html
Published: 24th Nov 2022
Peer-Reviewed Publication: Yes
DOI: https://dx.doi.org/10.3390/a15090303
Additional References: Gender Equality Publications

Summary: Many internet algorithms are based on stereotypes, leading them to associate the sciences with masculinity and the arts with femininity.

Definition

Gender Bias

Gender bias is the tendency to prefer one gender over another. Gender bias is often a form of unconscious or implicit bias. It can happen when someone unintentionally attributes certain attitudes and stereotypes to someone else.

Main Document

Endless screeds have been penned on whether the internet algorithms with which we constantly interact suffer from gender bias, and a simple search is all it takes to see this for yourself. However, according to the researchers behind a new study that seeks to reach a conclusion on this matter, "until now, the debate has not included any scientific analysis." This recent article, by an interdisciplinary team, puts forward a new way of tackling the question and suggests some solutions for preventing these deviations in the data and the discrimination they entail.

Algorithms are increasingly being used to decide whether to grant a loan or to accept applications. As the range of uses for artificial intelligence (AI) grows, along with its capabilities and importance, it becomes increasingly vital to assess any possible prejudices associated with these operations.

"Although it's not a new concept, there are many cases in which this problem has not been examined, thus ignoring the potential consequences," stated the researchers, whose study, published open access in the journal Algorithms, focused mainly on gender bias in the different fields of AI.

Such Prejudices Can Have a Huge Impact Upon Society

"Biases affect everything that is discriminated against, excluded, or associated with a stereotype. For example, a gender or a race may be excluded in a decision-making process, or certain behavior may be assumed because of one's gender or the color of one's skin," explained the principal investigator of the research, Juliana Castañeda Jiménez, an industrial doctorate student at the Universitat Oberta de Catalunya (UOC) under the supervision of Ángel A. Juan, of the Universitat Politècnica de València, and Javier Panadero, of the Universitat Politècnica de Catalunya.

According to Castañeda:

"It is possible for algorithmic processes to discriminate because of gender, even when programmed to be 'blind' to this variable."
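
The usual mechanism behind this is a proxy variable. The following is a minimal sketch (the synthetic population, the "career-gap" feature, and the threshold rule are all invented for illustration, not taken from the study): a scoring rule that never reads the gender field can still produce different approval rates by gender when it relies on a feature that correlates with gender in the underlying population.

```python
# Toy sketch: a "gender-blind" rule can still discriminate via a proxy.
# All data here are synthetic and invented purely for illustration.
import random

random.seed(0)

# Synthetic applicants: gender plus a proxy feature ("career-gap years")
# that, in this invented population, is larger on average for women.
applicants = []
for _ in range(10_000):
    gender = random.choice(["F", "M"])
    gap = random.gauss(2.0 if gender == "F" else 0.5, 1.0)
    applicants.append({"gender": gender, "gap": max(gap, 0.0)})

def approve(applicant):
    # The rule never consults gender, only the proxy feature.
    return applicant["gap"] < 1.5

def approval_rate(g):
    group = [a for a in applicants if a["gender"] == g]
    return sum(approve(a) for a in group) / len(group)

print(f"approval rate, women: {approval_rate('F'):.0%}")
print(f"approval rate, men:   {approval_rate('M'):.0%}")
```

Even though the rule is "blind" to gender, the two groups end up with very different approval rates, because the proxy feature carries the gender signal into the decision.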


The research team - which also includes researchers Milagros Sáinz and Sergi Yanes, both of the Gender and ICT (GenTIC) research group of the Internet Interdisciplinary Institute (IN3), Laura Calvet, of the Salesian University School of Sarrià, Assumpta Jover, of the Universitat de València, and Ángel A. Juan - illustrate this with several examples: the case of a well-known recruitment tool that preferred male over female applicants, or that of some credit services that offered less favorable terms to women than to men.

"If old, unbalanced data are used, you're likely to see negative conditioning about black, gay, and even female demographics, depending upon when and where the data are from," explained Castañeda.

The Sciences are for Boys, and the Arts are for Girls

To understand how these patterns affect the different algorithms we deal with, the researchers analyzed previous works that identified gender biases in data processes in four kinds of AI application: natural language processing and generation, decision management, speech recognition, and facial recognition.

In general, they found that all the algorithms identified and classified white men better than other groups. They also found that these algorithms reproduce false beliefs about the physical attributes that should define someone depending on their biological sex, ethnic or cultural background, or sexual orientation, and that they make stereotypical associations, linking men with the sciences and women with the arts.

Many of the procedures used in image and voice recognition are also based on these stereotypes: cameras find it easier to recognize white faces, and audio analysis has problems with higher-pitched voices, mainly affecting women.

The cases most likely to suffer from these issues are those whose algorithms are built based on analyzing real-life data associated with a specific social context.

"Some of the main causes are the under-representation of women in the design and development of AI products and services, and the use of datasets with gender biases," noted the researcher, who argued that the problem stems from the cultural environment in which they are developed.

"An algorithm, when trained with biased data, can detect hidden patterns in society and, when operating, reproduce them. So if, in society, men and women are unequally represented, the design and development of AI products and services will show gender biases."
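
Castañeda's point can be sketched with a toy "model" (the mini-corpus, the co-occurrence counts, and the `most_likely_subject` function are all invented for illustration): fit to text in which men co-occur more often with technical subjects and women with artistic ones, the model reproduces that skew as a prediction.

```python
# Toy sketch: a model trained on biased text reproduces the bias.
# The mini-corpus below is invented, with a deliberate skew.
from collections import Counter

corpus = [
    "he studied engineering", "he studied engineering",
    "he studied physics", "he studied art",
    "she studied art", "she studied art",
    "she studied literature", "she studied physics",
]

# "Training": count pronoun/subject co-occurrences in the corpus.
pairs = Counter()
for sentence in corpus:
    pronoun, _, subject = sentence.split()
    pairs[(pronoun, subject)] += 1

def most_likely_subject(pronoun):
    # "Prediction": return the subject seen most often with this pronoun.
    candidates = {s: c for (p, s), c in pairs.items() if p == pronoun}
    return max(candidates, key=candidates.get)

print(most_likely_subject("he"))   # the skew in the data resurfaces here
print(most_likely_subject("she"))
```

Nothing in the code mentions stereotypes; the association comes entirely from the unbalanced data the model was fit to, which is the mechanism the study describes.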

How Can We Put an End to This?

The many sources of gender bias, and the peculiarities of each given type of algorithm and dataset, mean that eliminating this bias is a very tough - though not impossible - challenge.

"Designers and everyone else involved in developing these algorithms need to be informed of the possibility of biases associated with an algorithm's logic. They also need to understand the measures available for minimizing potential biases as far as possible, and to implement them, because if they are aware of the types of discrimination occurring in society, they will be able to identify when the solutions they develop reproduce them," suggested Castañeda.

This work is innovative because it has been carried out by specialists in different areas, including a sociologist, an anthropologist, and experts in gender and statistics.

"The team's members provided a perspective that went beyond the autonomous mathematics associated with algorithms, thereby helping us to view them as complex socio-technical systems," said the study's principal investigator.

"If you compare this work with others, I think it is one of only a few that present the issue of biases in algorithms from a neutral standpoint, highlighting both social and technical aspects to identify why an algorithm might make a biased decision," she concluded.

This UOC research supports the United Nations Sustainable Development Goals (SDGs) of Gender Equality (SDG 5) and Reduced Inequalities (SDG 10).

References and Source(s):

Ending Gender Bias in Internet Algorithms | Universitat Oberta de Catalunya (UOC) (uoc.edu/portal/en/index.html). SexualDiversity.org makes no warranties or representations in connection therewith. Content may have been edited for style, clarity or length.






• (APA): Universitat Oberta de Catalunya (UOC). (2022, November 24). Ending Gender Bias in Internet Algorithms. SexualDiversity.org. Retrieved December 2, 2022 from www.sexualdiversity.org/discrimination/equality/1100.php

