How to End Gender Bias in Internet Algorithms


Scopus-indexed articles for several gender-related search terms. Credit: Algorithms (2022). DOI: 10.3390/a15090303

Endless screeds have been written about the gender bias in the internet algorithms we interact with every day, and a simple search is enough to see it for yourself.

However, according to the researchers behind a new study seeking to reach a conclusion on this matter, “until now, the debate has not included any scientific analysis.” This new article, by an interdisciplinary team, proposes a new way of approaching this issue and suggests some solutions to prevent these deviations in data and the discrimination they entail.

Algorithms are increasingly being used to decide whether to grant a loan or to screen applications. As the range of uses of artificial intelligence (AI), as well as its capabilities and importance, increases, it becomes ever more vital to evaluate any biases associated with these operations.

“Although this is not a new concept, there are many cases where this issue has not been investigated, thus ignoring the potential consequences,” said the researchers, whose study, published open access in the journal Algorithms, focused mainly on gender biases in the different fields of AI.

Such biases can have a huge impact on society: “Prejudice affects anything that is discriminated against, excluded, or associated with a stereotype. For example, a gender or a race may be excluded in a decision process or, simply, certain behaviors may be assumed because of one’s gender or the color of one’s skin,” explained the research’s principal investigator, Juliana Castañeda Jiménez, an industrial doctoral student at the Universitat Oberta de Catalunya (UOC) under the supervision of Ángel A. Juan, of the Polytechnic University of Valencia, and Javier Panadero, of the Polytechnic University of Catalonia.

According to Castañeda, “it is possible for algorithmic processes to discriminate based on gender, even if programmed to be ‘blind’ to this variable.”
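This "blind but still biased" effect is easiest to see with a proxy variable. The sketch below is a toy illustration, not the study's method: the feature names, the population, and the decision rule are all assumptions made up for this example. The model never sees gender, yet because a career-gap feature is correlated with gender in the simulated population, approval rates still diverge.

```python
import random

random.seed(0)

def make_applicant(gender):
    # Toy population (assumption): women take career gaps more often,
    # so "career_gap" becomes a proxy for gender.
    gap = random.random() < (0.6 if gender == "F" else 0.1)
    return {"gender": gender, "career_gap": gap}

applicants = ([make_applicant("F") for _ in range(1000)]
              + [make_applicant("M") for _ in range(1000)])

def blind_model(applicant):
    # The model is "blind" to gender: it only reads the proxy feature.
    return not applicant["career_gap"]

approved = {"F": 0, "M": 0}
for a in applicants:
    if blind_model(a):
        approved[a["gender"]] += 1

rate_f = approved["F"] / 1000
rate_m = approved["M"] / 1000
print(f"approval rate F: {rate_f:.2f}, M: {rate_m:.2f}")
```

Removing the sensitive attribute from the inputs does not remove the correlation from the data, which is why "fairness through blindness" alone is not enough.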

The research group – which also includes researchers Milagros Sáinz and Sergi Yanes, both from the Gender and ICT research group (GenTIC) of the Internet Interdisciplinary Institute (IN3), Laura Calvet, from the Salesian University School of Sarrià, Assumpta Jover, from the Universitat de València, and Ángel A. Juan – illustrates this with a series of examples: the case of a well-known recruitment tool that preferred male candidates to female ones, or that of some credit services that offered women less favorable terms than men.

“If old and biased data is used, you are likely to see negative bias regarding Black, gay, and even female demographics, depending on when and where the data comes from,” Castañeda explained.

The sciences are for boys and the arts are for girls

To understand how these patterns affect the different algorithms we deal with, the researchers analyzed previous work that identified gender biases in data processing in four types of AI: natural language processing and generation, decision management, speech recognition, and facial recognition.

In general, they found that all algorithms identified and classified white men better. They also found that they reproduced false beliefs about the physical attributes that supposedly define a person according to their biological sex, ethnic or cultural background, or sexual orientation, and that they created stereotypical associations linking men to the sciences and women to the arts.
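Such stereotypical associations are typically measured by comparing distances in a learned word-embedding space. The sketch below is a toy version of that idea: the tiny hand-made vectors are assumptions invented for this example (real studies use embeddings learned from large corpora, e.g. word2vec), but the association score works the same way.

```python
import math

# Toy, hand-made 2-D "word vectors" (assumption for this sketch only).
vecs = {
    "man":     [0.9, 0.1],
    "woman":   [0.1, 0.9],
    "science": [0.8, 0.2],
    "art":     [0.2, 0.8],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def association(word, a="man", b="woman"):
    # Positive: the word sits closer to "man"; negative: closer to "woman".
    return cosine(vecs[word], vecs[a]) - cosine(vecs[word], vecs[b])

print("science:", round(association("science"), 3))  # positive: leans male
print("art:", round(association("art"), 3))          # negative: leans female
```

When embeddings are trained on text that pairs men with science and women with the arts, this score reveals the stereotype even though no one programmed it in.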

Many of the procedures used in image and voice recognition are also based on these stereotypes: cameras find it easier to recognize white faces, and audio analysis has problems with higher-pitched voices, which mainly affects women.

The cases most likely to suffer from these problems are those whose algorithms are built on the analysis of real-life data associated with a specific social context. “Some of the main causes are the underrepresentation of women in the design and development of AI products and services and the use of gender-biased datasets,” noted the researcher, who argued that the problem stems from the cultural environment in which they are developed.

“An algorithm, when trained with biased data, can detect hidden patterns in society and, when it operates, reproduce them. So, if men and women are unequally represented in society, the design and development of AI products and services will show gender bias.”

How can we put an end to all this?

The numerous sources of gender bias, as well as the peculiarities of any given type of algorithm and data set, mean that eliminating this bias is a very tough, though not impossible, challenge.

“Designers and all others involved in their design must be aware that biases may be embedded in the logic of an algorithm. They must also understand the measures available to minimize potential biases as far as possible and implement them so that such biases do not occur, because if they are aware of the types of discrimination that exist in society, they will be able to identify when the solutions they develop reproduce them,” Castañeda suggested.
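One concrete measure designers can apply is a simple bias audit of a system's decisions. The sketch below computes the demographic parity difference (the gap in positive-outcome rates between groups), a standard fairness metric; the group labels, sample counts, and any pass/fail threshold are illustrative assumptions, not something prescribed by the study.

```python
def demographic_parity(decisions):
    # decisions: list of (group, approved) pairs.
    # Returns (gap between best- and worst-treated group, per-group rates).
    counts = {}
    for group, approved in decisions:
        n, k = counts.get(group, (0, 0))
        counts[group] = (n + 1, k + int(approved))
    rates = {g: k / n for g, (n, k) in counts.items()}
    return max(rates.values()) - min(rates.values()), rates

# Illustrative audit data: 40% of women approved vs. 70% of men.
decisions = ([("F", True)] * 40 + [("F", False)] * 60
             + [("M", True)] * 70 + [("M", False)] * 30)

gap, rates = demographic_parity(decisions)
print(rates, "gap:", round(gap, 2))
```

A gap this large (0.30) would fail most audit thresholds and flag the system for review before deployment.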

This work is innovative because it was carried out by specialists in several areas, including a sociologist, an anthropologist and experts in gender and statistics. “Team members provided a perspective that went beyond the self-contained mathematics associated with algorithms, thus helping us to see them as complex socio-technical systems,” said the study’s principal investigator.

“When comparing this work with others, I think it is one of the few that presents the problem of bias in algorithms from a neutral point of view, highlighting both social and technical aspects to identify why an algorithm might make a biased decision,” she concluded.

More information:
Juliana Castaneda et al, Addressing gender bias issues in algorithmic data processing: A socio-statistical perspective, Algorithms (2022). DOI: 10.3390/a15090303

Provided by Universitat Oberta de Catalunya (UOC)

Citation: How to End Gender Bias in Internet Algorithms (2022, Nov 23). Retrieved Nov 24, 2022.

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.