How to put an end to gender biases in internet algorithms


Scopus-indexed articles for various gendered terms. Credit: Algorithms (2022). DOI: 10.3390/a15090303

Endless screeds have been written about whether the internet algorithms we constantly interact with suffer from gender bias, and a simple search is all it takes to see for yourself.

However, according to the researchers behind a new study that set out to reach a conclusion on the matter, “the debate has so far lacked scientific analysis.” Their new article, written by an interdisciplinary team, puts forward a fresh way of approaching the question and suggests some solutions to prevent these discrepancies in the data and the discrimination they cause.

Algorithms are making more and more decisions, from granting a loan to accepting applications. As the range of uses of artificial intelligence (AI) increases, along with its capabilities and importance, it becomes ever more vital to assess the possible biases associated with these operations.

“Although not a new concept, there are many instances where this issue has not been studied and thus the potential implications have been ignored,” said the researchers, whose study, published open access in the journal Algorithms, focuses mainly on gender bias in the different areas of AI.

Such prejudice can have an enormous impact on society: “Bias affects everything that is discriminated against, excluded or associated with a stereotype. For example, a gender or a race may be excluded from a decision-making process, or certain behavior may simply be assumed because of one’s gender or skin color,” explained the research’s lead investigator, Juliana Castañeda Jiménez, an industrial Ph.D. student at the Universitat Oberta de Catalunya (UOC), supervised by Ángel A. Juan of the Universitat Politècnica de Valencia and Javier Panadero of the Universitat Politècnica de Catalunya.

According to Castañeda, “It is possible for algorithmic processes to discriminate on the basis of sex, even if they are programmed to be ‘blind’ to that variable.”
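By way of illustration, here is a minimal sketch in Python, using purely synthetic data and not a method or result from the study: a classifier trained without any gender column can still reproduce a historical gap whenever a seemingly neutral feature correlates with gender and acts as a proxy.

    # A minimal sketch (synthetic data, not the study's method) of "proxy"
    # discrimination: the model never sees the gender column, yet a
    # correlated, seemingly neutral feature carries the same information.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000
    gender = rng.integers(0, 2, n)  # 0 or 1; synthetic, for illustration only

    # Historical decisions were biased: one group was approved less often.
    approved = (rng.random(n) < np.where(gender == 0, 0.7, 0.4)).astype(int)

    # A "neutral" feature (say, an occupation code) that tracks gender
    # acts as a proxy; income here is genuinely uninformative noise.
    proxy = gender + rng.normal(0.0, 0.3, n)
    income = rng.normal(50.0, 10.0, n)

    X = np.column_stack([proxy, income])  # note: gender itself is excluded
    model = LogisticRegression().fit(X, approved)

    pred = model.predict(X)
    for g in (0, 1):
        print(f"approval rate, group {g}: {pred[gender == g].mean():.2f}")
    # The "gender-blind" model reproduces the historical gap via the proxy.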

The research team – which also includes the researchers Milagros Sáinz and Sergi Yanes, both from the Gender and ICT (GenTIC) research group of the Internet Interdisciplinary Institute (IN3), Laura Calvet from the Salesian University School of Sarrià, Assumpta Jover from the Universitat de València, and Ángel A. Juan – illustrates this with some examples: the case of a well-known recruitment tool that favored male applicants over female ones, or that of some credit service providers that offered women worse terms than men.

“If old, imbalanced data are used, you’re likely to see negative conditioning with regard to black, gay, and even female demographics, depending on when and where the data came from,” Castañeda explained.

Science is for boys and the arts are for girls

To understand how these patterns affect the various algorithms we interact with, the researchers analyzed previous studies that identified gender biases in data processes in four kinds of AI: those with applications in natural language processing and generation, decision management, speech recognition, and facial recognition.

In general, they found that all the algorithms identified and classified white men better. They also found that the algorithms reproduced false beliefs about the physical traits that should define a person based on their sex, ethnic or cultural background, or sexual orientation, and that they made stereotypical associations linking men to the sciences and women to the arts.

Many methods of image and voice recognition are also based on these stereotypes: cameras recognize white faces more readily, and audio analysis has problems with higher-pitched voices, which mainly affects women.
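Gaps of this kind are typically surfaced with a simple per-group audit. As an illustration only (the labels, predictions and group assignments below are made up, not data from the study), one can compare a recognition system’s accuracy across demographic groups:

    # Hypothetical audit: compare accuracy per demographic group.
    # y_true, y_pred and group are illustrative placeholders.
    import numpy as np

    y_true = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 0])
    y_pred = np.array([1, 0, 1, 0, 0, 0, 1, 1, 1, 0])
    group = np.array(["men", "men", "men", "men", "men",
                      "women", "women", "women", "women", "women"])

    for g in np.unique(group):
        mask = group == g
        accuracy = (y_true[mask] == y_pred[mask]).mean()
        print(f"{g}: accuracy = {accuracy:.2f}")
    # A systematic accuracy gap between groups is the signature of bias.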

The cases most likely to suffer from these problems are those whose algorithms were created from the analysis of real-life data linked to a specific social context. “Some of the main causes are the under-representation of women in the design and development of AI products and services, and the use of data sets with gender bias,” noted the researcher, who argued that the problem stems from the cultural environment in which they are developed.

“When an algorithm is trained on biased data, it can detect hidden patterns in society and reproduce them in operation. So if men and women are unequally represented in society, the design and development of AI products and services will exhibit gender bias.”

How can we put an end to this?

The many sources of gender bias, as well as the idiosyncrasies of any given algorithm and dataset type, mean that eliminating this bias is a very difficult – though not impossible – challenge.

“Designers and everyone else involved in their design need to be made aware of the possibility of biases associated with an algorithm’s logic. What’s more, they need to understand the measures available for minimizing potential biases as much as possible, and implement them so that they do not occur, because if they are aware of the types of discrimination occurring in society, they will be able to identify when the solutions they develop reproduce them,” Castañeda suggested.
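One concrete example of such a measure from the wider literature (a sketch of the “reweighing” pre-processing idea of Kamiran and Calders, not a method proposed in this particular study) is to weight each training sample so that group membership and outcome become statistically independent before the model is fitted, using the same kind of synthetic data as in the earlier sketch:

    # Sketch of reweighing (Kamiran & Calders): w(g, y) = P(g) * P(y) / P(g, y),
    # so that group and outcome are independent in the weighted training set.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def reweighing_weights(group, label):
        """Return one weight per sample: w(g, y) = P(g) * P(y) / P(g, y)."""
        w = np.empty(len(label))
        for g in np.unique(group):
            for y in np.unique(label):
                mask = (group == g) & (label == y)
                if mask.any():
                    w[mask] = (group == g).mean() * (label == y).mean() / mask.mean()
        return w

    rng = np.random.default_rng(0)
    n = 10_000
    gender = rng.integers(0, 2, n)          # synthetic, illustrative only
    approved = (rng.random(n) < np.where(gender == 0, 0.7, 0.4)).astype(int)
    X = np.column_stack([gender + rng.normal(0.0, 0.3, n),
                         rng.normal(50.0, 10.0, n)])

    weights = reweighing_weights(gender, approved)
    model = LogisticRegression().fit(X, approved, sample_weight=weights)
    pred = model.predict(X)
    for g in (0, 1):
        print(f"approval rate, group {g}: {pred[gender == g].mean():.2f}")
    # With the weights applied, the per-group approval rates converge.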

This work is innovative because it was carried out by specialists from different fields, including a sociologist, an anthropologist and experts in gender and statistics. “Team members provided a perspective that went beyond the autonomous mathematics associated with algorithms, thereby helping us to view them as complex sociotechnical systems,” said the study’s lead researcher.

“If you compare this work to others, I think it is one of the few that presents the problem of bias in algorithms from a neutral point of view, highlighting both social and technical aspects to identify why an algorithm could make a biased decision,” she concluded.

More information:
Juliana Castaneda et al, Dealing with Gender Bias Issues in Data-Algorithmic Processes: A Social-Statistical Perspective, Algorithms (2022). DOI: 10.3390/a15090303

Provided by Universitat Oberta de Catalunya (UOC)

Citation: How to put an end to gender biases in internet algorithms (2022, November 23), retrieved November 23, 2022 from https://techxplore.com/news/2022-11-gender-biases-internet-algorithms.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.