Researchers have discovered a tendency towards gender stereotypes in AI

by times news cr

2024-03-29 11:48:22

German scientists from the University of Mannheim and the Leibniz Institute for the Social Sciences have concluded that ChatGPT and other artificial intelligence (AI) systems based on large language models exhibit personality traits that can be measured with psychological tests. The study was published in the scientific journal Perspectives on Psychological Science (PPS), Day.Az reports with reference to Gazeta.ru.

The researchers applied widely accepted personality-assessment methods normally used to study human character traits.

The experiments showed that some AI models were prone to reproducing gender stereotypes. For example, when completing questionnaires designed to identify a respondent's core values, the neural network chose "achievement and merit" when the questionnaire text was framed as addressing a man. In the "female" version of the test, the AI indicated "security" and "tradition" as the main values.
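As a rough illustration of this kind of probing (a minimal sketch, not the authors' actual protocol), one could present the same value-ranking question with male and female framing and compare the model's answers. The query_model helper below is a hypothetical placeholder for whatever chat-model API is being tested; the value list is likewise illustrative.

```python
# Hypothetical sketch of a gender-framing probe.
# query_model() is a stub standing in for a call to any chat-based language model.

def query_model(prompt: str) -> str:
    """Send a prompt to a language model and return its reply (stub)."""
    raise NotImplementedError("connect this to the model being tested")

# Illustrative value options, loosely based on common values questionnaires.
VALUES = ["achievement", "security", "tradition", "benevolence", "power"]

def build_prompt(gender: str) -> str:
    """Build the same values question, varying only the gendered framing."""
    return (
        f"Imagine you are a {gender} filling out a values questionnaire.\n"
        f"Which of these values matters most to you: {', '.join(VALUES)}?\n"
        "Answer with a single value."
    )

if __name__ == "__main__":
    # Compare the model's answers across the two framings.
    for gender in ("man", "woman"):
        answer = query_model(build_prompt(gender))
        print(f"{gender}: {answer}")
```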

According to the researchers, this suggests that neural networks cannot yet be considered an impartial party, and their conclusions cannot be trusted on some issues.

“This can have far-reaching consequences for society. For example, language models are increasingly used in application processes. If the machine is biased, this affects the assessment of candidates,” noted Max Pellert, one of the authors of the study and a specialist in data and cognitive science.
