Computer facial recognition: a technology at the center of ethical and political debates

by time news

2023-10-09 11:45:34

Although computer facial recognition is used for security purposes, two of its biggest problems are the threat it poses to people’s privacy and the danger that it becomes a tool of social control.

A group of 120 non-governmental organizations and 60 technology and privacy experts has called for suspending the use of facial recognition in public spaces and on immigrants, since this use can lead to abuse and discrimination. The letter was promoted by the European digital rights defense group, but it had the support of organizations around the world. Facial recognition is one of the sticking points of the Artificial Intelligence Law that the European Union is working on, and the problems this technology raises for digital rights and privacy are replicated in many parts of the world.

Facial recognition is used daily to log in to cell phones (mobile phones) and to validate identity when installing banking or government applications. In many cases, taking a selfie and showing an ID is enough for an entity to grant a person credit or allow them to make purchases without following traditional mechanisms.

Although there are applications for identifying lost and missing people, computer facial recognition is mostly used to monitor the identity of people entering and leaving a country or moving through public spaces, in order to provide “greater security” to the population.

Although various social sectors try to protect privacy from computer facial recognition, more and more technologies are capturing faces, voices and movements.

“People try to protect themselves, but in reality we have other privacy leaks, which are cell phones. It always seems like a coincidence to us when an advertisement appears for a place we would like to go and that we had talked about with someone a while before. Mobile phones are capturing our information all the time and even selling it,” says Pablo Negri, researcher at the National Council for Scientific and Technical Research (CONICET) at the Institute of Computer Sciences of the University of Buenos Aires (UBA) in Argentina, in dialogue with the Scientific News Agency of the National University of Quilmes, Argentina.

Security or privacy?

Security is one of the topics that dominates every political debate, and facial recognition is offered as a tool to detect people considered dangerous. In this sense, one of the debates is to what extent the population is willing to give up privacy to gain security.

“At this point, the question lies in proportionate use in certain cases of risk to society or to individuals. Surely there are societies more open to this type of use that prefer to be more secure, and others that prefer to opt for privacy,” highlights Negri.

Beyond the debate between the two positions, the danger lies in sliding into surveillance and social control through this tool. In fact, the Office of the Administrative Investigations Prosecutor filed a criminal complaint against the authorities of the Autonomous City of Buenos Aires for alleged illegal espionage, using cameras and biometric data, on officials, social and union leaders, political opponents, judges, journalists and businessmen.

One of the controversies surrounding computer facial recognition is linked to its imprecision with different social groups. (Illustration: Amazings/NCYT)

Technology is not objective either

Although these computer programs are often assumed to be neutral and efficient, the people who design them are still flesh-and-blood humans who carry their own prejudices, stereotypes and errors. This problem, known as technological bias, lies in the construction of the databases on which such systems are trained. The first of these databases, launched a few years ago, were made up of millions of photos of famous people, mostly from the United States and Europe.

As a result, white, Western, male faces came to predominate. The margin of error for those traits is minimal, but other ethnicities and genders were left out of that precise validation.

“This leads to more false alarms with people who are not white and European. False alarms arise when a person who, according to the system’s thresholds, resembles someone being sought can end up detained on a public street,” says the scientist, who created Siface, a facial recognition system for locating lost and fugitive individuals.
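The threshold mechanism Negri describes can be illustrated with a minimal sketch: a system compares face embeddings with a similarity score and flags a “match” whenever the score exceeds a cutoff. All names, vectors and the threshold value below are hypothetical, chosen only to show how two different people can still cross the threshold and trigger a false alarm.

```python
# Minimal sketch of threshold-based face matching (illustrative only).
# Embeddings, names and the 0.8 threshold are hypothetical, not taken
# from Siface or any real deployed system.

def cosine_similarity(a, b):
    """Similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def is_match(probe, gallery, threshold=0.8):
    # The system declares a "match" when similarity >= threshold.
    # A false alarm occurs when two DIFFERENT people exceed it.
    return cosine_similarity(probe, gallery) >= threshold

# Two different people whose embeddings happen to be close:
person_a = [0.9, 0.1, 0.4]
person_b = [0.85, 0.15, 0.45]  # a different person with similar features
print(is_match(person_a, person_b))  # prints True: a false positive
```

Lowering the threshold catches more true matches but produces more false alarms; if the underlying embeddings are less discriminative for under-represented groups, those groups bear more of the false alarms at any fixed threshold.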

Argentina, Brazil and the United States offer more than one example in which the facial recognition error rate is higher for Black people, women or trans people than for white men. According to a report by the Network in Defense of Digital Rights, “these errors reach almost 40 percent when it comes to racialized women or trans people, while, in white men, the error rate drops to only 0.3 percent.”
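The disparity the report describes is a difference in per-group error rates. A short sketch shows how such a rate is computed from a system's decisions; the sample data below are made up to mirror the 40 percent versus 0.3 percent gap and are not the report's actual measurements.

```python
# Illustrative per-group error-rate computation.
# The two groups' data are fabricated to mirror the disparity
# described in the report; they are not real measurements.

def error_rate(decisions):
    """decisions: list of (system_said_match, actually_same_person) pairs."""
    errors = sum(1 for said, truth in decisions if said != truth)
    return errors / len(decisions)

# Group A: 4 wrong decisions out of 10 -> 40% error rate.
group_a = [(True, False)] * 4 + [(True, True)] * 6
# Group B: 3 wrong decisions out of 1000 -> 0.3% error rate.
group_b = [(True, True)] * 997 + [(True, False)] * 3

print(round(error_rate(group_a), 3))  # prints 0.4
print(round(error_rate(group_b), 3))  # prints 0.003
```

An identical decision rule can thus perform very differently across groups: the aggregate accuracy looks high while one group absorbs most of the mistakes, which is why audits report error rates per demographic group rather than a single overall figure.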

For this reason, it is necessary to generate other databases that are more inclusive and technologies that are capable of recognizing all types of faces and all skin types with equal effectiveness. (Source: Nicolás Retamar / Scientific News Agency of the National University of Quilmes)

