Artificial intelligence reaches the justice system but raises ethical doubts


A woman appears at the reception desk of a court in Buenos Aires, Argentina. It is 10 a.m. on a cold Monday in August 2018. She arrives with her three children, who sit on the floor to play while their mother talks with the clerk. “I have come to file an injunction to collect the homelessness subsidy,” the woman says with some embarrassment. After a few questions, whose answers end up on a paper form, the clerk asks for her ID and walks into the office. The woman decides to sit down and wait. She is tired, and she knows it will be months before this little snowball she has just pushed turns into a chance to sleep indoors.

What she does not know is that, if her file is not resolved in the first and second instance, the Public Prosecutor’s Office will intervene as a last resort. And there, unlike in the previous instances, cases are resolved in minutes. How is that possible? Because the office works with a computer system that uses artificial intelligence (AI). Its name is Prometea.

When a justice official from that office takes up the file, they will only have to answer, by voice or in writing, the questions of a chat similar to WhatsApp. And, in exactly four minutes, they will have obtained the opinion, along with the relevant statistics for the case and links of interest to support the decision. Then the body’s jurists simply review the procedure, print and sign. They will have completed in half an hour a job that usually takes months.
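To picture the kind of interaction described here, the following is a minimal sketch, not Prometea’s actual implementation: a question-and-answer loop that fills a draft-opinion template. Every field name, question and rule below is hypothetical.

```python
# Illustrative sketch only: a chat-style question flow that fills a draft
# opinion template. Field names, questions and the toy rule are invented;
# they do not reflect Prometea's real (non-public) pipeline.

TEMPLATE = (
    "DRAFT OPINION\n"
    "Claimant: {claimant}\n"
    "Claim type: {claim_type}\n"
    "Instance: {instance}\n"
    "Suggested resolution: {resolution}\n"
)

QUESTIONS = [
    ("claimant", "Name of the claimant?"),
    ("claim_type", "Type of claim (e.g. housing subsidy)?"),
    ("instance", "Which instance is reviewing the file?"),
]

def suggest_resolution(answers: dict) -> str:
    # Toy rule standing in for the system's prediction step.
    if "housing" in answers["claim_type"].lower():
        return "grant the injunction, following prior rulings in similar cases"
    return "refer to a human reviewer"

def run_chat() -> str:
    answers = {}
    for field, question in QUESTIONS:
        answers[field] = input(question + " ")
    answers["resolution"] = suggest_resolution(answers)
    return TEMPLATE.format(**answers)

if __name__ == "__main__":
    print(run_chat())
```

The point of the sketch is only the division of labour the article describes: the official answers short questions, the system assembles the document, and a human still reviews and signs.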

A model that interests Spain

In October 2019, authorities from the modernization area of the Spanish Ministry of Justice visited the offices of the Public Prosecutor’s Office of the City of Buenos Aires. The objective was to get to know Prometea, the computer system used in the Argentine capital to resolve cases on diverse matters but of simple resolution: minor infractions, traffic accidents or social policies, among others.

Sofía Duarte Domínguez, general director of the body formerly called Modernization of Justice in Spain (in January 2020 it became Digital Transformation of the Administration of Justice), made statements in this regard to the Argentine press: “We have studied everything about Prometea, we know that it is a fabulous system and we want to see if we can take it to Spain. Even the then Secretary of State for Justice, Manuel Dolz, gave us carte blanche to move forward with this, which is, without a doubt, the future of justice.”

The issue should not take us by surprise. A few days before the Spanish delegation’s visit to the Buenos Aires body, David Martínez, professor of Law and Political Science at the Open University of Catalonia (UOC), explained in an article published by La Vanguardia that AI could well be used in Spain in cases “of easy legal response”, which would ease the congestion of judicial files. Although Duarte Domínguez stresses that the digitization of the entire Ministry is fertile ground for automating justice, she herself warns that one of the main obstacles to the process lies in the resistance of judicial workers, who believe that information technology will take away their jobs.

In favor of automating justice

Martínez’s observations are in line with the thinking of some Argentine experts committed to the task of making justice intelligent. This is the case of Mario Adaro, judge of the Supreme Court of Justice of the province of Mendoza, who applies Prometea daily and recently participated in the first Ibero-American Summit on Artificial Intelligence at MIT (Boston). “AI has a capacity to process information in large volumes that shortens bureaucratic deadlines to no small extent because, usually, the greater the number of cases and the fewer the decision-makers, the more time each case takes,” he tells EL PAÍS RETINA. “Using automatic processes, the judge has greater capacity for analysis.”

The deputy attorney general of Buenos Aires, Juan G. Corvalán, created Prometea after detecting that, in half of the cases in which judicial personnel intervene, most of the time is spent verifying personal data, information that is repeated, and so on. Adaro illustrates this with the example of tax cases, pointing out that “they are serial rulings, of a large volume, where the decisions can be grouped into clear sets and everything is quite mechanical and predictable. By using AI for this type of problem, Prometea makes the number of errors in data entry, typing and redundancy drop significantly,” the judge from Mendoza assures.

“AI can process information in large volumes, which shortens bureaucratic deadlines”

Origin: United States

There are three emblematic cases of the application of AI in justice, in addition to Prometea. The most famous is the Compas program (Correctional Offender Management Profiling for Alternative Sanctions), which is used in several US states. It is software that has been used since 1998 to analyze, based on a defendant’s criminal record, their chances of reoffending. The program poses a questionnaire to the accused. Once they have answered all the questions, the system calculates the risk of recidivism, and the judge then decides, for example, whether or not to grant parole while the judicial process is completed.
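Compas’s actual model is proprietary and not public, so the following is only a toy sketch of the general idea of turning questionnaire answers into a risk category; the features, weights and thresholds are invented for illustration.

```python
# Toy sketch of questionnaire-based risk scoring. All features, weights and
# cut-offs are hypothetical; they are not Compas's real (undisclosed) model.

FEATURE_WEIGHTS = {
    "prior_convictions": 0.5,
    "age_at_first_arrest_under_25": 0.3,
    "unstable_employment": 0.2,
}

def risk_category(answers: dict) -> str:
    # Weighted sum of answers, then bucketed into a coarse category.
    score = sum(FEATURE_WEIGHTS[k] * float(v) for k, v in answers.items())
    if score >= 1.5:
        return "high"
    if score >= 0.7:
        return "medium"
    return "low"

print(risk_category({
    "prior_convictions": 3,            # three prior convictions
    "age_at_first_arrest_under_25": 1, # yes
    "unstable_employment": 1,          # yes
}))  # -> "high"
```

Even a sketch this small shows why the choice of features and thresholds matters: the category the judge sees depends entirely on decisions made when the model was built.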

Compas rose to fame with the Loomis case in 2013. Accused of fleeing from the police and using a vehicle without the owner’s authorization, Eric Loomis received six years in prison and five years of probation because Compas estimated a high risk of recidivism. Loomis appealed, arguing that his defense could not refute Compas’s methods because the algorithm was not public. The Wisconsin State Supreme Court dismissed the appeal. Years later, in 2018, it was learned that the system analyzes 137 aspects of each defendant. But when the accuracy of Compas’s predictions was compared with that of flesh-and-blood jurists, it was found that the AI’s accuracy is no higher, and serious errors remain evident.

“Statistical averages say something about the patterns of common behavior in a group. They do not describe individual profiles and are incapable of capturing the singularity of the human being,” explains Lorena Jaume-Palasí, an expert in ethics and technology and founder of AlgorithmWatch and The Ethical Tech Society. “With this we can understand groups from a slightly more architectural perspective, but we also run the risk of forcing individuals into standards that do not fit them.”

To clarify whether it is feasible to prosecute someone criminally using AI, it is necessary to understand what criteria the algorithm uses (which is what Loomis’s defense claimed). Jaume-Palasí argues that, after all, law is an algorithm that was applied long before computer science existed. “[With the Loomis case] everyone set their eyes on the computer system and was scandalized by the racism, but Compas allowed us to find out about the biases that the judges have, because the system was created by humans who had been working and deciding with those biases, which the program later revealed.”

“Statistical averages help to understand groups, but they are incapable of capturing the singularity of the human being”

Is Prometea like Compas?

In addition to his position in the justice system, Juan G. Corvalán is director of the Innovation and Artificial Intelligence Laboratory of the University of Buenos Aires Law School. In 2017, he created the Prometea software together with his collaborators.

Corvalán highlights, among the system’s qualities, that “Prometea does not use black-box AI techniques, or what is known as deep learning; that is, the entire algorithmic process is open, auditable and traceable.” Compas, on the other hand, applies two neural networks whose operation is unknown because “it was developed by a private company that holds the intellectual property rights to the algorithm.”

The Argentine software, Corvalán maintains, does nothing but reproduce the practice of the country’s judiciary. “Prometea’s predictions are based on the analysis of the history of what the judges have decided; they are the ones who train the system. For example, in the Constitutional Court of Colombia [a country where the program is also applied], it is the magistrates themselves who carry out the permanent adjustment of Prometea’s predictions, with our technical assistance, of course.”
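A minimal, fully traceable sketch of what “predicting from the history of decisions” can mean, under assumptions of our own: match a new case to the most similar past ruling and propose the same outcome, keeping the matched precedent visible so a jurist can audit it. The example cases are invented and the similarity measure is deliberately crude; the article only states that the real system is open, auditable and does not use deep learning.

```python
# Hypothetical precedent-matching sketch: suggest an outcome by finding the
# past ruling with the greatest word overlap. Past rulings are invented.

PAST_RULINGS = [
    ("housing subsidy injunction for homeless family", "grant"),
    ("traffic fine appeal filed after deadline", "dismiss"),
    ("minor infraction with no prior record", "grant reduced penalty"),
]

def word_overlap(a: str, b: str) -> int:
    # Count shared words; a stand-in for any transparent similarity measure.
    return len(set(a.lower().split()) & set(b.lower().split()))

def suggest(new_case: str) -> tuple[str, str]:
    # Return the closest precedent and its outcome, so the match can be audited.
    precedent, outcome = max(PAST_RULINGS, key=lambda r: word_overlap(new_case, r[0]))
    return precedent, outcome

print(suggest("injunction to collect the subsidy for homelessness"))
# -> ('housing subsidy injunction for homeless family', 'grant')
```

Because the suggestion always points back to a concrete precedent, the humans who trained the system, the judges, remain responsible for adjusting it when its matches go wrong.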

Databases and biases

There is no worthwhile AI without data. And when talking about data, the ghost of bias appears, such as the racism of which Compas has been accused. Numbers build a discourse of objectivity that sometimes prevents decisions from being questioned. “Algorithms are nothing more than opinions embedded in mathematics,” wrote Cathy O’Neil in her famous Weapons of Math Destruction.

“What algorithms undoubtedly allow is to standardize decisions; in other words, to standardize criteria so that two different responses are not given to the same problem,” says Pablo Mlynkiewicz, a graduate in statistics and former head of the General Directorate of Information Sciences of Buenos Aires. “But, of course, for that to translate into real progress in justice, the database must have representation from all groups. If not, there will be mistakes.”
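One way to make Mlynkiewicz’s warning concrete is a simple representation check on the case database before it is used to train or tune a decision-support system. The sketch below uses an invented, hypothetical dataset and an arbitrary 30% threshold purely for illustration.

```python
# Hedged sketch: flag under-represented groups in a (hypothetical) case
# database. The data and the 30% threshold are invented for illustration.
from collections import Counter

cases = [
    {"group": "urban", "outcome": "grant"},
    {"group": "urban", "outcome": "dismiss"},
    {"group": "urban", "outcome": "grant"},
    {"group": "rural", "outcome": "grant"},
]

counts = Counter(c["group"] for c in cases)
total = sum(counts.values())
for group, n in counts.items():
    share = n / total
    flag = "  <- under-represented; predictions here will be less reliable" if share < 0.3 else ""
    print(f"{group}: {n} cases ({share:.0%}){flag}")
```

The check does not fix bias, but it makes visible which groups the system has barely seen, which is exactly where Mlynkiewicz expects the mistakes to appear.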

Mlynkiewicz thus agrees with Jaume-Palasí and Adaro in highlighting a strong point in favor of the automation of judicial processes: it avoids giving two different answers to the same problem. That is, it provides argumentative consistency in the rulings. Even though she is the most critical of these systems, the Majorcan philosopher admits that automating judicial processes based on statistics can help correct errors that justice today refuses to accept. “We have known for a long time that the judges and the judicial system we know are not very consistent. Being able to trace and compile statistics on judicial decisions thanks to AI is not bad at all,” she emphasizes.

Robot Judges in China

In October 2019, the Internet Court, defined as an “online litigation center”, was launched in Beijing. According to official information, it is a platform on which the parties upload the data of the dispute to be resolved and the AI does the rest: it searches for jurisprudence, analyzes the issue, weighs the evidence and issues a ruling.

The system does not differ greatly in technical terms from that of Estonia, where there is also a strong commitment to the automation of justice: there is no human intervention at any point in the process. But between the two countries there is a great distance in democratic standards. In the small Baltic country, considered the most digitally advanced on the planet, the project is led by the young Ott Velsberg, who intends that the claims presented before the digital court not exceed 7,000 euros in the amount claimed for damages.

Everything flows there, because it is a society with high civic standards. But when it comes to the Asian giant, things take on another tenor. “The development of virtual or cybernetic judges in China has followed the same line as the Social Credit System: from the bottom up,” explains Dante Avaro, a specialist in the Chinese government’s control model, referring to the controversial citizen-scoring mechanism launched by Beijing to determine whether or not citizens are trustworthy. “Both started at the beginning of the new millennium. In the case of AI in justice, it was tried out in places like Shandong, then in Hangzhou, Beijing and Guangzhou. The objective was to bring efficiency to judicial processes in matters of electronic commerce, virtual payments, cloud transactions and intellectual property disputes,” he illustrates.

The catch is that, in the hands of a non-democratic state that intends to order society by working transversally on a scoring system that Avaro calls “citizen traceability”, the application of AI in justice is dangerous, because it is linked to the Social Credit System and the Yitu Dragonfly Eye facial recognition system. “A huge state surveillance apparatus is being built,” Avaro concludes.
