The dark side of artificial intelligence

By Time News

2023-05-01 17:45:04

The Agbogbloshie electronic waste dump, near Accra (Ghana), in 2010.

By showing off their abilities with invented images and virtual conversations, contemporary artificial intelligence (AI) technologies are now also drawing attention on the environmental front. These computing thoroughbreds are also energy guzzlers, in particular because of their tendency to grow enormously in size. “There have already been two breaks in the growth curve: in 2012, with the arrival of deep learning, and then in 2017, with the advent of large language models whose number of parameters exceeds one hundred billion,” summarizes Anne-Laure Ligozat, professor at the National School of Computing for Industry and Business (Ensiie), in Evry-Courcouronnes (Essonne).

But it was not until 2019 that a study quantified the effects of this growth for the first time. Emma Strubell, of the University of Massachusetts Amherst, estimated that training BERT, then the most common natural language processing model, emitted 256 kilograms of CO2 equivalent, the equivalent of a Paris-Hong Kong trip by plane.

Things have not improved since. In a November 2022 preprint, Anne-Laure Ligozat estimated that training the international model Bloom on the supercomputers of Genci, France's national high-performance computing infrastructure, emitted 24.4 tonnes of CO2 over 118 days of computation, that is, about a hundred times more…
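As a rough sanity check on that ratio, here is a minimal sketch using only the figures quoted above; the numbers come from the article, the rest is simple arithmetic.

```python
# Rough sanity check on the "hundred times more" comparison,
# using only the figures quoted in the article.

bert_kg = 256            # BERT training, kg CO2 equivalent (Strubell, 2019)
bloom_kg = 24.4 * 1000   # Bloom training, 24.4 tonnes -> kg CO2
days = 118               # days of computation for Bloom

ratio = bloom_kg / bert_kg
per_day_kg = bloom_kg / days

print(f"Bloom vs BERT: ~{ratio:.0f}x more CO2")        # ~95x, i.e. "about a hundred times"
print(f"Average: ~{per_day_kg:.0f} kg CO2 per day of computation")
```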


But these figures are underestimates. Or, more precisely, they leave out many large “details”. Data centers, for example, consume power even when no computation is running, and almost as much as when they are fully loaded. In addition, all this hardware had to be manufactured, which also has an environmental cost. In total, the researcher estimates, the true figure would exceed 50 tonnes, double the first estimate, or the equivalent of flying almost fifty times around the Earth.
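To see how the 50-tonne figure maps onto the flight comparison, here is a minimal back-of-the-envelope sketch. The per-kilometre emission factor is not stated in the article; it is inferred here from the Paris-Hong Kong equivalence quoted earlier, and the roughly 9,600 km flight distance is an assumption.

```python
# Back-of-the-envelope check of the "almost fifty times around the Earth" figure.
# The per-km factor is inferred from the article's own comparison
# (256 kg CO2e ~= one Paris-Hong Kong flight); the distance is assumed.

flight_kg = 256                    # kg CO2e, equated to one Paris-Hong Kong trip
flight_km = 9_600                  # assumed Paris-Hong Kong distance, km
earth_circumference_km = 40_075

kg_per_km = flight_kg / flight_km            # ~0.027 kg CO2e per km (implied)
total_kg = 50 * 1000                         # revised estimate: 50 tonnes
laps = (total_kg / kg_per_km) / earth_circumference_km

print(f"~{laps:.0f} trips around the Earth")  # ~47, i.e. "almost fifty"
```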

Gaps in knowledge

And even this is only an estimate since, as the computer science professor explains, several pieces of data are missing or imprecise. Certain equipment, such as air conditioning units or the inverters that smooth out faults in the power supply, is absent from the databases used to estimate the carbon footprint of manufacturing, because these are nearly one-of-a-kind parts. For other equipment, the manufacturers simply do not provide the data.

Another caveat: these calculations cover only part of the picture. These language models must then be used for something, for example to power conversational agents like ChatGPT, which in turn generate a great many requests and computations. This phase, known as inference, can weigh as much as training. For Bloom, mentioned above, over eighteen days at an average of 558 requests per day, consumption was equivalent to 19 kilograms of CO2 per day.
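Expressed per request, which the article does not give directly, those numbers work out as follows; this is a simple derivation from the daily averages quoted above, not a figure from the study itself.

```python
# Derived figures for Bloom's inference phase, using only the article's numbers
# (19 kg CO2 per day, 558 requests per day on average, over 18 days).

kg_per_day = 19
requests_per_day = 558
days = 18

g_per_request = kg_per_day * 1000 / requests_per_day   # ~34 g CO2 per request
total_kg = kg_per_day * days                            # 342 kg over the 18-day window

print(f"~{g_per_request:.0f} g CO2 per request, {total_kg} kg CO2 in total")
```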
