Microsoft and Google consume more energy than Nigeria

by time news

On the one hand, increasingly precarious plans to reach carbon-neutrality goals within a few years. On the other, the boom in artificial intelligence, which has plunged the technology sector into a new, energy-hungry logic. This is where recent studies on the consumption of the tech giants come in, precisely in light of the AI explosion. They found that in 2023, for example, Google and Microsoft combined consumed more energy than Nigeria (population 224 million) or Ireland, and that each individually consumed more than nations such as Croatia, Jordan or Puerto Rico.

The Hunger of AI

But let’s take a step back. We were talking about artificial intelligence, which is certainly the main “culprit” behind this ravenous demand for energy. The large data centers that run behind AI, after all, require massive amounts of energy for their computations. So much so that many carbon-neutrality projects have been shelved while waiting for better times: according to a study published by Standard & Poor’s, the decommissioning of coal-fired electricity generation in 2023 was 40% lower than expected.

According to a recent estimate from the Vrije Universiteit Amsterdam, the artificial intelligence industry as a whole could consume between 85 and 134 terawatt-hours (TWh) per year by 2027. And although the various GenAI models have already been slimmed down considerably in terms of consumption, the estimates do not ease doubts about the long-term sustainability of this technology.
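To put those annual figures in perspective, a quick back-of-the-envelope conversion (a sketch, not part of the original study) turns terawatt-hours per year into average continuous power draw:

```python
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def twh_per_year_to_avg_gw(twh: float) -> float:
    """Convert annual energy use in TWh to average continuous power in GW."""
    return twh * 1_000 / HOURS_PER_YEAR  # 1 TWh = 1,000 GWh

low = twh_per_year_to_avg_gw(85)
high = twh_per_year_to_avg_gw(134)
print(f"85-134 TWh/yr is roughly {low:.1f}-{high:.1f} GW of continuous draw")
```

In other words, the projected range is equivalent to roughly 10 to 15 gigawatts running around the clock.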

A recent study published on Medium found, for example, that training OpenAI’s GPT-4 used up to 62,000 megawatt-hours (MWh) of electricity, enough to power 1,000 U.S. households for five to six years.
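That household comparison can be sanity-checked with simple arithmetic. The average annual consumption of a U.S. household used here (about 10,800 kWh) is an assumed figure, not one given in the article:

```python
TRAINING_MWH = 62_000            # reported upper estimate for GPT-4 training
HOUSEHOLD_KWH_PER_YEAR = 10_800  # assumed average U.S. household consumption
HOUSEHOLDS = 1_000

training_kwh = TRAINING_MWH * 1_000  # 1 MWh = 1,000 kWh
years = training_kwh / (HOUSEHOLDS * HOUSEHOLD_KWH_PER_YEAR)
print(f"Enough to power {HOUSEHOLDS:,} households for about {years:.1f} years")
```

The result lands between five and six years, consistent with the study’s claim.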

New chips needed

The point is that current chips are decidedly energy-hungry. Nvidia’s H100 microprocessor, the most sought-after chip in the entire AI world (it powers ChatGPT and other GenAI systems), consumes about 700 watts. A small data center contains at least 400 of these chips, while a large one can hold as many as 8,000. That is an enormous amount of energy, and it is turning the need for less power-hungry chips into an emergency. The risk, trumpeted by many, is that AI development could soon suffer setbacks because it is not sustainable.
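The scale of those numbers is easy to work out. The sketch below assumes every chip runs continuously at its full 700-watt rating and ignores cooling and networking overhead, so real facility draw would be higher:

```python
H100_WATTS = 700  # approximate draw of one Nvidia H100, per the article

def cluster_power_kw(chips: int, watts_per_chip: float = H100_WATTS) -> float:
    """Aggregate draw in kW, assuming all chips at full load (no cooling overhead)."""
    return chips * watts_per_chip / 1_000

small = cluster_power_kw(400)    # the article's "small" data center
large = cluster_power_kw(8_000)  # the article's "large" data center
print(f"small: {small:.0f} kW, large: {large / 1_000:.1f} MW")
```

The chips alone come to 280 kW for a small facility and 5.6 MW for a large one, before accounting for any supporting infrastructure.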
