AI Reasoning Models vs. LLMs: CO₂ Emissions Soar

by Grace Chen

2025-06-19 12:10:00

AI’s Carbon Cost: More Accuracy, Bigger Footprint

AI models are becoming more precise, but they are also consuming a surprising amount of energy, driving a significant increase in carbon emissions.

  • More accurate AI models can produce up to 50 times more carbon dioxide emissions than less precise ones.
  • Reasoning models, which aim for higher accuracy, significantly increase energy consumption.
  • The environmental impact varies based on the specific model and the complexity of the task.

Are you ready for a surprising truth? Some advanced artificial intelligence models are creating a bigger environmental impact than you might think. A new study reveals that the drive for more accurate AI responses comes at a cost: a dramatically larger carbon footprint.

Reasoning models, such as Anthropic’s Claude, OpenAI’s o3, and DeepSeek’s R1, are designed to provide more precise answers. These models use significant computing power and time to achieve greater accuracy than their predecessors. However, the quest for precision has an environmental price.

The study, published June 19 in the journal Frontiers in Communication, highlights the environmental impacts of these models. Researchers found that the environmental impact of using trained language models is heavily influenced by how they approach reasoning.

Chain-of-Thought (CoT): This technique allows AI models to break down complex problems into smaller, more manageable steps, mimicking human reasoning. While it boosts accuracy, it also significantly increases energy consumption.

“The environmental impact of questioning trained LLMs is strongly influenced by their reasoning approach, with explicit reasoning processes significantly driving up energy consumption and carbon emissions,” said Maximilian Dauner, a researcher at Hochschule München University of Applied Sciences in Germany.

These models use “chain-of-thought,” a technique that breaks down complicated problems into smaller, logical steps. This process helps mimic human reasoning and leads to more accurate results, but it also demands more energy.

These models have significantly higher energy demands than conventional ones, which could create economic hurdles for companies. While research into the environmental effects of AI adoption grows, comparisons between the carbon footprints of various models are still relatively rare.

The Cost of Reasoning

To study the CO₂ emissions, scientists asked 14 different language models 1,000 questions across various topics. The models had between 7 and 72 billion parameters. The team then converted energy usage into CO₂ by assuming each kilowatt-hour of energy produces 480 grams of CO₂.

CO₂ Conversion: The study assumed that each kilowatt-hour (kWh) of energy used by the AI models produces 480 grams of CO₂. This conversion helps quantify the environmental impact of AI’s energy consumption.
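The conversion above amounts to a single multiplication; a minimal sketch follows, in which the 2.5 kWh figure is purely illustrative and not taken from the study:

```python
# Emission factor assumed in the study: 480 g of CO2 per kilowatt-hour.
GRAMS_CO2_PER_KWH = 480

def co2_grams(energy_kwh: float) -> float:
    """Convert energy consumption in kWh to grams of CO2-equivalent."""
    return energy_kwh * GRAMS_CO2_PER_KWH

# Hypothetical example: a workload that consumed 2.5 kWh of electricity
print(co2_grams(2.5))  # 1200.0 grams of CO2
```

The factor itself depends on the local energy mix, so the same workload emits far less on a grid dominated by renewables.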

The results indicated that reasoning models generated 543.5 tokens per question, compared to only 37.7 tokens for less complex models. These extra tokens, requiring more computation, meant higher CO₂ emissions for the more precise reasoning models. The most accurate model, the 72-billion parameter Cogito model, answered 84.9% of the questions correctly but released three times the CO₂ emissions of models designed for concise answers.

“Currently, we see a clear accuracy-sustainability trade-off inherent in LLM technologies,” said Dauner. “None of the models that kept emissions below 500 grams of CO₂ equivalent [total greenhouse gases released] achieved higher than 80% accuracy on answering the 1,000 questions correctly.”

Emissions spiked even higher for questions needing more reasoning time, like algebra or philosophy, sometimes six times more than simple look-up queries. Calculations also showed that the emissions depended on the specific models used. To answer 60,000 questions, DeepSeek’s 70-billion parameter R1 model would produce the same amount of CO₂ as a round-trip flight from New York to London.

DeepSeek’s R1: Answering 60,000 questions with this 70-billion parameter model produces the same amount of CO₂ as a round-trip flight from New York to London, highlighting the significant environmental cost of complex AI tasks.

The study’s conclusions are not definitive, and emissions can change based on hardware and energy sources. However, the researchers emphasize that these findings should encourage AI users to consider the environmental impact before using this technology.

“If users know the exact CO₂ cost of their AI-generated outputs, such as casually turning themselves into an action figure, they might be more selective and thoughtful about when and how they use these technologies,” Dauner said.

Beyond the Numbers: Shifting the Focus to Sustainability

As the environmental cost of AI becomes clearer, the conversation expands beyond simply quantifying *carbon emissions*. The next step concerns the *sustainability* of these advanced technologies. Given that AI’s impact depends on the interplay of model design, hardware, and energy sources, the focus is on how to make these models more efficient and environmentally responsible.

One key aspect is the cleanliness of the energy grid. AI’s carbon footprint varies greatly depending on the source of the energy fueling the computing infrastructure [2]. Using renewable energy sources considerably reduces this impact. Encouraging or mandating data centers to use renewable energy is a crucial step towards a more sustainable AI ecosystem.

Researchers are actively exploring the development of more *energy-efficient* AI models. The aim is to create models that require less computing power and consume less energy without sacrificing accuracy. This can involve optimizing model architectures, using specialized hardware, and developing more efficient training methods.

The carbon footprint of AI is influenced by the efficiency of the model and the energy source used to power it. The location of data centers also shapes the environmental impact of the technology.

The choice of hardware also plays an important role. Some hardware is far more energy-intensive than alternatives, so organizations investing in AI should consider specialized processors designed for low energy consumption while still offering high performance.

The Power of Optimization: Researchers are working on methods to reduce the number of parameters in AI models or use techniques like quantization. This shrinks the environmental impact of the process without substantially harming results.
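As a rough illustration of the quantization idea (a toy sketch, not any production scheme): storing each weight as an 8-bit integer plus a single shared scale factor, instead of a 32-bit float, cuts memory roughly fourfold while keeping the recovered values close to the originals.

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats onto integers in [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate float weights from the int8 values and the scale."""
    return [q * scale for q in quantized]

weights = [0.5, -1.27, 0.02]
quantized, scale = quantize_int8(weights)
restored = dequantize(quantized, scale)
# Each quantized weight needs 1 byte instead of 4, shrinking memory ~4x,
# and the restored values stay close to the original weights.
```

Smaller weights mean less memory traffic and less computation per query, which is where the energy savings come from.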

Furthermore, the location of a data center matters, too [2]. Data centers in regions with cleaner energy grids will have a lower carbon footprint than those in areas heavily reliant on fossil fuels. This highlights the importance of geographical considerations when deploying and using AI technologies.

Beyond technical solutions, user habits matter as well. As Dauner suggests, users who integrate AI models into their work need to become more mindful of their usage patterns, making choices that reduce the environmental impact.

Practical Steps to Reduce AI’s Carbon Footprint

Individuals and organizations can implement various strategies to reduce the environmental impact of AI:

  • Prioritize Energy-Efficient Hardware: Choose hardware optimized for low energy consumption, such as specialized AI processors.
  • Select Green Data Centers: Prioritize data centers that use clean, renewable energy sources.
  • Optimize Model Usage: Evaluate model needs and select the most efficient solution for the task.
  • Assess Model Complexity: Choose simpler models when possible to minimize computational demands.
  • Embrace Responsible AI Practices: Encourage ethical and sustainable AI development.
