Science

Scientists Make AI Tool That Cuts Emissions 45% and Extends Server Life by 20 Months

Researchers at the University of California, Riverside (UCR) have developed an artificial-intelligence-based routing tool that in tests reduced emissions in data-centre operations by as much as 45% while extending server lifespan by approximately 1.6 years.

The new system optimises the way compute-heavy tasks flow through data-centre infrastructure. Rather than simply distributing workloads evenly, the AI monitors hardware conditions, energy-source profiles and cooling system performance in real time. It then routes computing tasks through the most efficient path, minimising power draw and heat build-up while maintaining high performance. This dual benefit of lower emissions and longer-lasting servers addresses two major cost and sustainability pain points for large-scale computing.
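The routing logic described above can be pictured as a cost-based scheduler that scores every candidate server on carbon intensity, thermal state and current load, then sends the task to the cheapest path. The sketch below is a minimal illustration of that idea, not the UCR system itself; the server attributes, weights and scoring function are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Server:
    name: str
    grid_carbon: float   # gCO2/kWh of the server's current power source (assumed metric)
    temperature: float   # degrees C, a proxy for cooling load and wear
    utilisation: float   # 0.0-1.0 current load

def routing_cost(s: Server, w_carbon=1.0, w_heat=2.0, w_load=50.0) -> float:
    """Hypothetical score: lower is better. Penalises dirty power,
    hot hardware (which ages faster) and already-busy machines."""
    return (w_carbon * s.grid_carbon
            + w_heat * max(0.0, s.temperature - 25.0)
            + w_load * s.utilisation)

def route_task(servers: list[Server]) -> Server:
    """Send the next task to the server with the lowest combined cost."""
    return min(servers, key=routing_cost)

fleet = [
    Server("a", grid_carbon=400, temperature=32, utilisation=0.7),
    Server("b", grid_carbon=120, temperature=27, utilisation=0.4),
    Server("c", grid_carbon=120, temperature=45, utilisation=0.2),
]
print(route_task(fleet).name)  # picks "b": clean power, cool, moderately loaded
```

In this toy fleet, server "b" wins even though "c" is less busy, because "c" runs hot enough that the thermal penalty outweighs its idle capacity; a learned system would tune such trade-offs from data rather than fixed weights.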

“Our results show that sustainability in AI cannot be achieved by focusing on clean energy alone,” said Mihrimah Ozkan. “AI systems age, they heat up, and their efficiency changes over time—and these shifts have a measurable carbon cost.”

“By integrating real-time hardware health with carbon-intensity data, our framework learns how to route AI workloads in a way that cuts emissions while protecting the long-term reliability of the machines themselves,” Ozkan said.

Data centres today represent a rapidly growing slice of global energy consumption. According to the International Energy Agency, emissions from computing infrastructure could soar unless offset by efficiency gains. Meanwhile, other research forecasts that AI solutions could cut up to 5.4 billion tonnes of CO₂-equivalent per year by 2035 if deployed at scale.

The UCR tool exploits periods of lower-carbon electricity, light-duty workloads and cooler ambient conditions to delay wear on critical components. In doing so, it effectively turns idle capacity into an asset by reducing the stress on active servers. Extending server lifespan by roughly 1.6 years translates into substantial savings on hardware replacement and less demand for manufacturing new equipment.
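One way to see how idle capacity becomes an asset is to defer a flexible batch job into the window when grid carbon intensity and ambient temperature are both low. This is a toy sketch of that scheduling idea; the penalty function, thresholds and forecast numbers are invented for illustration and are not from the study.

```python
def pick_run_hour(carbon_forecast, temp_forecast, deadline_hour):
    """Choose the hour (up to the deadline) that minimises a combined
    carbon + thermal-stress penalty for a deferrable batch job.
    Forecasts are hourly lists; the weighting is an assumption."""
    def penalty(h):
        # Illustrative penalty: grid gCO2/kWh plus a charge for heat above 25 C
        return carbon_forecast[h] + 10.0 * max(0.0, temp_forecast[h] - 25.0)
    return min(range(deadline_hour + 1), key=penalty)

# Hypothetical 6-hour forecasts: grid carbon (gCO2/kWh) and ambient temperature (C)
carbon = [420, 380, 150, 140, 300, 410]
temp   = [30,  29,  26,  24,  28,  31]
print(pick_run_hour(carbon, temp, deadline_hour=5))  # hour 3: cleanest power and coolest air
```

A job shifted this way runs on cleaner electricity and puts less thermal stress on the hardware, which is the mechanism behind the claimed dual benefit.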

“We reduce operational emissions in real time, but we also slow down hardware degradation,” said study co-author Cengiz Ozkan. “By preventing unnecessary wear, we reduce not only the energy used today but also the environmental footprint of tomorrow’s hardware production.”

For data-centre operators, the takeaways are clear: dynamic AI-driven routing could yield double-digit percentage improvements in energy efficiency, meaning lower bills, fewer cooling demands and reduced emissions. At the same time, longer hardware life turns a large recurring cost into deferred replacement cycles.

This approach shifts some of the burden from requiring entirely new renewable power supplies to making existing infrastructure smarter.

The next phase involves moving from controlled tests to real-world deployments at scale. UCR researchers say the system is ready for pilot use in commercial data centres this year. Key questions to monitor include: how well the gains translate outside lab conditions, how adaptable the AI is across different hardware stacks and how it responds when energy grids move from fossil-heavy to renewable-heavy sources.

The study is published in the journal MRS Energy and Sustainability.