The computer engine rooms that power the digital economy have become surprisingly energy efficient.
A new study of data centers globally found that while their computing output jumped sixfold from 2010 to 2018, their energy consumption rose only 6 percent. The scientists’ findings suggest that concerns about the rise of mammoth data centers generating a surge in electricity demand and pollution have been greatly overstated.
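A quick back-of-the-envelope calculation (illustrative only, using the two headline figures above) shows just how large the implied efficiency gain is:

```python
# Illustrative arithmetic based on the article's headline numbers:
# computing output grew sixfold from 2010 to 2018, while energy
# consumption rose only 6 percent over the same period.
compute_growth = 6.0    # computing output, 2018 relative to 2010
energy_growth = 1.06    # energy consumption, 2018 relative to 2010

# Energy needed per unit of computing, 2018 relative to 2010.
energy_per_compute = energy_growth / compute_growth

print(f"Energy per unit of computing in 2018 was about "
      f"{energy_per_compute:.0%} of its 2010 level, "
      f"a roughly {1 - energy_per_compute:.0%} decline.")
```

In other words, if those two figures hold, each unit of computing in 2018 required less than a fifth of the electricity it did in 2010.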
The major force behind the improving efficiency is the shift to cloud computing. In the cloud model, businesses and individuals consume computing over the internet as services, from raw calculation and data storage to search and social networks.
The largest cloud data centers, sometimes the size of football fields, are owned and operated by big tech companies like Google, Microsoft, Amazon and Facebook.
Each of these sprawling digital factories, housing hundreds of thousands of computers, rack upon rack, is an energy-hungry behemoth. Some have been built near the Arctic for natural cooling and others beside huge hydroelectric plants in the Pacific Northwest.
Still, they are the standard setters in terms of the amount of electricity needed for a computing task. “The public thinks these massive data centers are energy bad guys,” said Eric Masanet, the lead author of the study. “But those data centers are the most efficient in the world.”
The findings were published on Thursday in the journal Science. The study was a collaboration of five scientists at Northwestern University, the Lawrence Berkeley National Laboratory and an independent research firm. It was funded by the Department of Energy and by a grant from a Northwestern alumnus who is an environmental philanthropist.
The new research is a stark contrast to often-cited predictions that energy consumption in the world’s data centers is on a runaway path, perhaps set to triple or more over the next decade. Those worrying projections, the study authors say, are simplistic extrapolations and what-if scenarios that focus mainly on the rising demand for data center computing.
By contrast, the new research is a bottom-up analysis that compiles information on data center processors, storage, software, networking and cooling from a range of sources to estimate actual electricity use. Enormous efficiency improvements, they conclude, have allowed computing output to increase sharply while power consumption has been essentially flat.
“We’re hopeful that this research will reset people’s intuitions about data centers and energy use,” said Jonathan Koomey, a former scientist at the Berkeley lab who is now an independent researcher.
Over the years, data center electricity consumption has been a story of economic incentives and technology advances combining to tackle a problem.
From 2000 to 2005, energy use in computer centers doubled. In 2007, the Environmental Protection Agency forecast another doubling of power consumed by data centers from 2005 to 2010.
In 2011, at the request of The New York Times, Mr. Koomey made an assessment of how much data center electricity consumption actually did increase between 2005 and 2010. He estimated the global increase at 56 percent, far less than previously expected. The recession after the 2008 financial crisis played a role, but so did gains in efficiency. The new study, with added data, lowered that 2005 to 2010 estimate further.
But the big improvements have come in recent years. Since 2010, the study authors write in Science, “the data center landscape has changed dramatically.”
The tectonic shift has been to the cloud. In 2010, the researchers estimated that 79 percent of data center computing was done in smaller traditional computer centers, largely owned and run by non-tech companies. By 2018, 89 percent of data center computing took place in larger, utility-style cloud data centers.
The big cloud data centers use tailored chips, high-density storage, so-called virtual-machine software, ultrafast networking and customized airflow systems — all to increase computing firepower with the least electricity.
“The big tech companies eke out every bit of efficiency for every dollar they spend,” said Mr. Masanet, who left Northwestern last month to join the faculty of the University of California, Santa Barbara.
Google is at the forefront. Its data centers on average generate seven times more computing power than they did just five years ago, using no more electricity, according to Urs Hölzle, a senior vice president who oversees Google’s data center technology.
In 2018, data centers consumed about 1 percent of the world’s electricity output. That is the energy-consumption equivalent of 17 million American households, a sizable amount of energy use — but barely growing.
The trend of efficiency gains largely offsetting rising demand should hold for three or four years, the researchers conclude. But beyond a few years, they say, the outlook is uncertain.
In the Science article, they recommend steps including more investment in energy-saving research and improved measurement and information sharing by data center operators worldwide.
The next few years, they write, will be “a critical transition phase to ensure a low-carbon and energy-efficient future.”