The computer engine rooms that power the digital economy have become surprisingly energy efficient. A new global study of data centers found that while their computing output jumped sixfold from 2010 to 2018, their energy consumption rose only 6 percent. The scientists’ findings suggest that concerns about mammoth data centers generating a surge in electricity demand and pollution have been greatly overstated.
The major force behind the improving efficiency is the shift to cloud computing. In the cloud model, businesses and individuals consume computing over the internet as services, from raw calculation and data storage to search and social networks.
The largest cloud data centers, sometimes the size of football fields, are owned and operated by big tech companies like Google, Microsoft, Amazon and Facebook.
Each of these sprawling digital factories, housing hundreds of thousands of computers, rack upon rack, is an energy-hungry behemoth. Some have been built near the Arctic for natural cooling and others beside huge hydroelectric plants in the Pacific Northwest.
Still, they are the standard setters in terms of the amount of electricity needed for a computing task. “The public thinks these massive data centers are energy bad guys,” said Eric Masanet, the lead author of the study. “But those data centers are the most efficient in the world.”
The study findings were published on Thursday in an article in the journal Science. It was a collaboration of five scientists at Northwestern University, the Lawrence Berkeley National Laboratory and an independent research firm. The project was funded by the Department of Energy and by a grant from a Northwestern alumnus who is an environmental philanthropist.
The new research is a stark contrast to often-cited predictions that energy consumption in the world’s data centers is on a runaway path, perhaps set to triple or more over the next decade. Those worrying projections, the study authors say, are simplistic extrapolations and what-if scenarios that focus mainly on the rising demand for data center computing.
By contrast, the new research is a bottom-up analysis that compiles information on data center processors, storage, software, networking and cooling from a range of sources to estimate actual electricity use. Enormous efficiency improvements, they conclude, have allowed computing output to increase sharply while power consumption has been essentially flat.
“We’re hopeful that this research will reset people’s intuitions about data centers and energy use,” said Jonathan Koomey, a former scientist at the Berkeley lab who is an independent researcher.
Over the years, data center electricity consumption has been a story of economic incentives and technology advances combining to tackle a problem.
From 2000 to 2005, energy use in computer centers doubled. In 2007, the Environmental Protection Agency forecast another doubling of power consumed by data centers from 2005 to 2010.

In 2011, at the request of The New York Times, Mr. Koomey made an assessment of how much data center electricity consumption actually did increase between 2005 and 2010. He estimated the global increase at 56 percent, far less than previously expected. The recession after […]