Google’s use of machine learning to improve the energy efficiency of its data centers has a range of implications. In the near term, as demand for data processing grows, the advances can help data centers cut costs, consume less energy and lower emissions.
DeepMind researchers maintain the implications are positive: Google’s data centers are already heavily optimized and among the industry’s most efficient, which makes the double-digit improvement all the more impressive. “In any large scale energy-consuming environment, this would be a huge improvement. Given how sophisticated Google’s data centres are already, it’s a phenomenal step forward,” DeepMind’s Rich Evans and Google’s Jim Gao wrote in a recent blog post.
Evans and Gao said the project took historical data on temperatures, power, pump speeds and more, and used it “to train an ensemble of deep neural networks.”
“Since our objective was to improve data centre energy efficiency, we trained the neural networks on the average future PUE,” they wrote. “We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data centre over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints.”
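The pattern Evans and Gao describe can be sketched in outline: one ensemble predicts future PUE from sensor readings, separate ensembles forecast quantities like temperature, and a candidate action is only recommended if the forecast stays within operating limits. The sketch below is purely illustrative, not Google's system: the sensor features, constraint bound, synthetic data, and simple bootstrap-averaged least-squares models (standing in for the deep neural networks) are all invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented "historical" sensor data: columns are
# [outside_temp_C, pump_speed, fan_power] -- hypothetical features.
X = rng.uniform([10, 0.2, 0.1], [35, 1.0, 1.0], size=(500, 3))
# Invented ground-truth relationships for PUE and server temperature.
pue = 1.10 + 0.004 * X[:, 0] - 0.05 * X[:, 1] + 0.03 * X[:, 2]
server_temp = 18.0 + 0.5 * X[:, 0] - 6.0 * X[:, 1]

def fit_one(X, y):
    """Least-squares fit with a bias term (a stand-in for one network)."""
    A = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def train_ensemble(X, y, n_models=5):
    """Fit several models on bootstrap resamples of the historical data."""
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), len(X))
        models.append(fit_one(X[idx], y[idx]))
    return models

def predict(models, X):
    """Average the individual models' predictions, as an ensemble does."""
    A = np.hstack([X, np.ones((len(X), 1))])
    return np.mean([A @ w for w in models], axis=0)

pue_ensemble = train_ensemble(X, pue)
temp_ensemble = train_ensemble(X, server_temp)

# Candidate actions: sweep pump speed at a fixed outside temp and fan power.
candidates = np.array([[25.0, s, 0.5] for s in np.linspace(0.2, 1.0, 9)])
pred_pue = predict(pue_ensemble, candidates)
pred_temp = predict(temp_ensemble, candidates)

# Simulate each recommendation and keep only those the temperature model
# says stay inside the (invented) operating constraint.
MAX_TEMP = 27.0
safe = pred_temp <= MAX_TEMP
best = candidates[safe][np.argmin(pred_pue[safe])]
print("recommended pump speed:", round(best[1], 2))
```

The key design point the quote emphasizes is the last step: the efficiency model proposes, but the constraint-forecasting models get a veto, so a setting that looks good for PUE is never recommended if it is predicted to push the facility outside safe operating bounds.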
This isn’t the only creative approach companies are taking to cool data centers and cut energy use, as energy demand from data centers is expected to double over the next five years.