The problem with dire predictions is that they can never account for the unexpected. History is replete with examples of the inevitable being derailed by some unforeseen, or unplanned, event. The Roman army was unbeatable until it met Hannibal at Cannae. Hitler was going to overrun the Soviets, but then it started snowing outside Stalingrad. Standard Oil was going to be a monopoly of one before some Texans started drilling at a place called Spindletop. And speaking of oil, in 1970 many experts said that we had hit the peak and oil production had no place to go but down, a theory that didn't anticipate fracking. I bring all these instances up because it looks like we might be experiencing the same type of phenomenon regarding data center energy usage.
For those of you who don't remember, in 2010 a report by Jonathan Koomey found that data centers use one heck of a lot of energy. (Aside: For those of you who don't know Jonathan, he is the furthest thing from draconian. He's a scientist: data, not emotion, wins. He is a great researcher and very thorough in testing all assumptions.)
His findings reflected what we all believed but no one had been able to quantify. So far, so good. Unfortunately, these results also prompted various industry leaders, media outlets like the New York Times, activist groups, and the people who never met a regulation they didn't like, the U.S. government, to perk up their ears and take notice. Not wanting to pass up the opportunity for demagoguery, soothsayers of impending doom began to paint a picture of a future in which the insatiable energy demands of data centers left entire neighborhoods sharing a single lightbulb, all despite the rabbit-like proliferation of the Voldemort of all energy production methods: coal-fired power plants.
Meanwhile, back at the ranch, the rest of us capitalists were busy implementing new schemes to reduce our energy consumption and boost our profits, virtualizing our servers to make use of idle capacity or deploying new ones that were more powerful and efficient. You know, continually trying to improve our operations to make a positive contribution to the bottom line. We all just kept focusing on making things better from an energy usage perspective, sitting through a dire presentation or two at a conference and reading the occasional article about how this unabated energy use threatened the planet. Most of us found these dire prophecies to be little more than media salesmanship. Well, it appears we were right.
A recent Department of Energy study determined that although the number of data centers has grown dramatically, energy consumption has grown only slightly. In fact, data center energy usage from 2010 to 2014 (the period of the study) grew at a 4% clip versus 24% for the previous five-year period. And to what do they attribute this flattening of the rate of power usage? If you guessed all the things you've been doing to improve efficiency and increase your revenue, you'd be correct. Naturally, the study alludes to the notion that, prompted by the efforts of the aforementioned organizations and individuals, we collectively took action to ensure that little Timmy will still be able to play Candy Crush, but if it makes them happy, so be it.
As history demonstrates, virtually nothing is inevitable, and the findings of this DOE study are just another example of this principle. Our industry is moving so fast that regulation genuinely threatens the pace of innovation. I get chippy on this subject because it matters. Whether the unforeseen event, or events, occurs due to altruism or just the desire to reduce costs by using less of a resource really doesn't matter. The road to inevitability is a long one, and its final destination is never predetermined. In the world of data centers, and in life in general, it's not what we see, but the things that we don't, that determine the final outcome.
For those of you who wish to revisit my thoughts over the years (back to 2012...) on this particular subject, see the following: