Not long ago, as the Internet, mobile phones, and the rest of the connected computing universe hit the elbow of their growth curves, experts and the public alike began to worry about how much power sprawling data centers would require. In 2007, the EPA projected that data centers' energy needs would double from 2005 to 2010, at an annual cost of $7.4 billion. Yet according to a study prepared by Jonathan Koomey of Stanford University at the request of the New York Times, data center electricity use rose by only about 56 percent over that period.
In the United States, data center power consumption grew by only about 36 percent over the same period. Mr. Koomey estimates that electricity used by data centers in the US now accounts for between 1.7 and 2.2 percent of the country's overall power consumption.
The causes of the lower-than-projected totals are still unclear. Mr. Koomey attributes some of the gap to the recession and to basic infrastructure and efficiency improvements at data facilities, though that is his informed opinion rather than a researched finding. Over the same period, the computing market also shifted massively toward cloud services such as hosted email, music, and video, so there is likely more digging to do.
Google, widely assumed to use more energy than any other Internet company, reported a surprisingly low total to Mr. Koomey, claiming to use only about one percent of the electricity he estimated for all data centers. Google has long been secretive about its custom-built data centers, their infrastructure, and their energy usage.
Though Mr. Koomey's findings indicate that the EPA overestimated, that is not to say data centers aren't increasing their energy consumption at a remarkable rate: a 56 percent rise over five years is still significant.