That’s right: US data centres consume more power than all the TVs in America. This didn’t sound too incredible at first, Greenbang thought, until it struck us that there are probably about 3.2 TVs per person across the US, or something equally astonishing. According to a new report (well, an Executive Summary) from the EPA, US data centres chew up 1.5% of all US electricity, or a neat $4.5bn worth of power.
Under current efficiency trends, national energy consumption by servers and data centers could nearly double again in another five years (i.e., by 2011) to more than 100 billion kWh (Figure ES-1), representing a $7.4 billion annual electricity cost. The peak load on the power grid from these servers and data centers is currently estimated to be approximately 7 gigawatts (GW), equivalent to the output of about 15 baseload power plants. If current trends continue, this demand would rise to 12 GW by 2011, which would require an additional 10 power plants.
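The plant-equivalence numbers above roughly check out on the back of an envelope. A quick sketch in Python — note the ~500 MW-per-baseload-plant figure is our assumption for illustration, not something the summary states:

```python
# Back-of-the-envelope check of the EPA summary's power-plant equivalences.
# Assumption: a "baseload power plant" of roughly 500 MW (0.5 GW) capacity.
PLANT_GW = 0.5

current_peak_gw = 7     # current peak load from servers and data centers
projected_peak_gw = 12  # projected peak load by 2011

current_plants = current_peak_gw / PLANT_GW               # 14, close to "about 15"
extra_plants = (projected_peak_gw - current_peak_gw) / PLANT_GW  # 10 additional plants

# The cost figures are also internally consistent: $7.4bn for 100 billion kWh
# implies an average electricity price of about 7.4 cents per kWh.
implied_price = 7.4e9 / 100e9  # dollars per kWh

print(current_plants, extra_plants, implied_price)  # 14.0 10.0 0.074
```

Not exact — 7 GW at 0.5 GW per plant gives 14 rather than the report’s “about 15” — but close enough to see where the figures come from.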
On the plus side, the report highlights the “significant potential for energy-efficiency improvements” in data centres.
For instance, existing technologies and design strategies have been shown to reduce the energy use of a typical server by 25 percent or more. Even with existing IT equipment, implementing best energy-management practices and consolidating applications from many servers onto one could cut current data center energy usage by around 20 percent.