Data Center Power Consumption on the Rise, Report Shows

By Scott Ferguson  |  Posted 2007-02-15

A new study, commissioned by AMD, shows that the amount of electricity used to power the world's data center servers doubled in five years, driven by increased demand for services such as music and video downloads.

The amount of electricity used to power the world's data center servers doubled in a five-year span due mainly to an increase in demand for Internet services, such as music and video downloads, and telephony, according to a new report.

If current trends continue, the amount of power needed to run the world's data center servers could increase by an additional 40 percent by 2010, said Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory in Berkeley, Calif., and a consulting professor at Stanford University.

Koomey's report, funded by Advanced Micro Devices, the Sunnyvale, Calif., chip maker, is being presented at the LinuxWorld OpenSolutions Summit in New York City on Feb. 15.

Between 2000 and 2005, according to Koomey's research, the amount of power used to run data center servers doubled. In the United States, that represented 14 percent annual growth in electricity use, while worldwide use grew by about 16 percent per year.
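As a rough sanity check, and not part of the report itself, compounding those annual growth rates over the 2000-2005 span does come out close to a doubling. The short sketch below hard-codes the 14 percent and 16 percent figures from the article to illustrate the arithmetic.

```python
# Rough sanity check: compound the reported annual growth rates over
# 2000-2005 and compare with the report's "roughly doubled" claim.
# The 14% (U.S.) and 16% (worldwide) figures come from the article.

us_annual_growth = 0.14        # U.S. annual growth in server electricity use
world_annual_growth = 0.16     # worldwide annual growth
years = 5                      # 2000 through 2005

us_multiplier = (1 + us_annual_growth) ** years
world_multiplier = (1 + world_annual_growth) ** years

print(f"U.S. use after {years} years:      {us_multiplier:.2f}x")    # ~1.93x
print(f"Worldwide use after {years} years: {world_multiplier:.2f}x")  # ~2.10x
```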

In 2005, the electricity bill for powering data center servers in the United States totaled $2.7 billion, while the worldwide cost topped $7 billion. Within the United States, server power alone accounted for about 0.6 percent of total electricity use; when cooling and other auxiliary equipment are factored in, that figure rises to 1.2 percent.
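The jump from 0.6 percent to 1.2 percent implies that cooling and auxiliary infrastructure roughly doubles the electricity attributable to the servers themselves. The minimal sketch below derives that overhead factor from the article's two figures; the factor is inferred, not quoted from the report.

```python
# Relates the article's 0.6% (servers only) and 1.2% (servers plus cooling
# and auxiliary equipment) shares of U.S. electricity use.
# The resulting overhead factor of ~2.0 is inferred from those figures,
# not stated directly in the report excerpt.

server_share_of_us_use = 0.006   # servers alone: 0.6% of U.S. electricity
total_share_of_us_use = 0.012    # with cooling and other overhead: 1.2%

overhead_factor = total_share_of_us_use / server_share_of_us_use
print(f"Total data center draw is about {overhead_factor:.1f}x the server draw")  # ~2.0x
```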


