Data Center Power Consumption on the Rise, Report Shows

The amount of electricity used to power the world’s data center servers doubled in a five-year span due mainly to an increase in demand for Internet services, such as music and video downloads, and telephony, according to a new report.

If current trends continue, the amount of power needed to run the world’s data center servers could increase by an additional 40 percent by 2010, said Jonathan Koomey, a staff scientist at Lawrence Berkeley National Laboratory in Berkeley, Calif., and a consulting professor at Stanford University.

Koomey’s report, funded by Advanced Micro Devices, the Sunnyvale, Calif., chip maker, is being presented at the LinuxWorld OpenSolutions Summit in New York City on Feb. 15.

Between 2000 and 2005, according to Koomey’s research, the total amount of power used to run servers within the data center doubled. In the United States, that represented 14 percent annual growth in electricity use, while worldwide use grew by about 16 percent every year.
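As a quick arithmetic check (an illustration, not part of the report itself), annual growth rates of roughly 14 to 16 percent, compounded over the five years from 2000 to 2005, are indeed consistent with a doubling:

```python
def compound_growth(rate: float, years: int) -> float:
    """Return the total multiplier after compounding `rate` annually for `years` years."""
    return (1 + rate) ** years

# Growth rates reported in the article, compounded over 2000-2005
us_multiplier = compound_growth(0.14, 5)         # ~1.93x for the U.S.
worldwide_multiplier = compound_growth(0.16, 5)  # ~2.10x worldwide

print(f"U.S.: {us_multiplier:.2f}x, worldwide: {worldwide_multiplier:.2f}x")
```

Both figures land close to the 2.0x doubling the research describes.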

In 2005, the electricity bills for powering data center servers totaled $2.7 billion for U.S. companies, while the cost for the entire world topped $7 billion. Within the United States, powering data center servers accounted for about 0.6 percent of total national electricity use. When the additional costs of cooling and other overhead are factored in, that figure jumps to 1.2 percent.
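The jump from 0.6 percent to 1.2 percent implies that cooling and other supporting infrastructure consume roughly as much electricity as the servers themselves. A minimal sketch of that inference (the overhead factor is derived from the article’s two figures, not stated directly in the report):

```python
server_share = 0.006  # servers alone: 0.6% of U.S. electricity use
total_share = 0.012   # servers plus cooling and other overhead: 1.2%

# Multiplier implied by the two figures: total draw relative to server draw
overhead_factor = total_share / server_share
print(overhead_factor)  # 2.0 -> about one watt of overhead per server watt
```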

Read the full story on eWEEK.com: Data Center Power Consumption on the Rise, Report Shows.