Beyond The 'Glass Room'

By David F. Carr  |  Posted 2007-08-10
Last summer, Cimarex Energy encountered a crisis in its Tulsa, Okla., data center. Its data systems, as well as the seismic and geologic data critical to its oil and gas exploration and production business, were at risk due to overheating equipment.

The server room was at capacity: there was simply no space left for additional equipment. And the company learned the hard way that it had packed in more equipment than its power and cooling systems could handle. Heat from the densely packed circuitry was pushing servers and other I.T. devices past safe operating temperatures.

It's no wonder the data center ran into problems.

Exhaust from one device was sucked into the air intake of others nearby. Equipment—including Network Appliance enterprise storage devices and Hewlett-Packard servers—was failing even with the air conditioning running at full throttle.

"NetApp tech support was replacing two or three drives a week in the array due to failure, and were telling us just as often that if we didn't do something with the heat issues we were having, that they were going to discontinue support for these devices," explains Cimarex network engineer Rodney McPhearson, who was charged with finding a solution to the problem. "We were seeing temperature logs running from 74 to 76 degrees on the front side of the rack row, and above 100 degrees behind the racks. This was causing drive failures in our HP servers also."

The weekly equipment failures were getting costly, and McPhearson knew it was time for a new approach to keeping equipment cool as the data center expanded into a new server room.

Beyond The 'Glass Room'

The traditional approach to data-center cooling, going back to the days of mainframe "glass rooms," has been to provide good ventilation and a powerful central air conditioning unit. But the trend toward packing more computing power into each I.T. device and more devices into each rack means it's not always enough to rely on the circulation of chilled air through the room. Gartner recently predicted that by 2011, the predominant strategy for high-density computing will be to use cooling equipment that's built into each row of server racks or installed in the racks themselves.

So, where the old Cimarex computer room relied on a 10-ton Liebert air conditioner, the new one was built around American Power Conversion's InfraStruXure product, a server rack system that features integrated in-row cooling, along with APC's battery backup technology. Cimarex also chose to take advantage of APC's Hot Aisle Containment system—an arrangement in which two back-to-back rows of server racks vent their exhaust into an enclosed area with a roof and doors on either end. This keeps the hot air out of circulation until it can be cooled and vented out the front of the server rack.

"Rather than cool the whole room, we trap the heat into this area and only remove the heat from that section of the room," McPhearson explains.

Best known for its power protection technology, APC has spent the past several years positioning itself as a vendor that can also help data centers address their cooling and energy efficiency issues. The West Kingston, R.I., company was acquired in February by Schneider Electric, a global power equipment manufacturer based in Paris, and merged with Schneider's power protection subsidiary, MGE. At a media briefing in June, APC chief technology officer Neil Rasmussen said the firm will continue to focus on improved cooling as one of the best ways to enhance data-center energy efficiency and reliability.

In-row cooling can be more efficient because the cool air can be delivered closer to the equipment to be protected, Rasmussen says. In the traditional approach of cooling the whole room, the air coming out of the air conditioning vents needs to be made that much cooler because it's not being delivered with the same precision. "So, you end up with 45-degree air coming out to the floor," he says, even though most I.T. equipment doesn't need to be kept anywhere near that cool. "And it's a lot more expensive to make 45-degree air than 70-degree air." Also, while APC's power protection equipment is rated about 97% energy efficient, making further gains hard to come by, the company believes there's potential to improve the efficiency of data-center cooling by another 20% to 30%, Rasmussen says.
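Rasmussen's point about supply temperature can be made concrete with idealized chiller thermodynamics. The sketch below is a simplified illustration, not APC's analysis: it uses the Carnot limit on a chiller's coefficient of performance and assumes a 95-degree heat-rejection temperature (our assumption, for illustration only) to compare 45-degree and 70-degree supply air.

    # Simplified illustration of why colder supply air is more expensive
    # to produce. Uses the Carnot limit on a chiller's coefficient of
    # performance (COP): COP = T_cold / (T_hot - T_cold), in kelvins.
    # The 95 F heat-rejection temperature is an assumption, not a figure
    # from the article.

    def f_to_kelvin(temp_f: float) -> float:
        """Convert degrees Fahrenheit to kelvins."""
        return (temp_f - 32.0) * 5.0 / 9.0 + 273.15

    def ideal_cop(supply_f: float, reject_f: float = 95.0) -> float:
        """Best-case heat removed per unit of work for a supply temp."""
        t_cold = f_to_kelvin(supply_f)
        t_hot = f_to_kelvin(reject_f)
        return t_cold / (t_hot - t_cold)

    for supply in (45.0, 70.0):
        print(f"{supply:.0f} F supply air: ideal COP ~ {ideal_cop(supply):.1f}")

    # Prints roughly 10.1 for 45 F air and 21.2 for 70 F air: even in
    # this best-case model, 45-degree air takes about twice the work
    # per unit of heat removed.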

Dumping The Old Air Conditioner

Although other vendors, including Liebert (a subsidiary of Emerson Network Power), also make in-row cooling equipment, McPhearson first learned about the approach after seeing a demonstration of the InfraStruXure equipment in the back of a truck that APC had taken on a promotional tour. Initially, he found it difficult to convince senior management that the compact APC cooling equipment could succeed where the old air conditioner had failed. "They couldn't get their heads around the idea that these two APC units, which fit in about 8 square feet, could do the same job as our 10-ton Liebert covering 25 to 30 square feet of floor space," McPhearson says.

After visiting an APC facility in St. Louis, however, the Cimarex managers were convinced to give it a try. While continuing to run the Liebert air conditioner, McPhearson began migrating servers and network equipment into the APC racks a piece or two at a time. "Pretty soon, we wound up migrating all of our stuff from the old server room to the new server room," he says. "The only thing we have left in the old server room right now is those two Network Appliance boxes." By the end of the year, he plans to move those over as well and retire the old 10-ton air conditioner.

Although the move to in-row cooling was originally driven by a crisis with overheating equipment, it also had a financial payoff, McPhearson says: "We weren't trying to drive down electric use and cost, but those savings have in fact materialized as an unintended benefit."

McPhearson and his staff had to learn how to calibrate the equipment properly. At first, they tended to set the "set point" at which the air conditioning kicks in too low, at about 68 degrees. That seemed reasonable to a staff used to a traditional data center cooled to 50 or 60 degrees in an effort to counteract the hot air circulating through the room. Gradually, they learned that they could set the set points between 72 and 75 degrees and still keep the temperature within the server racks at an acceptable level. "That's what saves you the chilled water," McPhearson says.
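As a rough illustration of the mechanism, here is how hysteresis-based set-point control typically works; the thresholds and function names below are hypothetical placeholders, not APC's actual control interface.

    # Rough sketch of set-point control with hysteresis, illustrating
    # why a higher set point draws less chilled water. The thresholds
    # and function are hypothetical, not APC's actual control logic.

    SET_POINT_F = 74.0    # cooling engages above this rack-inlet temp
    HYSTERESIS_F = 2.0    # cooling disengages this far below set point

    def valve_open(inlet_temp_f: float, currently_open: bool) -> bool:
        """Decide whether the in-row cooler should draw chilled water."""
        if inlet_temp_f > SET_POINT_F:
            return True                    # too warm: open the valve
        if inlet_temp_f < SET_POINT_F - HYSTERESIS_F:
            return False                   # cool enough: close the valve
        return currently_open              # inside the band: hold state

    # At a 68 F set point the valve would be open almost continuously;
    # at 72-75 F it opens only when the racks actually run warm.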

Another benefit of the APC rack and cooling design: Cimarex could pack more equipment into less space, without causing overheating issues. The effect was multiplied by a virtualization effort Cimarex was pursuing at the same time, using VMware technology to consolidate the workload of 38 or 39 physical servers onto two servers hosting multiple virtual machines.

Andrew Terminesi, the APC account manager who worked with Cimarex, notes that the cost savings numbers tend to look more impressive for much larger installations. Even though Cimarex is a billion-dollar company, the Tulsa data center is really a modest-size computer room equipped with just eight InfraStruXure racks. "The savings in floor space is one of the major benefits they realized from this kind of installation, since traditional data-center racks and cooling are very space inefficient," Terminesi says.

Finally, the arrangement provides peace of mind, McPhearson says. Whenever the old cooling system failed—and it happened twice in the year before the APC implementation—the server room would heat to 118 degrees within 15 minutes, and servers had to be shut down manually to avoid permanent damage. "This is why I am so fond of the new system all being tied together; if we lose chilled water, the system powers down gracefully before we see heat issues."
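The behavior McPhearson describes is built into the APC system, but the general pattern is easy to picture. The sketch below is a generic stand-in with hypothetical host names, a simulated sensor and an assumed 90-degree threshold; it is not APC's implementation.

    # Generic sketch of shutdown-on-cooling-loss monitoring. The host
    # names, sensor stub and 90 F threshold are assumptions; APC's
    # actual integration is built into the product and not shown here.
    import subprocess
    import time

    SHUTDOWN_TEMP_F = 90.0               # assumed emergency threshold
    HOSTS = ["server01", "server02"]     # hypothetical host names

    def read_rack_temp_f() -> float:
        """Stand-in for polling a real rack sensor (e.g., over SNMP)."""
        return 75.0                      # simulated reading

    def shut_down_hosts(hosts: list[str]) -> None:
        for host in hosts:
            # Ask each host to power down cleanly before heat builds up.
            subprocess.run(["ssh", host, "sudo", "shutdown", "-h", "now"])

    if __name__ == "__main__":
        while read_rack_temp_f() <= SHUTDOWN_TEMP_F:
            time.sleep(30)               # poll every 30 seconds
        shut_down_hosts(HOSTS)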

At a Glance: Cimarex Energy

Headquarters:
1700 Lincoln St., Suite 1800, Denver, CO 80203

Phone:
(303) 295-3995

URL:
http://www.cimarex.com

Business:
Oil and gas exploration and production.

Financials in 2006:
$1.3 billion in revenue; $345.7 million in profit.

Challenge:
Find a more efficient way of cooling an overcrowded data center that was suffering failures of overheated equipment on a weekly basis.

A Smaller and More Efficient Data Center

New server racks with in-row cooling allowed Cimarex to pack more of its data-center equipment into a smaller space, with better management of excess heat and reduced energy consumption.

The reduction in energy required by the new setup is reflected in the meter readings below, which measure chilled-water consumption for the two computer rooms. The old 400-square-foot server room was overcrowded and overheating before Cimarex expanded into a second, 235-square-foot server room. Yet the new room now houses most of the data center's equipment (about 85 servers) without suffering the same overheating problems. Even with just three Network Appliance storage servers remaining in the old server room, the whole-room air conditioning unit there still consumes about three times as much chilled water. The difference: the whole-room unit consumes chilled water at an essentially constant rate, while the in-row system uses it only as necessary and delivers it more efficiently to where it is needed.

Usage, in Ton-Hours (2007)

Month       Old Server Room:        New Server Room:      Change
            Traditional Cooling     In-Row Cooling
January         45,639                  14,561            -68.1%
February        86,154                  29,483            -65.8%
March           51,763                  17,425            -66.3%
April           64,080                  21,295            -66.8%
May             86,523                  19,189            -77.8%
June            92,691                  36,497            -60.6%
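The Change column follows directly from the readings; a quick Python check reproduces it.

    # Reproduce the Change column from the ton-hour readings above:
    # percentage difference between the new room (in-row cooling)
    # and the old room (traditional cooling).
    readings = {  # month: (old room, new room), in ton-hours
        "January":  (45_639, 14_561),
        "February": (86_154, 29_483),
        "March":    (51_763, 17_425),
        "April":    (64_080, 21_295),
        "May":      (86_523, 19_189),
        "June":     (92_691, 36_497),
    }

    for month, (old, new) in readings.items():
        change = (new - old) / old * 100.0
        print(f"{month}: {change:.1f}%")   # e.g., January: -68.1%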