Server Virtualization Is On Nationwide's Side

By Doug Bartholomew

The insurance firm was outgrowing its data center, but wanted to wait before building another one. Server virtualization boosted the efficiency of its existing facility.

Nationwide, a Columbus, Ohio-based diversified financial services and insurance firm with 20 data centers and a $250 million budget for information-technology infrastructure, figured it would outrun the power capacity of its primary data center in Columbus by 2013. One alternative, of course, was to build a new facility with greater floor space and electrical capacity as well as increased computing and data storage.

Instead, Nationwide took a different tack. By cutting floor space needs and power usage, the insurance giant has been able to extend the life of its existing main data center by at least two years.

That's no small accomplishment for an I.T. organization that projects 5% growth in processing year-to-year for the foreseeable future. Each month the company processes about 400 million transactions for such things as calculating policy quotes; making policy additions, changes and deletions; and processing claims for auto, property, boat, recreational-vehicle, home, life and other policies.

"The cost to the organization of building another data center is [in the] hundreds of millions of dollars," says Scott Miggo, vice president of Technology Solutions at Nationwide Services Co., the company's shared services unit. By finding ways to extend the life of its main data center and postponing a huge capital outlay, Miggo, who is responsible for all I.T. infrastructure including data storage, servers, desktops and mainframes, is helping the company save money today.

Nationwide was able to forestall construction of a new data center by investing $30 million to upgrade its existing center in Columbus to a Tier 4 facility, the most redundant and highly available tier as defined by the Uptime Institute, a consortium of companies concerned with infrastructure availability. The rating applies to the physical infrastructure of the building, including the ability of its power, cooling and other equipment to withstand natural disasters. The data center was a Tier 3 before, and the upgrades were largely aimed at improving power and cooling redundancy, Miggo explains.

Miggo offers an example of the extent of the upgrade to the building. Prior to the recent modifications, for every square foot of raised floor (computer space) there was an equal amount of square footage dedicated to mechanical infrastructure for power, cooling and backup equipment. "We now have nearly 3-to-1 mechanical to raised floor," he says.

A key piece of the upgrade was a boost of the facility's power capacity. "Three years ago when we embarked on this project, we calculated that we would not have enough power by 2013," he says. "And we had already started to run out of space."

Miggo points out that the data-center upgrade was aimed not at saving money but at improving the company's "risk posture," reducing the chance of system downtime by building in extra protection.

An important piece of Nationwide's solution to the space and power problem was a sweeping server virtualization program begun two years ago. The company uses software from VMware that enables numerous applications and operating systems to run simultaneously on one server. Thus, a single server can do the processing jobs of several.

"We've had a reduction of 80% of our floor space," Miggo reports, offering an example of how virtualization provides leverage in processing. "By virtualizing an older, larger server, we've been able to get 20 virtual servers on that one, eliminating 19 physical boxes. This reduces our hardware and operating system costs."

The reduction in floor space is partially offset by a growth in mainframe, storage, network and other systems.

Although space is no longer an issue, he says, ultimately Nationwide will need a new facility because the present building's infrastructure can no longer be expanded to supply enough power and cooling.

"We will actually have enough physical space to add more systems in for a long time," Miggo explains. "But we will not have the physical building infrastructure to supply enough power and cooling to the building." That's because the building cannot be upgraded further to handle the additional power and cooling without major investment, which has led Miggo and his team to look at building a secondary data center as a better option.

Apart from the virtualization approach, Miggo found that replacing old energy-intensive servers with new ones resulted in reduced energy costs, in addition to saving space.

"The new technologies are a lot more green than the old ones," he says. "We took 200 Sun servers that were 4 to 7 years old and replaced them 1 to 1 with new ones and found we saved on floor space, as well as power and maintenance costs. With the money we saved on space, maintenance and power, it paid for itself."

By replacing older Sun Microsystems machines with the newer, energy-efficient Niagara models, the company saves $40,000 per year on these machines alone, Miggo says.

In another space-saving move, Miggo swapped out the existing storage tape silos with denser tape and faster tape robots. While data storage isn't typically the first thing on every CIO's mind when it comes to cutting space costs, Miggo figures that every little bit of hardware or software that can be modernized or improved upon helps. "You've got to look at it holistically," he explains. "We are looking at going to a massive array of idle disks that shut down and are brought up only when you need the data on a particular disk."

The biggest impact by far came from the CIO's campaign to implement virtual servers. Since embarking on a massive virtualization effort, Nationwide has reduced the number of servers from 5,000 to 3,500. "We now have 1,500 virtual instances running on 100 servers," he says. One physical host runs multiple virtual instances, each of which would have been a single physical server in the old environment.

The result is that the average server utilization has gone way up, from below 10% to 65%. "From an I.T. management standpoint, you do not want a bunch of small servers running at 10% utilization," Miggo points out. "As a result, we are spending a lot more time on capacity planning."
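The consolidation arithmetic behind these figures can be sketched in a few lines of Python. This is purely an illustration; the calculations use only the numbers quoted in the article.

```python
# Consolidation figures quoted in the article.
physical_before = 5000     # servers before the virtualization effort
physical_after = 3500      # servers after
virtual_instances = 1500   # virtual instances now running...
virtual_hosts = 100        # ...on this many physical hosts

# Each host carries 15 virtual instances on average.
vms_per_host = virtual_instances / virtual_hosts

# 1,500 instances on 100 hosts replace what would have been
# 1,500 standalone boxes, eliminating 1,400 machines.
boxes_eliminated = virtual_instances - virtual_hosts

# The overall fleet shrank by 30%.
fleet_reduction = 1 - physical_after / physical_before

print(f"{vms_per_host:.0f} VMs per host, "
      f"{boxes_eliminated} boxes eliminated, "
      f"fleet down {fleet_reduction:.0%}")
# → 15 VMs per host, 1400 boxes eliminated, fleet down 30%
```

The jump in average utilization follows from the same consolidation: work that once idled on many boxes at under 10% load is concentrated on far fewer hosts, which is why capacity planning becomes the new bottleneck.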

Despite boosting server utilization to 65%, the amount of heat expended by the chips in each machine is "not really measurable," Miggo says. Nor does he expect server utilization to ever reach 100%, citing peaks and spikes in loads.


This article was originally published on 2007-08-24
Doug Bartholomew is a career journalist who has covered information technology for more than 15 years. A former senior editor at IndustryWeek and InformationWeek, he has written freelance features for New York magazine and the Los Angeles Times Magazine. He has a B.S. in Journalism from Northwestern University.
eWeek
