Leveraging Virtualization's Business Value

By Tony Kontzer  |  Posted 2012-01-17

Look at how far virtualization has come. From its earliest roots as a way to squeeze more out of expensive mainframe computers, followed by its widespread adoption as a tool for consolidating overtaxed data centers, virtualization technology has morphed from a mere enabler of efficiency into a fast-emerging bottom-line contributor.

Companies from various industries and government agencies have been unlocking virtualization’s potential to transform the way IT capabilities are delivered. As a result, most IT executives no longer have to justify the benefits of virtualizing their computing assets.

“This is solidly mainstream technology at this point,” says John Burke, principal research analyst with Nemertes Research. “It is, in most companies, the default deployment option: You have to make a case for why not to virtualize.”

Such a case certainly won’t be made at Mazda North America, where virtualization has steadily risen to become a key part of the company’s IT strategy. Mazda’s virtualization path started in 2006, when it became clear that constant demand for new online applications had resulted in unmanageable server sprawl, recalls Barry Blakeley, infrastructure architect.

At the time, Mazda relied on a server leasing program that allowed it to refresh its hardware on a regular basis. Every time a server’s lease came up, the app running on it had to be moved to—and tested on—a new physical server before the old server could be swapped out.

“Every year, it became an increasingly laborious process to migrate apps to new hardware for the lease refresh,” says Blakeley. “Virtualization was a way to alleviate the problem.”

It also proved to be the tip of a giant iceberg, as virtualization took off at Mazda. First, the company chose VMware’s vSphere as its virtualization platform, then upgraded to Dell PowerEdge servers, which are designed for virtual environments and can run VMware’s hypervisor.

Whereas Mazda used to have 200 physical servers running a similar number of apps, Blakeley says the company now runs 490 virtual machines on just 28 host servers, with most using about 5 percent of the CPU’s capacity and between 44 and 85 percent of available memory.

All that server virtualization eventually introduced a storage issue, as Mazda’s arrays grew larger to support more application data—improving capacity but not performance. With the company eyeing virtualization of its mission-critical SAP financial system, Blakeley knew the company needed a virtualized storage area network to support its ambitions, as well as beefed-up network bandwidth to keep up with the increased flow of data.

The result was a network upgrade from a 1Gbps connection to 10Gbps, followed by the deployment of Dell Compellent, a “fluid-data” virtual storage solution that enables Mazda to automatically store data on any of three tiers based on predefined storage profiles. Those improvements spurred a desktop virtualization initiative that’s just getting under way.
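Conceptually, profile-driven tiering boils down to a placement policy: frequently accessed data lands on fast media, while colder data drifts down to cheaper, higher-capacity tiers. The sketch below illustrates that idea in Python; the profile fields, thresholds and tier names are hypothetical and do not represent Dell Compellent’s actual algorithm.

```python
# Illustrative sketch of profile-based, three-tier data placement.
# The tier names, thresholds and StorageProfile fields are hypothetical;
# this is not Dell Compellent's actual tiering logic.

from dataclasses import dataclass


@dataclass
class StorageProfile:
    name: str
    hot_min_reads_per_day: int   # promote to the fastest tier at or above this rate
    cold_max_reads_per_day: int  # demote to the slowest tier at or below this rate


DEFAULT_PROFILE = StorageProfile(
    name="standard",
    hot_min_reads_per_day=1_000,
    cold_max_reads_per_day=10,
)


def choose_tier(reads_per_day: int, profile: StorageProfile = DEFAULT_PROFILE) -> str:
    """Pick one of three storage tiers for a block of data based on recent activity."""
    if reads_per_day >= profile.hot_min_reads_per_day:
        return "tier 1 (SSD)"
    if reads_per_day <= profile.cold_max_reads_per_day:
        return "tier 3 (high-capacity disk)"
    return "tier 2 (fast disk)"


if __name__ == "__main__":
    for activity in (5_000, 200, 3):
        print(f"{activity} reads/day -> {choose_tier(activity)}")
```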

Mazda’s server and storage virtualization projects have already delivered substantial value. The company has cut its annual spending on physical servers by 60 percent; reduced the time required for backup processes to six hours from 16; enabled IT to take a complete system snapshot, including databases, in just 30 seconds; and seen storage system performance gains of up to 400 percent. All the improvements led to the one thing companies want most from IT operations: “We’re able to respond quicker to business needs,” says Blakeley.

One of the most valuable lessons from Mazda’s decision to keep building on its virtualization successes is that, to an extent, the more you invest in virtualization, the bigger the return on investment. When Nemertes analyzed virtualization ROI in 2009, it found that companies virtualizing 15 to 50 physical servers could expect a three-year return of 166 percent; those virtualizing 50 to 150 servers were likely to see a 350 percent return; and those virtualizing more than 150 servers could anticipate a 500 percent return.
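To put a figure like that in concrete terms, a three-year ROI percentage is simply the net gain over the period divided by the amount invested. The dollar amounts below are hypothetical and are not drawn from the Nemertes study; they only show the arithmetic behind a 350 percent return.

```python
# Basic three-year ROI arithmetic (hypothetical figures, not from the Nemertes study).
investment = 250_000           # assumed cost of the virtualization project over three years
three_year_gains = 1_125_000   # assumed savings and avoided spending over the same period

roi_percent = (three_year_gains - investment) / investment * 100
print(f"Three-year ROI: {roi_percent:.0f}%")   # prints "Three-year ROI: 350%"
```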

Hampered by Antiquated Infrastructure

That’s good news for the California Department of Water Resources, which in 2009 was increasingly hampered by an antiquated collection of IT infrastructure components, including 600 servers that no longer met its needs. The DWR, which delivers water to 25 million residents and 750,000 acres of farmland, has over the past several years segmented into multiple business units supporting areas such as water delivery, irrigation management, flood control and energy production.

CIO Tim Garza says the DWR’s aging IT equipment was preventing the restructured department from supporting those changes with effective communication capabilities and was making it a challenge to roll out new business solutions. Also, the storage environment lacked the ability to nimbly handle the ever-growing quantities of spatial and analytical data the department relies on to conduct business.

“IT was becoming a bit of a constraint instead of an enabler of the business,” says Garza. In fact, he had to turn to outside resources—such as the University of California, San Diego—for the computing resources scientists required to perform high-end modeling of water flows, soil erosion and the like. But relying on third-party high-performance computing capacity wasn’t a sustainable model.

“It didn’t provide the flexibility for meeting project deadlines,” says Garza. “We had to queue up behind everyone else, so we weren’t in control of meeting our own demands.”

After performing a detailed assessment of the risks and constraints the department’s aging IT environment presented, the DWR decided on an IT modernization effort built around virtualization. The effort called for deployment of x86-based HP ProLiant blade servers running VMware’s vSphere virtualization platform. Over an 18-month period, the department replaced the 125 racks that housed its 600 physical machines with just four racks housing 160 virtual hosts that can run up to 4,800 virtual machines.

The new infrastructure has paid off, says Garza. Not only has it shortened the time it takes to provision computing capabilities to five days from as many as 60, it has also enabled the DWR to segment the computing needs of its various lines of business, while managing IT assets horizontally across the department.

The department’s carbon footprint has been reduced as well, thanks to a 50 percent improvement in data center cooling efficiency and a 40 percent gain in power distribution efficiency.

What’s more, the new virtualized IT environment has enabled the department to eliminate a $2.2 million maintenance fee associated with the previous infrastructure’s support of its SAP ERP system.

The DWR’s virtualized infrastructure has become the foundation of a shared-services IT environment that’s used by other departments that fall under the umbrella of the DWR’s parent, the California Natural Resources Agency. For example, Garza says that access to the DWR’s virtualized IT resources has allowed the state’s Department of Parks and Recreation to focus on the business of keeping parks open rather than managing its own IT environment.


The Backup Challenge

For Chris Pinckney, CIO of Los Angeles-based civil engineering firm Psomas, the main challenge was the absence of a modern backup system. As recently as 2010, Psomas, which had already established a highly virtualized and largely cloud-based IT environment, was relying on tape as its only form of data backup.

Not only was tape expensive to access when Psomas needed to get at older data, but because the company relied on tapes as its de facto disaster recovery system, Pinckney was concerned that he would find himself at the mercy of his tape backup provider.

“If there were a natural disaster, [the service provider] could get hundreds of recall requests,” he says. “We never knew where we fit on that priority list.”

Psomas had long used Riverbed Technology’s Steelhead WAN optimization product to eliminate the unnecessary duplication of unchanged data when employees collaborated on huge CAD files in the firm’s cloud-based applications. So, when Pinckney learned that this deduplication capability was available in Riverbed’s Whitewater cloud storage gateway, he immediately started testing it as a replacement for tape.

The product complemented the company’s virtualized server environment, which was running five applications on a single virtualized server in each of Psomas’ 10 field offices. By sending only changed data, Pinckney estimates, Whitewater transmits just 30 percent of the data marked for backup to Psomas’ Amazon S3 cloud storage environment, reducing the company’s monthly pay-as-you-go storage bill.
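Deduplication is what makes this efficient: data is split into chunks, each chunk is fingerprinted, and only chunks the backup target has never seen are actually sent over the wire. The sketch below illustrates that idea; the fixed chunk size and the upload_chunk() placeholder are assumptions for the sake of the example, not Riverbed Whitewater’s implementation.

```python
# Illustrative chunk-level deduplication before uploading to cloud storage.
# Not Riverbed Whitewater's actual implementation; the 4 MB chunk size and
# the upload_chunk() placeholder are assumptions for this example.

import hashlib

CHUNK_SIZE = 4 * 1024 * 1024        # assumed fixed chunk size of 4 MB
already_stored = set()              # fingerprints of chunks already in cloud storage


def upload_chunk(fingerprint, chunk):
    # Placeholder: in practice this would PUT the chunk to object storage
    # (for example, Amazon S3) under a key derived from its fingerprint.
    pass


def backup_file(path):
    """Back up one file, returning (chunks_seen, chunks_uploaded)."""
    seen = uploaded = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            seen += 1
            fingerprint = hashlib.sha256(chunk).hexdigest()
            if fingerprint not in already_stored:
                # Only chunks the store has never seen cross the WAN.
                upload_chunk(fingerprint, chunk)
                already_stored.add(fingerprint)
                uploaded += 1
    return seen, uploaded
```

Run twice against an unchanged file, backup_file() would upload nothing the second time, which is the effect Pinckney describes: only the changed fraction of the data ever reaches S3.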

By installing a single Whitewater appliance at the firm’s most data-intensive office and virtual instances in the remaining locations, Pinckney can now deliver a cloud-based data backup and disaster recovery environment that puts recovery back under his control.

Not only has the storage setup delivered qualitative business benefits (“restores are wicked fast compared to tape,” says Pinckney), but Psomas also expects to achieve a full return on its Whitewater investment within 18 months of its March 2011 deployment.

It’s clear from the experiences of organizations such as Mazda, the California Department of Water Resources and Psomas that today’s virtualization deployments are about much more than efficiency. They’re about providing the kind of agility and availability that business demands in the 21st century.

These deployments are about getting IT away from the business of managing hardware and software and instead delivering bottom-line business value. Most importantly, they’re about letting companies do things they’ve never done before.