For Servers, a Disappearing Act

 
 
By Brian P. Watson  |  Posted 2007-04-11
 
 
 
Back when mainframes ruled, virtualization was a given. It's taken some time, though, for the technology to break into the server market. Now, it's here--but not exactly in all its glory.

Adoption and interest have skyrocketed in recent years, with companies seeing significant cuts in hardware costs and deployment times. Now, long-standing market leaders face competition from upstart open-source vendors while trying to keep their own customers happy.


Virtualization Takes Hold



Decades ago, programmers at IBM made virtual environments part of the mainframe operating system. Virtualization didn't crack the x86 server world, though, until the late 1990s—and it's been evolving ever since.

Server virtualization—creating multiple virtual machines, running various operating systems, on a single physical box—is booming. According to a February survey by Forrester Research, 40% of North American companies were virtualizing servers in 2006, up from 29% the previous year.

Benefits abound: Companies large and small have sped up server setup times and increased space in their data centers or server rooms. And the advantages were clear even before the software that enables server virtualization had matured.

Take Subaru of Indiana Automotive. In late 2003, the Japanese car maker's only U.S. plant was running out of server space, says Jamey Vester, a Subaru production control professional staff member.

Vester's answer was virtualization, which would let him add servers without adding hardware. But back then, there were few options: VMware's offerings were becoming increasingly popular, Microsoft had just released Virtual Server, and open-source Xen wasn't yet in circulation.

In 2004, Vester, like many early adopters, went with VMware's ESX Server, the first x86 virtualization product built on a bare-metal hypervisor, a layer that runs directly on the server hardware and manages the guest operating systems above it.

He, like most, was enticed by how the product sped up server deployment time. Provisioning servers, in Vester's world, was a six- to eight-week struggle with procurement, finance and vendors. With VMware, he and his team can create an image of the primary server and replicate that into a virtual machine.

That winnowed the process down to, at most, one business day; all he had to do was make a copy of an existing virtual machine and install it on the server using the VMware software. "With physical servers, you have to go through your whole purchasing bureaucracy," Vester says. "With virtual, you just write up the image and copy it over."
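
The template-and-copy routine Vester describes can be pictured in a few lines of scripting. The sketch below is illustrative only, not VMware's tooling or API: the on-disk layout (a directory of virtual disk files plus a plain-text .vmx configuration), the paths and the displayName key are assumptions, and a production clone would be made with the vendor's own tools, which also give the copy a fresh identity on the network.

# A minimal sketch of cloning a "golden image" virtual machine by copying
# its files. Illustrative only; not VMware's actual workflow or API. The
# directory layout, paths and the displayName key are assumptions.
import re
import shutil
from pathlib import Path


def clone_vm(template_dir: str, target_root: str, new_name: str) -> Path:
    """Copy a template VM's files and give the copy its own display name."""
    src = Path(template_dir)
    dst = Path(target_root) / new_name
    shutil.copytree(src, dst)                      # duplicate disks + config

    vmx = next(dst.glob("*.vmx"))                  # the copied config file
    text = vmx.read_text()
    text = re.sub(r'displayName = ".*"', f'displayName = "{new_name}"', text)
    vmx.write_text(text)
    return dst


if __name__ == "__main__":
    # Hypothetical paths: one hardened template cloned into a new guest.
    clone_vm("/vmstore/templates/win2003-base", "/vmstore/guests", "app-server-07")

The point of the exercise is that a new server becomes a file copy rather than a purchase order, which is where the six-to-eight-week savings comes from.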

And virtualization cut the clutter in Subaru's server room. Back in 2003, Subaru ran 10 racks with 50 physical servers. Today, it has knocked that down to three racks and 17 boxes—running a total of 60 virtual machines.




Comparison Shopping

VMware leads what is still a market with relatively few competitors. In the Forrester report, 53% of companies cited VMware as their vendor of choice, with Microsoft taking 9%. The top hardware vendors—Hewlett-Packard, IBM and Dell—drew write-in votes totaling 28%, though none of the three offers virtualization software of its own. Six percent didn't know, and the remaining 4% mentioned "others." Of those, Forrester says, only one mentioned Xen.

The margin increased when companies interested in virtualization were asked which vendor they'd lean toward: VMware took 60%, with Microsoft garnering only 7%.

But Microsoft is hoping to change that with its next product offering. "In the past, virtualization has been part of the operating system," says Jim Ni, group product manager in Microsoft's Windows server division. The goal with the Redmond, Wash., firm's next release, Longhorn, slated for the second half of this year, is to build virtualization into Windows itself.

Dave Chacon, manager of information systems technology services for PING, the golf club and equipment maker, says vendors should think of virtualization as a fundamental piece of the operating system—"not just an add-on"—as IBM did with mainframes.

And he sees an opening for Microsoft, since Windows dominates the operating system market. That, combined with pricing, led Chacon to Virtual Server.

In 2004, PING looked for a low-cost way to virtualize its Windows environments. Already a Windows shop, the equipment maker went with Microsoft. Chacon looked at VMware but says the software cost at least 10 times more.

He also considered blade servers—thinner machines that take up less rack space than conventional rack-mounted servers—but the added hardware cost was a deal-breaker. To install and deploy more physical servers, Chacon says he would have had to fly in consultants, build a test environment and then make a decision based on the results.

Going with virtualization avoided thousands of dollars in consulting fees in the planning stages alone. Still, Chacon wishes Virtual Server were built into the operating system. If it were, managing virtual machines and the operating system would be one and the same. Instead, he has to manage them separately. Ni says Longhorn will move virtualization closer to being embedded in Windows.

Still, Chacon believes managing virtual machines—whether one or 100—should be an out-of-the-box attribute of any platform. "There's still a whole level of maturity [of products] that needs to happen in the marketplace in general," he says.


Third Option

Price was one differentiating factor between the two leading vendors, but both VMware and Microsoft offer some free software. VMware made its Server product free in February 2006, and Microsoft did so with Virtual Server 2005 two months later. But VMware's hypervisor-based ESX Server comes with a cost—ranging from $1,000 to $5,750 per two processors—and Microsoft hasn't decided the price for its next release.

For the cost-conscious, another option exists: the open-source Xen hypervisor, offered in enterprise versions by vendors such as XenSource and Virtual Iron.

Despite VMware and Microsoft's market lead, Xen has its followers. Early last year, Jason's Deli, a chain of eateries in 21 states, was using VMware to deploy virtual servers. At the time, enterprise application developer Neal Cowles was evaluating e-mail servers but didn't have the resources to devote an entire server to one piece of software. With virtualization, he created separate environments for the test, all on one physical machine.

He saw lower performance overhead, the slowdown the virtualization layer imposes on its guest workloads: 5% with XenSource's XenEnterprise, compared with 10% with VMware. (VMware, however, disputes the finding, claiming Xen hypervisors have twice the overhead of ESX Server.) Cowles went with XenSource, while still running some VMware, but also looked at Virtual Iron.
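
The article doesn't say how Cowles arrived at his figures, but overhead numbers like these are commonly expressed as the extra runtime a workload needs inside a virtual machine relative to the same job on bare metal. The benchmark times below are hypothetical, chosen only to show how a 5% figure falls out of the arithmetic.

# How an overhead percentage like the ones quoted above is typically
# computed. The benchmark times are hypothetical examples, not Cowles' data.
def overhead_pct(native_seconds: float, virtual_seconds: float) -> float:
    """Percentage slowdown of a virtualized run versus a bare-metal run."""
    return (virtual_seconds - native_seconds) / native_seconds * 100


# A job that takes 100 seconds on bare metal and 105 seconds in a guest
# shows the ~5% overhead cited for XenEnterprise.
print(f"{overhead_pct(100.0, 105.0):.0f}% overhead")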

When it comes to the cost of hypervisor-based products, Cowles says there's no comparison with open source. Virtual Iron's Enterprise Edition costs $499 per socket. XenSource charges just under $1,400 per processor, including licensing and maintenance. (VMware, for its part, says it offers more features for the price.)

"It's a complicated decision," Cowles says, "but I would recommend Virtual Iron or XenEnterprise over [VMware] from a price perspective."




Project Pointers

Embarking on a virtualization project can be tricky. Where to start? Baseline asked some experts to recount the lessons they learned in virtualizing servers—and how those experiences can help you.

PLAN

Cutting costs is great, but make sure you have ample hard drive space and memory. Take it from Karl Fisher, systems engineer with All Systems Integration of Woburn, Mass. He says his systems integration firm underestimated how many gigabytes of space it would need for its virtual machines as well as how much memory some applications required. Getting up to par didn't cost much, Fisher says, but it took some time. In light of that, doubling your expected capacity is a wise move.
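
Fisher's "double it" advice boils down to simple arithmetic. The sketch below is a back-of-the-envelope sizing helper, not a vendor tool; the per-guest disk and memory figures are hypothetical, and the HEADROOM factor simply encodes the doubling he recommends.

# Back-of-the-envelope host sizing with the "double your estimate" rule.
# The planned guests and their figures are made-up examples.
HEADROOM = 2.0  # double the raw estimate, per the advice above

planned_vms = [          # (name, disk in GB, memory in GB)
    ("file-server", 120, 2),
    ("mail-server", 200, 4),
    ("intranet-web", 40, 1),
]

disk_needed = sum(disk for _, disk, _ in planned_vms) * HEADROOM
mem_needed = sum(mem for _, _, mem in planned_vms) * HEADROOM

print(f"Provision at least {disk_needed:.0f} GB of disk and "
      f"{mem_needed:.0f} GB of memory for these guests.")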

WEIGH OPTIONS

Instead of mixing and matching, get the right recipe. Brian Heagney, data center manager with CoAMS, a trade promotion consulting firm, urges his fellow technologists to research which hardware works with the different virtualization engines and their operating systems. "The same hardware will scale differently [with different software] and that can determine the up-front cost savings," Heagney says. After all, if everything worked together perfectly, there wouldn't be any competition.

EVALUATE

Read the brochures, listen to the sales pitches—but nothing beats taking the product for a spin. "The best way to evaluate is to have the vendors set up test systems and run similar virtual machines on the competing products," says Tim Smith, director of humanities information systems at The Ohio State University. Testing VMware's ESX Server helped Smith get comfortable with—among other things—the software's physical-to-virtual (P2V) provisioning capability, which converts an existing physical server into a new virtual machine.
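
Under the hood, a P2V migration starts with something conceptually simple: a block-for-block copy of the physical server's disk into an image the hypervisor can attach, followed by driver and boot fix-ups. The sketch below shows only that raw copy step and is not VMware's converter; the device path and image name are hypothetical, and a real tool also quiesces the source system, injects virtual hardware drivers and repairs the boot configuration.

# Minimal sketch of the raw-copy step behind physical-to-virtual (P2V).
# Illustrative only; this is not VMware's P2V implementation.
CHUNK = 4 * 1024 * 1024  # copy 4 MB at a time


def image_disk(device: str, image_path: str) -> int:
    """Stream every block of `device` into `image_path`; return bytes copied."""
    copied = 0
    with open(device, "rb") as src, open(image_path, "wb") as dst:
        while True:
            block = src.read(CHUNK)
            if not block:
                break
            dst.write(block)
            copied += len(block)
    return copied


if __name__ == "__main__":
    # Hypothetical source disk and target flat image file.
    total = image_disk("/dev/sda", "/vmstore/legacy-box-flat.img")
    print(f"Copied {total / 1e9:.1f} GB")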