Primer: Virtual Servers

By David F. Carr  |  Posted 2006-01-14

Virtual servers can improve efficiency in your data center by replacing lots of small servers with fewer, bigger, easier-to-manage machines.

What is a virtual server? Virtual servers make it possible to place multiple applications on a single physical server, yet run each within its own operating system environment, known as a virtual machine. When one virtual machine crashes or is rebooted, the others continue operating without interruption.

Is it new? Early virtual machines appeared on IBM mainframes back in the 1960s. What's new is the realization that virtualization software for Intel-based servers can now support enterprise applications. Products such as VMware's ESX Server, originally used for development and testing, are finding increasing acceptance in data center environments.

What are other vendors doing? Microsoft Virtual Server 2005, while not yet widely deployed for enterprise-level production applications, represents a first step toward incorporating virtual server support into the Windows server architecture.

While there are hardware-based partitioning schemes in which the processors on a server are divided among virtual machines, software-based systems are attracting the most attention. Software virtual machines have the advantage of flexibility because the fraction of server resources they use is independent from the underlying hardware and can be easily reconfigured. Software-based approaches, though, have higher overhead than hardware approaches because of the layer of software added to supervise and coordinate the guest operating systems.
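The reconfigurability that gives software-based systems their edge can be illustrated with a toy model (the class, guest names, and fractions below are invented for illustration, not drawn from any vendor's product): the monitor keeps a table of per-guest resource fractions that can be changed at run time, something a fixed hardware partition cannot easily do.

```python
class SoftwareMonitor:
    """Toy model of a software virtual machine monitor: each guest
    receives a reconfigurable fraction of the server's resources."""

    def __init__(self):
        self.shares = {}  # guest name -> fraction of server resources

    def assign(self, guest, fraction):
        # Reject allocations that would oversubscribe the physical server.
        committed = sum(self.shares.values()) - self.shares.get(guest, 0.0)
        if committed + fraction > 1.0:
            raise ValueError("server oversubscribed")
        self.shares[guest] = fraction

monitor = SoftwareMonitor()
monitor.assign("email", 0.25)
monitor.assign("web", 0.25)
monitor.assign("email", 0.5)    # reconfigured on the fly, no hardware change
print(monitor.shares["email"])  # 0.5
```

The point of the sketch is that the allocation lives in software state, so growing one guest's share is a table update rather than a repartitioning of physical processors.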

What are the benefits? The common practice in data centers of assigning each application its own server avoids clashes between programs, such as requirements for different versions of operating system libraries. But it can also result in low utilization of each server. By running many applications on a single server but each within a virtual machine, a data center can achieve more efficient use of server resources. "If they have to reboot their transportation planning server, they don't want to also take down the human-resources server," says Jaime Gmach, president of Evolving Solutions, a Hamel, Minn., consulting firm.
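The utilization arithmetic behind consolidation is straightforward; the server counts and percentages below are hypothetical figures chosen for illustration, not numbers from the article.

```python
# Hypothetical scenario: ten one-application servers, each only 8% busy,
# are consolidated as virtual machines onto two larger servers.
standalone_count = 10
standalone_util = 0.08   # 8% average utilization per standalone server

consolidated_count = 2
# The same total workload now runs on two machines:
consolidated_util = standalone_count * standalone_util / consolidated_count

print(f"{consolidated_util:.0%} utilization per consolidated server")  # 40%
```

The same amount of work is done either way; consolidation simply concentrates it on fewer machines, each of which is busier and therefore better amortizes its cost.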

How does it work? The virtual server breaks dependencies between application, operating system and underlying hardware. An older application can be ported to a virtual Windows NT environment that simulates the 1990s hardware on which it was designed to run. "When you virtualize it onto a more modern machine that's faster, you're going to see huge performance gains," says Al Muller, a consultant with Callisma and co-author of Virtualization with VMware ESX Server. Virtualization makes sense for simple applications and network services, such as backup, e-mail, Web and domain name servers. Muller and Gmach say they would hesitate, however, to move a critical enterprise application onto a virtual server.

How do the offerings differ? VMware provides two options, GSX Server and ESX Server. The difference lies in the virtual machine monitor, the software that controls how much of the server's resources each virtual machine is assigned. GSX Server runs on top of a full copy of Windows or Linux; ESX Server runs a stripped-down proprietary operating system, known as a hypervisor, to minimize the overhead of running multiple operating systems.
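The distinction can be sketched as two software stacks (a simplified model for illustration, not vendor documentation): a hosted monitor such as GSX Server sits on a full host operating system, while a hypervisor such as ESX Server's replaces that layer entirely.

```python
# Simplified stacks, listed bottom to top; layer names are illustrative.
hosted_stack = [
    "hardware",
    "host OS (full Windows or Linux)",  # the extra layer a hosted monitor needs
    "virtual machine monitor",
    "guest OS",
    "application",
]

hypervisor_stack = [
    "hardware",
    "hypervisor (stripped-down OS)",    # ESX Server's approach
    "guest OS",
    "application",
]

# The hosted approach carries one more full operating system beneath
# every guest -- the overhead a hypervisor is designed to avoid.
print(len(hosted_stack) - len(hypervisor_stack))  # 1
```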

Like GSX Server, Microsoft Virtual Server 2005 is hosted on a full copy of the operating system, here Windows 2003. Another option is the Xen open-source project from England's University of Cambridge. Achieving a similar effect by different means is SWsoft's Virtuozzo, used by many Web hosting firms to parcel out resources to developers who in turn resell that capacity to customers.



 
 
 
 
David F. Carr is the Technology Editor for Baseline Magazine, a Ziff Davis publication focused on information technology and its management, with an emphasis on measurable, bottom-line results. He wrote two of Baseline's cover stories on the role of technology in disaster recovery, one on the response to the tsunami in Indonesia and another on the City of New Orleans after Hurricane Katrina. David has been the author or co-author of many Baseline Case Dissections on corporate technology successes and failures (such as the role of Kmart's inept supply chain implementation in its decline versus Wal-Mart, or the successful use of technology to create new market opportunities for office furniture maker Herman Miller). He has also written about the FAA's halting attempts to modernize air traffic control, and in 2003 he traveled to Sierra Leone and Liberia to report on the role of technology in United Nations peacekeeping. David joined Baseline prior to the launch of the magazine in 2001 and helped define popular elements of the magazine such as Gotcha!, which offers cautionary tales about technology pitfalls and how to avoid them.
 
 
 
 
 
 
