
Virtualization: Beyond the Buzz

By Michael Vizard  |  Posted 2007-03-12

It's the hottest topic in enterprise computing, but the real action lies in emerging technologies.

Virtualization has become the hottest buzzword of the moment in enterprise computing. After all, just about everybody is interested in saving money on hardware by increasing their server and storage utilization rates. In fact, in a survey published last month in CIO Insight, a sister publication to Baseline, the technology area that respondents said would get the highest percentage increase in spending in 2007 was virtualization of servers and storage.

But the interesting thing about the trend toward virtualization is that it is rapidly expanding beyond the technologies we currently associate with it: virtual operating systems, which allow us to run multiple application stacks on the same processor, and the storage tools that allow us to partition a storage array to support multiple applications. Instead, people are beginning to talk about virtualization with a capital V as a larger trend, one that now includes concepts such as file clustering software for servers and thin provisioning software for storage.

In the case of file clustering software, the need for such products generally arises from the fact that virtual machine software from companies such as VMware, a unit of EMC, doesn't do a particularly good job of handling the I/O requirements of database applications. So, while VMware or something akin to it is pretty handy for consolidating file servers, it does little to contain the phenomenon known as SQL sprawl, which results in SQL database implementations being deployed on dedicated servers.

For example, Woodforest, a regional bank in Houston, has been experiencing rapid growth thanks to a deal with Wal-Mart in which Woodforest provides a lot of the banking services available in Wal-Mart stores. In order to consolidate servers and maximize availability while maintaining flexibility, Woodforest relies on VMware to consolidate file servers but uses file clustering software from PolyServe to consolidate database servers.

According to Lynn McKee, Woodforest's vice president of database administration, and Richard Ferrara, senior vice president for technology and infrastructure, this dual approach gives Woodforest the flexibility it needs to support even demanding situations, such as the wave of debit card transactions made during the Christmas shopping season.

The virtualization trend doesn't stop there. Companies such as MySpace have embraced thin provisioning of storage assets as the next logical layer of storage management above virtualized storage. MySpace relies on thin provisioning products from 3PARdata; however, the thin provisioning concept, in which data is dynamically stored across a pool of virtual storage, was pioneered by DataCore Software.
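To make the thin provisioning concept concrete, here is a minimal, hypothetical sketch (not based on any vendor's actual product): a pool presents volumes whose combined logical size can exceed the physical capacity behind them, and physical blocks are consumed only when data is actually written.

```python
# Illustrative sketch of thin provisioning (hypothetical, simplified).
# Volumes are "thin": they reserve no physical space up front; a block
# consumes pool capacity only the first time it is written.

class ThinPool:
    def __init__(self, physical_blocks):
        self.physical_blocks = physical_blocks  # real capacity of the pool
        self.used_blocks = 0                    # physically consumed so far
        self.volumes = {}                       # name -> {"size", "written"}

    def create_volume(self, name, logical_blocks):
        # Logical sizes may be over-committed: their total can exceed
        # the pool's physical capacity.
        self.volumes[name] = {"size": logical_blocks, "written": set()}

    def write(self, name, block):
        vol = self.volumes[name]
        if block >= vol["size"]:
            raise IndexError("write past end of logical volume")
        if block not in vol["written"]:
            if self.used_blocks >= self.physical_blocks:
                raise RuntimeError("physical pool exhausted")
            vol["written"].add(block)
            self.used_blocks += 1

pool = ThinPool(physical_blocks=100)
pool.create_volume("db1", logical_blocks=80)  # 160 logical blocks total,
pool.create_volume("db2", logical_blocks=80)  # backed by only 100 physical
pool.write("db1", 0)
pool.write("db2", 5)
print(pool.used_blocks)  # only the written blocks consume capacity
```

The point of the sketch is the over-commit: both volumes look fully sized to their applications, but the pool pays for capacity only as data arrives, which is exactly why utilization rates rise.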

While thin provisioning has been a little slow to catch on in the enterprise, the fact that large-scale Internet sites such as MySpace have embraced the technology bodes well for adoption among enterprise customers. For instance, DataCore, which has a software-only approach to thin provisioning that company chairman and CTO Ziya Aral likes to refer to as "Virtualization 2.0," has already tailored the price points of its product suite to make its offerings more appealing to corporate customers.

Clearly, we're well down the path toward expanding the concept of virtualization beyond products such as virtual operating systems. What's significant is how a whole class of products is about to combine to create the foundation for a model of enterprise computing that will let us manage I.T. assets from a much higher level of abstraction. It's unclear how that higher level will eventually be managed, but what's almost certain is that fewer, more powerful systems will do the work of hundreds of others, and the number of people we need to manage those systems should be substantially reduced. Furthermore, for the first time in industry history, we will be able to separate the deployment of new applications from the purchase of hardware, because hardware will have evolved into a pool of resources that can be accessed on demand rather than a set of physical devices tethered to specific applications.

As is often the case with an over-hyped buzzword, none of this is going to magically happen overnight, but we're a lot closer to making it a reality than most people think. Within the next two years, we'll be looking not only at a fundamental change in the way we allocate hardware resources, but also at significant alterations to the fabric of enterprise computing that will change everything from the way we budget for hardware to the way we think about application deployment, security and disaster recovery. All of this can be significantly improved by a robust virtual computing environment.

Michael Vizard is editorial director at Ziff Davis Media's enterprise technology group. He can be reached at michael_vizard@ziffdavis.com.
