The Growing Appetite for Virtualization
The Tasty Baking Co. is undergoing a major IT overhaul. In the process, innovations in server virtualization and storage consolidation are saving the company considerable dough in energy costs and on many other fronts.
With $250 million in annual revenue and 4.8 million baked goods produced daily, Philadelphia-based Tasty Baking is constantly evaluating the cost, efficiency and effectiveness of its IT enterprise resources. Marketing teams need a strong network in place as they promote the company’s more than 100 products, which include the Tastykake line, a fixture in kids’ lunchboxes for decades. Warehouse staff needs the technology to track what’s coming in and what’s going out. And, with about 1,000 employees, the company’s HR, finance and other internal departments depend on their computers running well.
A year and a half ago, the company concluded that it could do better business with less hardware through server and storage virtualization and consolidation, which greatly reduces the number of physical devices needed. Using products from NetApp, Tasty Baking’s network operations now run on just 10 servers, down from the 40 required previously.
This has resulted in sharply lower energy bills. Infrastructure efficiency has increased, too: Tasty Baking is seeing memory utilization of up to 70 percent on each of those servers; previously, servers ran at as little as 10 percent of capacity. The company is also finding that its IT operations are better able to adapt to the speed of business today.
“At first, I thought, ‘OK, we’re reducing the amount of hardware in our buildings, and that’s good,’” says Brendan O’Malley, vice president and CIO. “But virtualization also has allowed us to operate in a much more automated environment. In the past, if we needed to change part of our IT architecture, it took forever.
“Now, people can tell IT what they need, and IT can do it much more quickly. For example, if we need to change our customer databases, it can be done in a couple of days, while it used to take six to eight weeks.”
Better Performance, Less Energy
More companies are discovering similar results. Essentially, virtualization allows the workloads of several physical servers to be consolidated as virtual machines running on a single box, resulting in improved performance, lower energy use and reduced cooling costs. Multiple single-purpose machines are replaced by one virtualized server (often a blade server).
More than a dozen servers can be consolidated into one, reducing the amount of space needed to run networks and cutting energy expenses. With virtualization, the location of the server is inconsequential because the technology allows a now-scaled-back primary data center to service many network operations in numerous regions.
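The arithmetic behind consolidation figures like these can be sketched in a few lines. The function and numbers below are illustrative assumptions, not vendor data; real capacity planning also reserves headroom for failover, peaks and growth.

```python
import math

def hosts_needed(physical_servers, avg_utilization, target_utilization):
    """Estimate how many consolidated hosts can absorb a fleet of
    underused servers, assuming comparable hardware and that load
    aggregates roughly linearly across virtual machines."""
    total_load = physical_servers * avg_utilization
    return math.ceil(total_load / target_utilization)

# 40 servers idling at ~10% capacity, repacked onto hosts driven
# to ~70% -- roughly the before/after utilization cited above.
print(hosts_needed(40, 0.10, 0.70))  # → 6
```

In this simplified model, six hosts suffice; Tasty Baking landed on 10, which is consistent with leaving room for redundancy and demand spikes.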
Based on power consumption alone, many companies can justify the cost of virtualization and consolidation: U.S. data centers’ energy use more than doubled from 2000 to 2006, when it reached $4.5 billion, according to the U.S. Environmental Protection Agency. If current trends continue, the agency projects, that figure will hit $7.4 billion by 2011.
For every dollar spent on new server hardware in 2007, more than 50 cents was spent on powering and cooling costs, according to IDC, a Framingham, Mass.-based research firm.
“Many organizations now realize that as much as half of the electricity they purchase is going into the data center,” says Jim Barclay, CEO of Business Value Consulting Group, a Royalston, Mass.-based firm that evaluates the value of major IT purchases. “Not only is the cost of electricity growing, but it’s unpredictable. Business CFOs do not like unpredictable. They’d rather pay too much in the beginning and not face too many surprises along the way.”
One way to handle the electricity problem is with virtualization, which also provides speed and flexibility. “Virtualization allows you to bring new virtual servers online in a matter of days, not months,” Barclay explains. “Without it, a company with a great product line and an idea can miss a desired market window because its IT can’t respond in time. With virtualization, it can meet the early portion of that window.”
Server consolidation customers are saving from 25 percent to 45 percent in utility costs alone, according to a 2008 survey of 800 global corporations conducted jointly by research firm Information Technology Intelligence and security software provider Sunbelt Software. And virtualization/consolidation continues to build momentum. Blade servers account for just over 10 percent of servers sold today, IDC reports, but will account for more than one-quarter of all server shipments by 2011.
“Besides being good for the environment, virtualization is a darn good business strategy,” says Mark Lutchen, a principal in PricewaterhouseCoopers’ advisory practice and author of Managing IT as a Business.
In a recent survey of senior executives in five key regions, PricewaterhouseCoopers found that energy costs are paramount. In fact, 60 percent of the respondents said potential energy savings represent the most important factor in their companies’ environmental decision making.
Mergers Lead to IT Consolidation
Over the last decade, corporate consolidation has created the need for server consolidation. Mergers and acquisitions have resulted in decentralized technology scattered throughout the world.
“Now, for a variety of reasons, companies are compelled to standardize and bring these disparate technologies together,” Lutchen says. “Why buy a server every time you implement a new system? You’ll have 600 servers, and many of them will be used at just 30 percent capacity.”
The merger of telecom companies Alcatel and Lucent in late 2006 is a prime example. Even before the merger, the two companies planned data center consolidations. Now operating as Alcatel-Lucent, the company has its consolidation plan well under way. It is using Hewlett-Packard-supplied consultancy services and HP power-saving servers, resulting in the consolidation of 25 data centers and 125 server rooms into six data centers and a very small but still-to-be-determined number of server rooms.
Another HP customer, Mitel Networks, is saving $300,000 a year by reducing its servers from 12 to two and its processors from 24 to 12. And Pfizer expects substantial, though not yet quantified, savings from cutting data center space by 40 percent and consolidating 220 servers into a dozen.
Alcatel-Lucent is anticipating similar success. Through virtualization, consolidation and the phasing out of legacy applications, the company will cut its server count from 10,000 to 7,000. There have been challenges: Some employees in various departments aren’t accustomed to having their designated servers outside their building—much less possibly in another country—and, culturally, there is always resistance to change. Despite these issues, the effort is proving worthwhile. For starters, the company expects server utilization to increase to between 60 percent and 70 percent, up from between 10 percent and 30 percent.
“We want to push virtualization as wide and deep as we can,” says Cliff Tozier, vice president of infrastructure for Alcatel-Lucent, based in Paris. “We can no longer afford for every application to have its own environment. Virtualization has allowed us to dynamically increase capacity in days or hours. In the past, it would have been weeks or months.
“We also are seeking to be more flexible in meeting business demands because of the tremendous growth we’re seeing in video and messaging across the enterprise, and virtualization has allowed us to do this. We can [provide] services on-demand so developers can get quicker access to computing power for projects.”
Meeting the Challenge
Despite the many benefits of virtualization and consolidation, there are inherent challenges, including added costs for training and consulting. To be sure, there is a learning curve in adopting these tools.
“You need trained staff that understands the technologies and can manage the transition from today’s data center to the virtualized one,” says Justin Perreault, general partner with Commonwealth Capital Ventures, a Waltham, Mass.-based venture capital firm. “These skills are still in scarce supply. It’s the old problem of trying to replace the engine while you’re driving down the road at 60 miles an hour.”
Richmond, Va.-based Genworth Financial, a Fortune 500 global financial planning company, recently decided the time for that transition had come. It was running out of data center space because of server sprawl, and powering and cooling that space had become too expensive.
Working with virtualization/consolidation tools from companies such as VMware, HP and Dell, Genworth is now seeing a dramatic increase in productivity: It has reduced its average underwriting cycle time—from when a client application is received to when a policy is issued—from 45 days to 10 days. An added benefit is that the switch hasn’t negatively affected network operations or security, and, in some ways, it has benefited those areas.
“With respect to security, the patching, hardening, password and anti-virus procedures are the same for virtual servers as they are for physical servers,” says Michael McGarry, Genworth CTO. “Virtualization allows us to build out the environment quicker and enables us to put more servers in smaller spaces than we ever thought possible. It eliminates tape backups for disaster recovery, and individual backups are provided in a much smaller footprint.”
Virtualizing the Endpoint
Consider this scenario: A manufacturing executive is caught up in a huge inventory/accounting project that must be finished by the next morning. But 5 p.m. is approaching, and the executive needs to get home to take over parental duties while the exec’s spouse heads out for a PTA meeting.
Fortunately, thanks to continued developments in endpoint virtualization technologies, it’s easier than ever for the executive to access the needed Excel file and other work-based applications from a laptop at home in order to complete the job on time.
Seeing the value in this and other benefits of endpoint virtualization—which essentially allows applications to be added and integrated across an enterprise with far fewer system glitches than before—more IT decision makers are buying solutions that encompass this technology.
In fact, more than three out of four of the approximately 300 IT administrators who responded to a recent survey said their organizations had already launched some form of endpoint virtualization, according to Symantec, which commissioned the study.
“That response surprises us,” says Brad Rowland, director of enterprise marketing for Symantec’s endpoint virtualization line. “This technology is just starting to be included in larger solutions. Before, this was made available as a point-by-point solution that had to be managed separately. Now, it’s part of a larger platform, and that’s leading to broader acceptance.”
Here are some additional findings:
• 24 percent of participants said the simplification of operating system and application delivery is the most appealing aspect of endpoint virtualization, while 20 percent said lower IT costs offered the greatest appeal.
• 31 percent reported that their organizations spend at least 21 percent of IT resources on managing incompatibilities between applications on endpoint devices.
• 36 percent said that at least a quarter of their entire 2009 IT budget is earmarked for endpoint virtualization initiatives.
A Virtualization Covenant
At Covenant House Toronto, about 4,000 young people a year can stay for weeks or months at a time. Many of them have left their homes—or moved to Canada from other countries—without a means of support. Here, they can get meals, medical attention, counseling and even vocational training.
Like any organization with 200 employees, Covenant House has concerns about keeping its data protected and accessible 24/7. For example, fundraising accounts for 80 percent of its budget, and data related to that effort is maintained on the network. That’s why it recently completed a major storage virtualization project using DataCore Software’s SANmelody as implemented by DataCore partner Interware Systems.
As a result, Covenant House has more than doubled its available storage capacity, while reducing the number of physical servers from 10 to three using VMware. At the same time, the organization has increased its ability to safeguard its data from disruptions and disasters, and it has reduced the amount of power needed to keep enterprise operations running.
“This assures that we reach our goal of 99.99 percent availability for our users 24/7,” says Wendy Craig, who oversees information systems. “It allows us to manage our infrastructure much more easily, too. Before, it could be difficult to oversee a bunch of physical servers with storage spread all around, especially when the servers break down. Now, because our server and storage management is centralized in one location, we can deal with all problems from the IT manager’s desk, without needing to take down the entire system.”
After deciding on a virtualization solution, Covenant House centralized data at a downtown Toronto location, replicated a copy of that data and stored it at a co-location facility a few kilometers away. Data related to everything from the latest fundraising figures to day-to-day e-mail exchanges is replicated every night at the co-location facility. Moreover, DataCore optimized the organization’s existing storage assets by addressing the problem of having some application servers over-allocated with storage, while others were underserved.
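A file-level stand-in for that nightly mirror might look like the following. To be clear, SANmelody replicates at the storage layer; this rsync wrapper is only an illustrative sketch, and the hosts and paths shown are invented, not Covenant House's.

```python
import subprocess

def rsync_command(source, target):
    """Build the mirror command: -a preserves permissions and times,
    -z compresses over the wire, and --delete propagates removals so
    the replica exactly matches the source."""
    return ["rsync", "-az", "--delete", source, target]

def replicate(source="/srv/data/",
              target="backup@colo.example.org:/srv/replica/"):
    """Run one mirror pass to the co-location host; returns True on
    success. The default source and target are hypothetical."""
    return subprocess.run(rsync_command(source, target)).returncode == 0
```

Scheduled from cron to run once a night, a job along these lines keeps the off-site copy no more than a day behind, matching the nightly replication window described above.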
Next up: a desktop virtualization project using virtualized server technology that will allow traditional desktops to be replaced with thin-client hardware, saving a projected 97 percent in energy bills.
“It’s all part of our green computing perspective,” Craig says. “We can be just as productive and save a lot of money with this technology.”