Grid Computing: Too Good to Be True?

The benefits of grid computing sound so good, you wonder why everybody’s not using it.

Grid software makes collections of computers more efficient by letting them share CPU cycles, memory and other resources so closely that they behave almost like a single computer. Depending on whose numbers you believe, grids are 25% to 50% more efficient than standalone servers and 20% to 30% cheaper, because less of each server's capacity sits idle and administration is simpler.
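
To make the resource-sharing idea concrete, here is a minimal sketch in Python, using a local process pool as a stand-in for pooled grid machines; the function names and numbers are illustrative, not drawn from any grid product. A coordinator splits a job into independent work units and hands them to whichever workers have spare cycles, which is the pattern that keeps capacity from sitting idle.

```python
# A minimal sketch of the grid idea: a coordinator splits a job into
# independent work units and farms them out to idle workers. Real grid
# middleware adds scheduling, failover and resource accounting; here a
# local process pool stands in for the pooled machines.
from multiprocessing import Pool

def analyze(work_unit: int) -> int:
    """Stand-in for a CPU-heavy task, e.g. scanning one chunk of data."""
    return sum(i * i for i in range(work_unit))

if __name__ == "__main__":
    work_units = [100_000 + n for n in range(32)]  # 32 independent chunks
    with Pool() as grid:                           # one worker per spare CPU
        results = grid.map(analyze, work_units)    # cycles shared, not wasted
    print(f"processed {len(results)} work units")
```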

Grids have been popular in scientific applications, most notably the Search for Extraterrestrial Intelligence's SETI@home project. But they have never caught on in the mainstream corporate world.

True, New York investment bank J.P. Morgan has launched a project in which it is paying $4.5 million to consolidate seven risk-management applications onto a grid that will eventually include 2,000 blades in 50 servers. Morgan says in published reports that the project saved $1 million in 2003, may save $5 million this year, and could eventually cut the company’s overall computing bill by 20%.

But that result is rare. Most applications at large companies depend on big databases that must process transactions quickly rather than on hard-core analysis, which makes them a poor fit for the parallel workloads grids require, says Nick Gall, senior vice president and principal analyst at Meta Group.
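
A toy illustration of Gall's point, assuming nothing beyond the article: transactional work keeps touching the same shared state, so each update must wait its turn behind a lock, and spreading it across more nodes mostly adds contention. Compare that with the independent work units in the sketch above.

```python
# Why transaction processing resists grid-style splitting: every update
# touches shared state and must be serialized, so extra workers mostly
# wait on the lock instead of adding throughput.
from threading import Lock, Thread

balance = 0
balance_lock = Lock()

def post_transaction(amount: int) -> None:
    global balance
    with balance_lock:          # each update waits its turn on shared state
        balance += amount

threads = [Thread(target=post_transaction, args=(10,)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(balance)  # 1000: correct, but the lock forced the updates to run serially
```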

Still, the potential cost savings are attracting attention: an International Data Corp. study found that while only 5% of respondents have utility computing structures in place, another 11% plan to add them within two years.
