Utility Computing
By Baselinemag | Posted 2002-12-03
Primer: Overcapacity is out, efficiency is in. This is good news for those who advocate utility computing, where users pay only for the bandwidth and applications they actually use.
Paying for computing the way you pay for heat and hot water. A company would be billed only for the processing power, network bandwidth and software applications it actually used.
It depends on the resource. Some technologies such as supercomputing or storage are ideal for a shared environment at a hosting center. Others, like software applications, require a piece of software that tracks usage. Companies reluctant to share hardware can choose plans that install reserve storage capacity, say, in anticipation of a spike in demand. The company doesn't yet pay for the reserve. Instead, when the extra storage is needed, a manager activates those servers. Another model lets enterprises create their own pool of resources to create a single virtual server.
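The metering and reserve-activation models described above can be sketched in a few lines of code. This is a hedged illustration, not any vendor's actual system: the rates, resource names, and `UsageMeter` class are all hypothetical.

```python
# Hypothetical sketch of usage-based billing: the customer is charged
# only for resources actually consumed, and pre-installed reserve
# capacity costs nothing until a manager activates it.
RATES = {"cpu_hours": 0.50, "gb_stored": 0.10, "gb_transferred": 0.05}

class UsageMeter:
    """Tracks consumption so the bill reflects only actual usage."""
    def __init__(self):
        self.usage = {resource: 0.0 for resource in RATES}
        self.reserve_active = False  # reserve hardware sits unbilled

    def record(self, resource, amount):
        self.usage[resource] += amount

    def activate_reserve(self, extra_gb):
        # During a demand spike, a manager switches on the reserve
        # storage; only then does it start counting toward the bill.
        self.reserve_active = True
        self.record("gb_stored", extra_gb)

    def invoice(self):
        return round(sum(RATES[r] * used for r, used in self.usage.items()), 2)

meter = UsageMeter()
meter.record("cpu_hours", 100)      # 100 * 0.50 = 50.00
meter.record("gb_transferred", 40)  #  40 * 0.05 =  2.00
meter.activate_reserve(200)         # 200 * 0.10 = 20.00
print(meter.invoice())              # 72.0
```

The point of the sketch is the billing boundary: the reserve servers exist on site the whole time, but they enter the invoice only after activation.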
Yes, but on a larger scale. A company using this approach could just as well run its applications on a utility computing vendor's site, as it might in the ASP model. But utility-based computing's intentions are far broader: Companies are able to take advantage of shared infrastructure resources, from storage to databases to Web servers, as opposed to simply outsourcing individual applications.
By running outsourced technology departments, on shared or virtual servers, for example, more efficiently than customer companies could on their own. The utility vendors can still charge a premium, which the market will bear, since overall costs will be lower than if the client companies were maintaining their own traditional computing infrastructure.
The usual gang of industry heavyweights. Sun has offered its Capacity on Demand services since the summer of 2000. Hewlett-Packard has been banging the drum for utility computing for several years, finally centralizing its server and storage resources in a dedicated system called the Utility Data Center, introduced in the fall of 2001.
American Express has inked a seven-year, $4 billion computing contract with IBM, in which IBM Global Services will take over much of AmEx's technology infrastructure, staffing and duties, including Web hosting, help-desk management and data storage. Convenience-store chain 7-Eleven has a similar seven-year, $175 million utility-based contract with EDS for hosting, integration and intelligent storage services for its corporate data. And HP has taken over management of seven operations centers and 4,000 Windows servers in a usage-based arrangement for cell-phone maker Nokia.
Expect it to dovetail well with developments in grid computing and Web services. Over time, storage, databases and applications will increasingly be made available for customers to access on demand over networks that appear as one large virtual computing system, an approach that's become known as grid computing. Utility computing provides the needed charge-back function to support this business model.
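That charge-back function amounts to rolling up metered usage of shared grid resources into per-customer bills. A minimal sketch, with purely illustrative rates, resource names and customer names:

```python
# Hypothetical charge-back sketch: usage records from shared grid
# resources (storage, databases, Web servers) are aggregated into
# one bill per customer. All rates and names are invented.
from collections import defaultdict

RATES = {"storage_gb": 0.10, "db_queries": 0.001, "web_requests": 0.0001}

def charge_back(records):
    """Aggregate (customer, resource, amount) records into per-customer bills."""
    bills = defaultdict(float)
    for customer, resource, amount in records:
        bills[customer] += RATES[resource] * amount
    return {customer: round(total, 2) for customer, total in bills.items()}

records = [
    ("acme", "storage_gb", 500),         # 500 * 0.10   = 50.00
    ("acme", "db_queries", 20000),       # 20000 * 0.001 = 20.00
    ("globex", "web_requests", 100000),  # 100000 * 0.0001 = 10.00
]
print(charge_back(records))  # {'acme': 70.0, 'globex': 10.0}
```

Each tenant sees only its own line items, even though all the records come off the same shared infrastructure, which is exactly what lets a grid operator bill like a utility.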
Like Web services, which aims to let companies interconnect their software systems ever more quickly and cheaply, utility computing is ultimately about how companies can make better use of all their computing resources. By delivering broader access to network resources, utility computing extends an open computing infrastructure to companies with limited or strained technology budgets.