by Dan Neel

The utility computing promise

news
Apr 12, 2002 | 7 mins

Tapping into computing resources on an as-needed basis has plenty of benefits for enterprises -- but success depends on maturity of foundation technologies

Tapping into compute resources with a simplicity equal to plugging a lamp into an outlet has been a goal of pervasive computing efforts from the start. Known as utility computing, the idea is to provide unlimited computing power and storage capacity that can be used and reallocated for any application — and billed on a pay-per-use basis.

Already present in a variety of capacity-based pricing models, utility computing is poised to expand throughout the enterprise as various key technologies — such as Web services, grid computing, and provisioning — intersect. Growth of utility computing in the enterprise will deliver to the industry not only equal access to supercomputing resources, but also new revenue streams for commercial data centers, new application pricing models based on metered use, and an open computing infrastructure for companies with little or no standing IT maintenance budget.

“Companies can begin to look at computing as a utility right now by virtue of the fact that utilitylike services are already available to them, including capacity-based pricing models for overhead infrastructure that is placed onsite,” explains Bill Martorelli, vice president of enterprise services at Hurwitz Group in Framingham, Mass.

Martorelli points to the wide range of utilitylike service offerings from companies such as IBM, Hewlett-Packard, Sun, and Compaq, in which additional servers, storage devices, and printers are placed on site with customers. The customer is charged for the gear only when it is turned on and used.

As something of a first step toward the vision of utility computing, IBM recently penned a $4 billion deal with American Express to provide all of the financial service company’s technology infrastructure as a utility, managed and maintained by IBM. American Express expects to save hundreds of millions of dollars during the life of the seven-year contract.

Dev Mukherjee, vice president of strategy for e-business on demand at IBM in Armonk, N.Y., says that for American Express, Big Blue’s compute utility service is “less a technology thought than a liberating thought,” and that the American Express deal is only the beginning of an industrywide trend.

“I would describe utility computing as a wholesale shift in the computing industry,” Mukherjee says.

“We are moving to where we will have a gridlike compute environment where all these capabilities — whether they are infrastructure capabilities like storage or databases or special equipment that allows you to do vector processing — will all be available in the grid, and anything a customer needs they will be able to get on demand across the grid,” Mukherjee adds.

Ingredients in the mix

Utility computing on a global scale will require the continued evolution and convergence of core technologies spanning Web services, grid computing, broadband, storage virtualization, automatic provisioning, change management, and security.

Martorelli says early utility computing efforts, such as the American Express/IBM relationship, will begin as in-house projects and follow the same pattern of emergence as the core technologies.

Jeff Gilliam, West Coast region president of Electronic Data Systems (EDS) in San Ramon, Calif., says he is beginning to see increased demand for utilitylike computing technologies inside companies struggling to cut IT costs and consolidate resources.

“A lot of companies are looking at utility computing models, even if they are just looking for a more accurate charge-back mechanism for billing their own departments for the compute power they are using,” Gilliam says. EDS recently licensed MicroMeasure utility computing billing software from Mountain View, Calif.-based utility computing technology company Ejasent. With MicroMeasure, EDS customers can begin to shape their business processes into a more utilitylike model while getting better control of distributed resources within their own network.
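The charge-back idea Gilliam describes can be reduced to a simple meter: record each department's consumption, then bill it at a metered rate. The sketch below is purely illustrative (MicroMeasure's actual interfaces are not described in the article; the class, rate, and department names are hypothetical):

```python
from dataclasses import dataclass, field
from collections import defaultdict

@dataclass
class UsageMeter:
    """Illustrative charge-back meter: tracks CPU-hours per department
    and bills them at a flat metered rate. Hypothetical sketch, not
    MicroMeasure's API."""
    rate_per_cpu_hour: float = 0.50
    usage: dict = field(default_factory=lambda: defaultdict(float))

    def record(self, department: str, cpu_hours: float) -> None:
        # Accumulate consumption as jobs complete.
        self.usage[department] += cpu_hours

    def invoice(self, department: str) -> float:
        # Pay-per-use: the bill is strictly proportional to what was used.
        return round(self.usage[department] * self.rate_per_cpu_hour, 2)

meter = UsageMeter()
meter.record("marketing", 120.0)
meter.record("engineering", 800.0)
meter.record("marketing", 30.0)
print(meter.invoice("marketing"))    # 150 CPU-hours * $0.50 = 75.0
print(meter.invoice("engineering"))  # 800 CPU-hours * $0.50 = 400.0
```

Even this toy version shows why such a meter matters internally: departments that were previously billed a flat overhead allocation can instead see costs that track actual consumption.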

Of the emerging technologies that will support extended utility computing between companies, Web services will likely play the largest role, says Yogen Patel, vice president of marketing and product management at Ejasent.

“I think Web services will be the primary driver of utility computing,” Patel says. “When you think of Web services, it is going to be an electronic cloud where there will be machine-to-machine interaction without human intervention. And if you think about the computing fabric that’s required to support the massive amount of processing that is going to be needed to enable Web services, it’s going to be a utility model.”
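The "machine-to-machine interaction without human intervention" Patel describes is the core of the Web-services model: one program calls a remote procedure on another over HTTP. A minimal sketch, using XML-RPC from Python's standard library as a stand-in for the SOAP-style services of the era (the endpoint, port, and `quote_price` method are all hypothetical):

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Provider side: a compute utility exposes a pricing method as a service.
server = SimpleXMLRPCServer(("localhost", 8099), logRequests=False)
server.register_function(lambda cpu_hours: cpu_hours * 0.5, "quote_price")
threading.Thread(target=server.serve_forever, daemon=True).start()

# Consumer side: another machine invokes it over HTTP -- no human involved.
client = ServerProxy("http://localhost:8099")
price = client.quote_price(10)
print(price)  # 5.0
```

The point of the sketch is the shape of the interaction, not the protocol: whether SOAP, XML-RPC, or something later, a utility fabric needs services that other machines can discover and invoke programmatically.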

Hewlett-Packard has been advancing its UDC (Utility Data Center) initiative for several years, and is currently focused on auto-provisioning technology that will enable utility computing, according to Nick Van Der Zweep, director of marketing at HP’s Always On infrastructure division, headquartered in Palo Alto, Calif.

Compaq is also working on dynamic reallocation of resources as part of its Compute On Demand Initiative introduced last July.

“We’re working on what we call ‘dynamic reallocation of datacenters,’ which is the ability to move applications and operating systems between the individual hardware boxes,” says Joe Hogan, worldwide managing principal for Compaq’s global services and outsourcing, based in Houston.

This type of work is vital to ensuring that servers and storage devices are thoroughly scrubbed of one customer's data before hosting another company's job.

“You have to make sure your data is secure and is not getting into the hands of the next customer that uses the infrastructure,” HP’s Van Der Zweep explains. “It’s one thing to provision something and turn it on, but when you deprovision it you have to clean it all up, make sure there are no security loopholes left behind.”
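The cleanup step Van Der Zweep describes can be illustrated with a single operation: before a storage volume is handed to the next tenant, overwrite it so the previous customer's data is unrecoverable. The sketch below models the volume as a file; a real platform would also revoke credentials and network access, and the function name is hypothetical:

```python
import os

def deprovision(volume_path: str, passes: int = 1) -> None:
    """Illustrative deprovisioning step: overwrite a storage volume
    with zeros before reassigning it, so the next tenant cannot read
    the previous customer's data. Sketch only; covers just the wipe."""
    size = os.path.getsize(volume_path)
    zeros = b"\x00" * 4096
    for _ in range(passes):
        with open(volume_path, "r+b") as f:
            written = 0
            while written < size:
                n = min(4096, size - written)
                f.write(zeros[:n])
                written += n
            f.flush()
            os.fsync(f.fileno())  # force the overwrite to disk

# Example: a file standing in for a provisioned volume.
with open("tenant_volume.img", "wb") as f:
    f.write(b"customer-secret-data" * 100)
deprovision("tenant_volume.img")
with open("tenant_volume.img", "rb") as f:
    assert f.read() == b"\x00" * 2000
print("volume scrubbed")
```

This is the "deprovision" half of the provision/deprovision cycle: turning capacity on is easy, but returning it to the pool safely is what makes sharing the infrastructure between customers tenable.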

Efforts to enable automatic provisioning will be shored up as software companies such as Tivoli and Computer Associates begin creating application topologies for their products that can be updated and changed in real time. Progress being made by companies such as Avaki and Platform Computing to forge grid computing standards will also extend the reach of utility computing fabrics that drive an economy of Web services, Martorelli says.

Planned utilitarianism

As for possible bottlenecks in a global compute utility, officials at Palo Alto, Calif.-based Sun see broadband technology as less of a worry than secure provisioning or advancement of Web services.

“Datacenters that are linked across the world from one another are almost becoming a nonissue in terms of cost compared to other components of an infrastructure,” says Chris Kruell, group marketing manager of enterprise systems products at Sun.

Although utility computing on a nationwide or global level is still many years out, companies can begin laying the groundwork in their networks now to prepare their organizations to integrate more easily into a utility computing environment.

A recent study on utility computing by Forrester Research in Cambridge, Mass., suggests IT buyers begin looking at networkable servers and storage devices that will easily attach to future utility computing fabrics. For example, the report states that direct attached storage arrays “are cheaper than networked storage today, but will be hard to attach to the fabric years from now.”

For now, time seems to be on the side of IT. Some analysts estimate global utility computing is as much as 10 years away. Although component technologies such as Web services and automated provisioning will mature far more quickly, experts believe the real hurdle will be cross-vendor cooperation.

“It’s safe to say we are still at the early adopter stage of utility computing,” says Compaq’s Hogan. “But the promise is here already in the ability to dial up and down compute resources, and in the current economy that’s important.”