It's time to disabuse people of the notion that utility computing is stuff that comes out of a socket in the wall, just like electricity. This very common misconception is founded on a false analogy, and it's one that leads to some dangerously misleading conclusions.
The false analogy I'm talking about goes something like this. It starts innocuously enough with the observation that power points and network points are often found side by side, in the wall or beneath a hatch in the floor. From there, it's a short step to the erroneous inference that, just as electricity comes out of the power points, computing could come out of those network sockets. A few more steps of faulty logic then rapidly bring you to the conclusion that, since electricity comes from massive, centralized power-generation plants, so should computing.
It would suit certain computer vendors if that were how utility computing worked; they are the vendors who sell massive, centralized computer systems. It would suit them even better if utility computing really did obey such a straightforward, centralized distribution model, because that would give them a stranglehold over pricing and supply. But it's not going to be as easy as that.
The truth is that computing isn't at all analogous to electricity. It doesn't radiate out along spokes from a central hub, distributing its output to the periphery (nor, actually, in this post-industrial age, does electricity distribution need to be so centralized, but that's a separate argument). Computing can happen anywhere on the network. It already is distributed. The role of the network is to establish the communications links that allow it to be shared. The sockets in the wall are for two-way traffic.
Electricity provides a more accurate analogy if you look at the applications of electric power. Raw power is converted into heat, light or motion only at the point of delivery. Nobody (unless they live right next door) expects the utility company to supply their hot water from the power station. People prefer their own individual appliances, chosen and regulated to meet their specific requirements. The notion of delivering word processing (for example) to your desktop from a central computing utility is almost as absurd as the idea of illuminating your desk by having the power company shine a light down a wire from the power station.
On the other hand, there are economies of scale to be had from, for example, manufacturing popular appliance designs in their millions, and from standardizing their interfaces: adopting a single design for the jacks that interconnect audio devices, say, or producing audio recordings in a standard format that can be read by all audio players.
These proven techniques make the utility power grid successful because together they strike the right balance between connectivity, choice and standardization. That balance allows the centralization of those elements that benefit from economies of scale, namely power generation and appliance manufacturing. Just as importantly, it distributes those elements that benefit from individual customization, namely the deployment, configuration and operation of individual appliances.
The same is going to be true of utility computing, except that there is going to be much more scope for achieving economies of scale at multiple levels. Instead of thinking in terms of monolithic computing services, think of choosing among an almost limitless universe of service options. Connecting to the wall socket will open up access to a global market, in which every resource can find its most efficient level. For some resources (desktop productivity software, for example), mass distribution of retail packages will remain the most cost-effective model. For others (web content search is a really obvious example), a single, centralized resource will provide unbeatable economies of scale. Much more significantly, there will be innumerable examples where small, specialist shared resources will find a market: for example, online information providers who focus on emerging tech industry sectors (or so we at Loosely Coupled hope).
The utility element of utility computing, then, is the provision of the infrastructure that enables this resource-sharing. It's going to be more complex and sophisticated than the electric power or telecoms utility infrastructures, and it will have to evolve well beyond its current state of maturity. I think Nick van der Zweep, HP's director of virtualization and utility computing, made an important point earlier this month in InfoWorld's article "Getting down to grid computing" when he told Ed Scannell that, "Right now grids are just APIs, and the management systems available can't reach in to understand what is going on inside of them." Web services management products will play a significant role in monitoring and policing the grid infrastructure that comprises utility computing.
But be clear on one crucial point. Utility computing will never be about the provision of applications out of a wall socket. The utility providers will operate the infrastructure. But the applications will sit on top. Rather than being a component of the infrastructure, they will be delivered across it by independent providers.