

Loosely Coupled weblog

Friday, January 03, 2003

Standards and unorthodoxy
There is a paradox at the heart of web services:

  1. They work because they use established, universal standards

    and yet

  2. They produce the best results in unanticipated circumstances.

We expect standards to be about certainty, but web services standards build a foundation for constant innovation and change.

This tension between standards and unorthodoxy is likely to reach crisis point in 2003. The past year was notable for an almost unseemly rush to develop and propose web services specifications and standards. It is as though we all feel that, once the standards have finally been hammered out and agreed by everyone, we will know where we stand and will be able to relax at last.

That notion is illusory. As InfoWorld's Ephraim Schwartz has just pointed out in a column on this very topic, most standards organizations are merely "alliances used to increase a company's sphere of influence and/or to keep a rival group's influence in check." Vendors are eager to establish standards because they want to stay in control of web services, and their customers support those efforts because they, too, yearn for certainty in an uncertain world.

A different picture emerges if we look back at what really happens when significant new interoperability standards emerge. HTTP over the Internet brought the commercial Web into being. The addition of RSS to that mix turned weblogs into a powerful channel for amplifying discourse. 802.11b has created an unanticipated blossoming of WiFi hotspots and ad hoc networking. None of these results were predicted (or even expected) by the creators of those standards.

Reviewing the practical deployments of web services in 2002, we find little in the way of heavyweight enterprise deployments, mainly because enterprises still regard the available standards as immature. But there have been plenty of casual or serendipitous discoveries and experiments. One of the best examples was Jon Udell's experiment in joining up URLs from multiple sources based on ISBNs. He's just published a new account, The disruptive Web, in which he sums up the ingredients that he believes contributed to its success:

"Support HTTP GET-style URLs. Design them carefully, matching de facto standards where they exist. Keep the URLs short, so people can easily understand, modify, and trade them. Establish a blog reputation. Use the blog network to promote the service and enable users of the service to self-organize. It all adds up to a recipe for recombinant growth."

Tellingly, Jon makes the use of GET-style URLs the starting point of his advice. Even though 2002 ended with SOAP 1.2 entering the final step towards ratification as a W3C standard, the most striking practical innovation during the year has been achieved by those who've gone back to an earlier, simpler web services standard.
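The power of that first ingredient is easy to demonstrate. A minimal sketch, in Python, of composing short GET-style URLs keyed on a shared identifier such as an ISBN — note that the service hostnames and URL patterns below are purely illustrative assumptions, not the actual endpoints Jon used:

```python
def lookup_urls(isbn: str) -> dict:
    """Return GET-style lookup URLs for one ISBN across several
    (hypothetical) services. Because each URL is a plain HTTP GET,
    anyone can read it, modify it, or paste it into a weblog entry."""
    isbn = isbn.replace("-", "")  # normalize: strip hyphens
    return {
        "library": f"https://library.example.org/search?isbn={isbn}",
        "bookstore": f"https://books.example.com/isbn/{isbn}",
        "reviews": f"https://reviews.example.net/title?isbn={isbn}",
    }

# One identifier joins up several independent services:
for name, url in lookup_urls("0-596-00405-X").items():
    print(name, url)
```

Because the recipe is nothing more than string composition over a shared key, users can recombine such URLs without any coordination from the service providers — which is precisely the recombinant growth Jon describes.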

The paths trodden by enthusiasts in 2002 give us pointers to what early adopting enterprises are likely to encounter in 2003, and the results are not reassuring for those who look forward to an easier life once web services standards have been finalized. Even standards that are supposed to be universally agreed and ready for adoption are likely to come under renewed scrutiny, while the paradox of web services means that the simplest standards will produce the most powerful results when deployed to unorthodox ends.
posted by Phil Wainewright 2:56 PM (GMT) | comments | link



Copyright © 2002-2005, Procullux Media Ltd. All Rights Reserved.