Forget service-oriented. It's contracts that matter. Without contracts, on-demand is a limitless commitment. That's why vendors are so keen on the on-demand concept: it obliges customers to buy lots more computing power to service every conceivable demand. Prudent customers should adopt a more measured, contract-oriented approach to implementing on-demand SOAs.
Take a look at some of the recent tie-ups between services management and utility computing vendors, such as the partnership between MetiLinx and Digital Evolution, or HP's acquisition of Talking Blocks. The idea is to be able to automatically bring more horsepower on stream to pre-empt a service failure, as in this example provided by HP's Nora Denzel in a CNET News.com story:
"For example, the Talking Blocks and OpenView software can determine that a given number of servers will not be able to process stock transactions by the end of the day and feed that management information to the Utility Data Center software, which can automatically dedicate servers from a less time-sensitive application to the more pressing task, she explained."
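The kind of automated reallocation described here can be sketched in a few lines. To be clear, everything below is an illustrative assumption of how such logic might work, not the actual OpenView or Utility Data Center APIs: forecast whether a pool of servers will clear its backlog by the deadline, and if not, borrow capacity from a less time-sensitive pool.

```python
import math

# Hypothetical sketch of deadline-driven server reallocation.
# Function and pool names are invented for illustration; they do not
# correspond to any HP product interface.

def forecast_shortfall(pending_jobs, rate_per_server, servers, hours_left):
    """How many extra servers are needed to finish the backlog in time."""
    capacity = rate_per_server * servers * hours_left
    if capacity >= pending_jobs:
        return 0
    deficit = pending_jobs - capacity
    return math.ceil(deficit / (rate_per_server * hours_left))

def reallocate(pools, critical, donor, rate, hours_left):
    """Move just enough servers from the donor pool to the critical one."""
    needed = forecast_shortfall(pools[critical]["pending"], rate,
                                pools[critical]["servers"], hours_left)
    moved = min(needed, pools[donor]["servers"])
    pools[donor]["servers"] -= moved
    pools[critical]["servers"] += moved
    return moved
```

Notice that nothing in this logic weighs the business cost of the borrowed capacity; it simply takes whatever is needed to hit the technical target.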
This is very much the traditional view of how service level agreements should work. IT provides the technology and an assurance that it will work 99.97% of the time, and business takes the resource for granted. But who's monitoring the cost of this arrangement, and whether it's justified in business terms? You can see why vendors are so happy to encourage it, because it institutionalizes the unquestioning purchase of additional reserves of computing power to meet an arbitrary technology performance benchmark. But there is another way.
Forget service level agreements. What business really needs are process level agreements: commitments to sustain certain business processes to agreed performance levels. A process level agreement is more than a simple statement of 'feeds and speeds', such as providing 'x' number of packets 'y'% of the time, as is typical of traditional service level agreements. Instead, it commits to ensuring the successful completion of a process in 'x'% of instances, and to a graceful recovery or fallback in the remaining instances.
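Monitoring against such a commitment could be quite simple. The following is a minimal sketch, assuming each process instance ends in one of three outcome states (the state names and report fields are my own illustrative choices, not any standard):

```python
# Minimal sketch of checking process instances against a process level
# agreement: 'x'% must complete successfully, and every non-completion
# must at least fall back gracefully. Unhandled failures breach the PLA.

COMPLETED, FELL_BACK, FAILED = "completed", "fell_back", "failed"

def pla_report(instances, committed_rate):
    """instances: list of outcome strings; committed_rate: e.g. 0.95."""
    total = len(instances)
    completed = instances.count(COMPLETED)
    fell_back = instances.count(FELL_BACK)
    failed = instances.count(FAILED)
    rate = completed / total if total else 1.0
    return {
        "completion_rate": rate,
        "graceful_fallbacks": fell_back,
        "unhandled_failures": failed,
        "meets_pla": rate >= committed_rate and failed == 0,
    }
```

The point of the sketch is what it measures: completed business processes and graceful fallbacks, not packets delivered or server uptime.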
As outlined in last week's Loosely Coupled article, Measuring services in a business context, this level of sophistication is just beginning to be explored by a few early adopters of customer-facing web services. It's a great deal more complex than the old-style SLA approach of underwriting the maximum achievable feeds and speeds. But organizations that get it right will find it much more cost-effective and competitively potent than the old approach.
posted by Phil Wainewright 2:23 AM (GMT)
Wednesday, September 03, 2003
HP raises the ante
HP's purchase of Talking Blocks, announced today, is a shrewd move in the high-stakes poker game with IBM and CA to dominate the market for services management software.
Recall that, when OASIS met to discuss web services management in late July, Talking Blocks was on the other side of the table from HP, joining with IBM and CA to put forward an alternative view to HP's own web services management framework (see IBM, CA square up to HP on management). HP seems to have liked their presentation so much that, to borrow a phrase, it bought the company.
The move strengthens HP's credibility in the web services space in two very important ways (actually, make that three):
Buying a specialist vendor brings valuable hands-on experience of dealing with web services management issues into the company.
It both demonstrates a willingness and acquires a credible capability to accommodate the other points of view expressed at the WSDM meeting.
The all-cash deal shows a determination to put its money where its mouth is.
Now that HP has started the ball rolling, expect other larger vendors to start acquiring some of the smaller fry in the emerging web services management sector. But none of them will likely close as sweet a deal as HP has managed to pull off today.
posted by Phil Wainewright 12:02 PM (GMT)
Google is a victim of its own success. It's still better than anything that came before. But users are getting increasingly frustrated with their inability to find what they want using it. There are two reasons why this is happening:
Users have come to depend on Google so much, they expect it to deliver a perfect search result every time, even when that expectation is unreasonable.
More and more website owners understand the importance of optimizing their sites for Google, which increases the number of matches that are found for each search.
I've been exposed to the results of these two factors by a recent bout of home improvement work. Google has never been particularly good at tracking down local specialist suppliers of mass market items. Combine that with the effect of widespread investment in search engine optimization, and a search for "lighting"+"London"+"UK" or "bedroom furniture"+"London"+"UK" produces a results page that's dominated not by retailers' websites but by online directories.
But look down the right-hand side of the page and you'll see that most of the AdWords text ads are actually retailers. So the most accurate results are now delivered by AdWords, not by the search engine itself. How long will it be before optimization renders AdWords ineffective?
There's a network effect in operation here that doesn't bode well for Google's continued leadership of the search engine business. It's a simple matter of network economics. The more popular Google becomes, the more important it is for website owners to gain a high ranking. This is a numbers game that Google can't win. Google's ranking algorithms are a centralized system. It's inevitable that, sooner or later, it will reach a tipping point beyond which the autonomous independent agents trying to outwit it will always triumph, simply through sheer weight of numbers.
This is not a challenge that's unique to Google, of course. Each development of the network is like peeling an onion. There's always another layer of decentralization and automation to go through. Yahoo! succumbed because its human operators couldn't categorize the Web fast enough. Now Google is succumbing because its human developers can't devise algorithms to outwit the smarter nodes fast enough. AdWords has been a great holding operation, because it introduced a completely new, complementary system alongside the first one, which has bought Google some extra time. But eventually the effectiveness of AdWords will degrade too. In the meantime, its role as Google's main revenue generator has the potential to divert resources into shoring it up rather than evolving a third-generation solution.
What shape might that third-generation solution take? I'm not sure that it's possible to automate the development of algorithms, because you need some way of machine-reading what people are looking for. Maybe we need to move to a new semantic layer of search terms, whereby instead of searching, for example, for "web services", you can search for "web services" @SOA@, where the @...@ notation denotes the semantic context of the search term. I dunno, this is just a stab in the dark from someone who's not a search engine guru.
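Purely to show what that speculative notation would mean mechanically, here's a sketch of splitting such a query into its search terms and its semantic context. The syntax is hypothetical, and so is the parser:

```python
import re

def parse_semantic_query(query):
    """Split a query like '"web services" @SOA@' into terms and contexts.

    The @...@ notation is a speculative idea, not a real search syntax;
    this just illustrates separating context tags from search terms.
    """
    contexts = re.findall(r"@([^@]+)@", query)
    rest = re.sub(r"@[^@]+@", " ", query)
    # Keep quoted phrases together as single terms, stripping the quotes.
    terms = [t.strip('"') for t in re.findall(r'"[^"]+"|\S+', rest)]
    return {"terms": terms, "contexts": contexts}
```

A search engine that understood the contexts could then rank within a semantic neighbourhood instead of across the whole web.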
The one thing I know for sure is that Google has peaked. It'll still do well at finding answers to questions for which there is only a single, authoritative source. But we're soon going to need something new as a means to search the web for everyday things that everyone needs.
posted by Phil Wainewright 8:58 AM (GMT)