Good. I'm glad this question has come up. The web services model forces us to completely discard the notion we've got into our heads that an application is a packaged software artefact. It's not. An application is not a virtual entity, it's what you get done in the real world as a result of using software. You hope that using software means that you can do that real-world thing better, or faster. Remember this the next time you look at a bookshelf full of software packages that you never got round to installing. Don't think of them as applications. Recognize them for what they are: boxes of useless software.
Once we have separated the concept of 'application' from the concept of 'packaged software' we can see how web services will change our attitude to application development. Because if those two concepts are not synonymous, then it follows that you don't have to be a software developer to create new applications. You can just as easily be someone who pulls together a number of different web services components to fulfil an application need that you have in your day-to-day routine. In the web services world, application development and application integration shade into each other and both of them assemble components using loosely coupled techniques that make reconfiguration and reassembly accessible to non-developers.
In the short term, there is a downside to all this. Unleashing non-developers with the power to build their own applications will lead to some ugly results, just like the early days of desktop publishing gave rise to some awful page designs, and just like the early days of web publishing gave birth to some dire web sites. But those newly empowered communities gradually mastered the skills they needed, and that is exactly what will happen in the coming months and years of web services application publishing.
posted by Phil Wainewright 9:10 AM (GMT) | comments | link
Wednesday, April 17, 2002
Why the Google API is important
Let's cut to the chase here, courtesy of Dave Winer:
"Could Amazon implement a money-making SOAP interface? Without a doubt. Same for Yahoo and eBay."
IT strategy consultant Doug Kaye has published a draft of a methodology for determining When to Dive into Web Services. It provides a useful framework for considering this decision in terms of cost and value, but there's a significant flaw in applying the model to web services projects. It works when applied to a pre-defined, known project in the classic plan-build-deliver mould of traditional IT. But the extra flexibility that comes when you assemble component-based web services in a loosely coupled configuration introduces a third dimension that isn't shown on the chart. I'm not sure of the right words for this (I'm thinking on the hoof here), but the scale relates to business process needs (or opportunities), and the axes are labelled something like 'ease of introduction' and 'extent of business utility'. Take the Google API announcement I mentioned yesterday, for instance. Suppose half a day's programming will allow you to add a feature to your marketing department's intranet that shows them the most popular search terms used on Google to locate products in your market sector. The next day they've updated keywords on the website, and they can continue doing that on a daily basis to fine-tune its success rate in the rankings. You've reached ROI within days, and yet SOAP-based search engine APIs are still top left in Doug's diagram.
PS: Half a day is already an overestimate. A Google search this morning for 'Google API' reveals that PHP code to access the Google API is now in the public domain. According to Doug's framework, at this rate the O'Reilly book should be out by the end of the month (who knows, maybe it will?).
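To make the half-day scenario above concrete, here is a sketch of the kind of call involved: building a SOAP 1.1 request for a search API. The endpoint, namespace and `doGoogleSearch` parameter names below follow Google's beta API as documented at the time, but treat them as assumptions and check the WSDL in the developer kit before relying on them.

```python
# Sketch of a SOAP 1.1 envelope for a search-API call such as Google's
# beta doGoogleSearch. Endpoint, namespace and parameter names are
# assumptions drawn from the beta documentation of the period.
from xml.sax.saxutils import escape

GOOGLE_ENDPOINT = "http://api.google.com/search/beta2"  # assumed beta endpoint


def build_search_request(api_key: str, query: str,
                         start: int = 0, max_results: int = 10) -> str:
    """Return the SOAP envelope for a doGoogleSearch call."""
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope
    xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <SOAP-ENV:Body>
    <ns1:doGoogleSearch xmlns:ns1="urn:GoogleSearch">
      <key>{escape(api_key)}</key>
      <q>{escape(query)}</q>
      <start>{start}</start>
      <maxResults>{max_results}</maxResults>
      <filter>true</filter>
      <safeSearch>false</safeSearch>
      <ie>latin1</ie>
      <oe>latin1</oe>
    </ns1:doGoogleSearch>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""
```

Sending it is then a single HTTP POST of this envelope to the endpoint with a `Content-Type: text/xml` header and a `SOAPAction` header, which is why half a day is a generous estimate for the whole integration.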
posted by Phil Wainewright 12:46 AM (GMT) | comments | link
Tuesday, April 16, 2002
Google API shows the way ahead
Prompted by an inadvertent leak of the news the previous weekend, last Thursday (Apr 11th), Google released the beta of its SOAP-accessible API, described by Blogger rival Userland on its Google API page as what "may be the most momentous release of SOAP or XML-RPC support so far." Writing as it happened in his personal weblog, Userland's Dave Winer rated the event on a par with the 1995 release of Netscape Navigator. Yet despite the leak and the subsequent launch, the mainstream online tech media didn't report the story till late on Friday, as Rael Dornfest documents in a detailed account of the API in his O'Reilly Network weblog. Although it is currently only a beta trial, the introduction of the Google service is indeed a momentous event in the short history of on-demand web services. The episode demonstrates that the mainstream media is not to be relied on when tracking emerging technology and business trends, precisely because by the time trends are being reported there, they aren't emerging any more. That's why this site will have a variety of sources, including weblogs, feeding its news section when it comes to launch early next month.
posted by Phil Wainewright 3:08 PM (GMT) | comments | link
Microsoft's strategy for web services infrastructure
Of all the announcements coming out of TechEd last week, the most important concerned Microsoft's relationships with companies that it now describes as application infrastructure providers (AIPs). It has named an elite crew as Gold certified AIPs and also announced an important deal with Cable & Wireless acquisition Exodus. I've outlined in my column this week on ASPnews why I think all this is so much more important than its U-turn over .Net My Services: "Microsoft can achieve virtually the same effect if everyone builds their infrastructure on its products, leaving it owning the architecture as a result of customer choice rather than supplier dictate."
posted by Phil Wainewright 1:20 PM (GMT) | comments | link
McAfee.com deploys web services, grid computing and P2P
The trouble with trendy new technologies is that they carry a strong whiff of 'toys for the boys' that leaves many people wondering what the real-world applications will be, if any. So hats off to anti-virus and security service provider McAfee.com for launching its SecurityCenter today, the client piece of a Web-wide security system that is built using web services and grid computing. Peer-to-peer functionality will be added later, completing a hat trick of buzzwords. With almost 1.5 million subscribers to its services, McAfee.com already has the scale to turn its Grid Security Services into a powerful and compelling demonstration of what can practically be achieved by deploying these technologies. GridComputingPlanet has a detailed explanation of how it will work.
posted by Phil Wainewright 11:56 AM (GMT) | comments | link
Web services security: no longer an issue
Perhaps people will stop bleating about web services being full of security holes now that IBM, Microsoft and Verisign have got together to release a specification for WS-Security. Not that the existence of a standard will prevent hapless users and administrators from leaving their most precious online assets exposed through lapses in their daily routine. I have never found nebulous fears over security to be a convincing objection to implementing new technology. There is certainly plenty of scope to implement web services in low-risk environments, such as secure intranets or extranets, or for non-sensitive applications. So most enterprises have plenty of places where they can start using web services if they are so minded, irrespective of the current lack or otherwise of security standards for the architecture.
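For a flavour of what the new specification standardises, here is a minimal sketch of a UsernameToken carried in a SOAP header, one of the constructs WS-Security defines. The namespace URI and element names are assumptions based on my reading of the 2002 draft; verify them against the published specification, which in practice also calls for a password digest and nonce rather than the plaintext password shown here.

```python
# Minimal sketch of a wsse:Security header block with a UsernameToken,
# the simplest credential construct in the WS-Security draft. Namespace
# URI is an assumption; real deployments would use a digested password.
from xml.sax.saxutils import escape

WSSE_NS = "http://schemas.xmlsoap.org/ws/2002/04/secext"  # assumed draft URI


def username_token_header(username: str, password: str) -> str:
    """Return a SOAP Header containing a wsse:UsernameToken."""
    return f"""<SOAP-ENV:Header
    xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/">
  <wsse:Security xmlns:wsse="{WSSE_NS}">
    <wsse:UsernameToken>
      <wsse:Username>{escape(username)}</wsse:Username>
      <wsse:Password>{escape(password)}</wsse:Password>
    </wsse:UsernameToken>
  </wsse:Security>
</SOAP-ENV:Header>"""
```

The point is less the XML itself than that the token travels inside the message, so security no longer depends on the transport alone, which is what makes the specification relevant to loosely coupled services.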
posted by Phil Wainewright 3:26 AM (GMT) | comments | link
Assembling on-demand services to automate business, commerce, and the sharing of knowledge