Loosely Coupled weblog


Friday, January 17, 2003

Distributed Yahoo! The next generation
PC software pioneer Dan Bricklin has just launched a scheme that could soon revolutionize a sleepy industry. It tackles one of the biggest problems of doing business in a modern economy: the enormous effort it takes to keep track of who does what, and where to contact them. You only have to look at the $7 billion price tag that ailing phone company Qwest recently negotiated for its QwestDex directory division to see how much value we attach to such information. Directory publishing is a lucrative business.

Dan's proposal, called SMBmeta, is an XML specification that a business can use to publish information about itself on its website. The advantage of using a standard XML format is that it can be read automatically by other computers. If enough businesses publish information about themselves in that way — and remain committed to keeping it updated — then it could have a huge impact on the economics of business directory compilation. Imagine the effect of being able to automatically collate ready-formatted, up-to-date information on thousands of businesses, instead of having to go through the time-consuming and still largely manual process of actively requesting, chasing, verifying and collating it. Economies on such a scale cannot help but induce inflection-point change.
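To make this concrete, here is a rough sketch in Python of what consuming such a file might look like. The element names in the sample are illustrative placeholders only, not the actual SMBmeta schema, for which you should consult Dan's spec.

    # A minimal sketch of reading a business-description XML file of the
    # kind SMBmeta proposes. The element names are placeholders, not the
    # real SMBmeta schema.
    import xml.etree.ElementTree as ET

    SAMPLE = """
    <business>
      <name>Example Widgets Ltd</name>
      <url>http://www.example.com/</url>
      <phone>+44 20 7946 0000</phone>
      <category>widget manufacturing</category>
    </business>
    """

    root = ET.fromstring(SAMPLE)
    record = {child.tag: child.text for child in root}
    print(record["name"], "-", record["category"])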

Probably the most important potential impact is that it would slash the entry-level cost of becoming a directory publisher. So far, the Internet has not threatened the traditional business directory providers, as some had initially thought it would. That's because past solutions have still had to bear the same research and compilation costs as the traditional providers. Internet scale simply magnifies those costs. Even Yahoo! — which owes its initial success to its prime mover role as the directory of where to find things on the Internet — was finally overwhelmed by its dependence on a centralized team of manual compilers.

With SMBmeta, anyone with a web server and a modicum of programming skill will be able to set up a business directory. More importantly, the directory will automatically update itself whenever any of the source material changes. Setting up a directory in the first flush of enthusiasm has always been easy, but it's never before been so easy to maintain one.
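It is worth showing how little machinery that takes. The sketch below, again in Python and reusing the made-up schema above, polls each member site's published file and rebuilds the listing from whatever is there now, so every run is automatically a refresh. The well-known filename and the member sites are hypothetical.

    # A sketch of a self-updating niche directory: poll each member site's
    # published XML file and rebuild the listing from its current contents.
    # The /smbmeta.xml path and the member sites are hypothetical.
    import urllib.request
    import xml.etree.ElementTree as ET

    SITES = ["http://www.example.com", "http://www.example.org"]

    def fetch_listing(site):
        with urllib.request.urlopen(site + "/smbmeta.xml", timeout=10) as resp:
            return {child.tag: child.text
                    for child in ET.fromstring(resp.read())}

    directory = {}
    for site in SITES:
        try:
            directory[site] = fetch_listing(site)  # always the site's latest data
        except Exception as exc:                   # unreachable or malformed: skip
            print("skipping", site, "-", exc)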

This will be particularly attractive to web sites serving a specialist niche, and of course, what we've learnt from Google (particularly in the context of weblogs) is that if you have a plethora of sites serving many different interest groups, you can start to identify the most reliable sources by measuring link popularity. The best specialist compilers will be the most highly rated, and they in turn will provide a means of independently verifying the validity of individual entries, using automated Google-style algorithms.
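As a toy illustration of that ranking idea, here is a PageRank-style iteration over an invented three-site link graph: sites that well-regarded sites point to score higher. This is, of course, a crude approximation of what Google actually does.

    # Toy link-popularity ranking over an invented link graph: a site's
    # score depends on the scores of the sites linking to it, computed in
    # the iterative PageRank style with the usual 0.85 damping factor.
    links = {
        "nichedir.example": ["othersite.example"],
        "othersite.example": ["nichedir.example", "thirdsite.example"],
        "thirdsite.example": ["nichedir.example"],
    }
    score = {site: 1.0 for site in links}
    for _ in range(20):
        score = {site: 0.15 + 0.85 * sum(score[s] / len(links[s])
                                         for s in links if site in links[s])
                 for site in links}
    print(max(score, key=score.get), "ranks highest")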

So SMBmeta could be the solution to the problem that originally overwhelmed Yahoo!, by distributing the compilation task out into the Internet in a manner that allows the results to be continuously verified. It could even redress the competitive balance between Yahoo! and Google, which is one reason why I chose "The next generation" as the title of this piece. The other reason was its resonance with the title of a posting I wrote last month: Software, Jim, but not as we know it discussed the potential for unexpected new applications to spring up overnight, especially from the use of XML to turn content into structured data. I feel sure that SMBmeta will be a seedbed for many such applications.
posted by Phil Wainewright 10:14 AM (GMT)

Wednesday, January 15, 2003

Bring down the walls
Apparently it's a bad idea if ignorant users like you or me can decipher how a URI works. That is, we shouldn't be allowed to look in the location bar of our browser and see, for example, that http://www.google.com/search?hl=en&q=SOAP is a request to Google (www.google.com) to search (/search) in the English language (?hl=en) for the term SOAP (&q=SOAP).
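The point is easy to demonstrate; Python's standard library, for one, will take that URI apart for anyone who cares to look:

    # Decomposing the search URI into its readable parts with the
    # standard library alone.
    from urllib.parse import urlsplit, parse_qs

    parts = urlsplit("http://www.google.com/search?hl=en&q=SOAP")
    print(parts.hostname)         # www.google.com -> which service
    print(parts.path)             # /search        -> which operation
    print(parse_qs(parts.query))  # {'hl': ['en'], 'q': ['SOAP']}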

According to Tim Berners-Lee, the original creator of the World Wide Web and still head of its leading standards body, the W3C, that's too transparent. He, along with many other influential figures in the technology world, believes that "a Web application's URI namespace should be opaque," writes Jon Udell this week in his InfoWorld column.

The purpose of Berners-Lee's opacity axiom, Jon explains, "was to ensure that a service provider can always reorganize a namespace without fear of breaking clients that depend on that namespace." In other words, Google should be free, for example, to insert new parameters in front of &q= (which indeed it often does), or even to rename it to, say, &query= (which to my knowledge it has never done), without having to worry about the potential effect on systems that interact with it.

Heaven forbid, after all, that anyone should be able to link to Google's, or Amazon's, or any other provider's URI in ways that the system's designers hadn't originally thought of. That might lead to — horror of horrors — unintended consequences.

Fortunately, an increasing number of people are beginning to see that there are potential advantages in promoting transparency in URIs, in part prompted by Jon's experiments with his LibraryLookup project, as he describes in his column. He notes, too, that there is a perfectly viable means of ensuring clients continue to be supported when URIs change — transformation: "An URL-rewriting engine could continue to support old-style links, but transform them to the new style."
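Here is a rough sketch of that rewriting idea in Python, using the hypothetical &q=-to-&query= rename from earlier: old-style links are transformed into the new style before they reach the application, so nothing that depends on them breaks.

    # Sketch of a URL-rewriting shim: translate the old-style query
    # parameter name into the new one. The q -> query rename is the
    # hypothetical change discussed above.
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    def rewrite(old_url):
        parts = urlsplit(old_url)
        params = [("query" if key == "q" else key, value)
                  for key, value in parse_qsl(parts.query)]
        return urlunsplit(parts._replace(query=urlencode(params)))

    print(rewrite("http://www.google.com/search?hl=en&q=SOAP"))
    # http://www.google.com/search?hl=en&query=SOAP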

Transformation is a concept the W3C not only understands very well, but also actively promotes in certain areas. Jon (again), writing this time in his O'Reilly column, describes a neat way of converting a weblog's RSS output into outline format using the W3C's XSLT transformation service. It seems that the W3C believes unintended consequences are OK for content authors, but not for web application programmers.
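For the curious, a bare-bones version of that kind of transformation can also be run locally. The sketch below uses lxml, a third-party Python library with XSLT support (the W3C's service does the same job server-side); the stylesheet and the feed are cut down to the essentials.

    # RSS-to-outline in miniature: an XSLT stylesheet that turns each
    # RSS <item> title into an OPML <outline> element, applied with lxml.
    from lxml import etree

    transform = etree.XSLT(etree.XML("""
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/">
        <opml><body>
          <xsl:for-each select="//item">
            <outline>
              <xsl:attribute name="text">
                <xsl:value-of select="title"/>
              </xsl:attribute>
            </outline>
          </xsl:for-each>
        </body></opml>
      </xsl:template>
    </xsl:stylesheet>
    """))

    rss = etree.XML("""
    <rss version="2.0"><channel>
      <item><title>Distributed Yahoo! The next generation</title></item>
      <item><title>Bring down the walls</title></item>
    </channel></rss>
    """)

    print(str(transform(rss)))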

Why am I not surprised to find myself arriving at this conclusion? A bitter struggle for control has raged ever since the beginning of the Information Age. Technologists have constantly sought to keep technology locked away from the mass of users, fearful of the damage that might ensue if some untutored individual were to attempt something untoward without a full evaluation of all the potential consequences. Yet it is precisely when control is wrested away from the established IT hierarchy that every major advance in application utility has been achieved.

After half a century, you'd think they'd have learnt by now that this is a struggle the proponents of opacity never win. Transparency and unorthodox experimentation triumph every time. Yes, the results are more 'fragile', less elegant and usually require a lot of iterative reworking and transformation (ultimately performed by skilled technologists) to make them robust and stable. But it is only by opening up the mechanisms to untutored neophytes that you unleash the creativity required to discover those new applications.

It is time to bring down the walls that surround the citadel of software automation once and for all. Resistance is futile: the walls are coming down anyway. Technologists can either help dismantle them from within, or else helplessly watch as the rest of us tear them down from the outside. Which side are you on?
posted by Phil Wainewright 2:32 AM (GMT)

Tuesday, January 14, 2003

Network resident anywhere
Flamenco Networks yesterday launched a licensed enterprise version of its web services network, which was previously available only as a hosted service. The development is a natural progression, one already foreshadowed by the experiences of many net-native application service providers. Some holdouts maintain that the only true home for net-native software is out on the Internet in the care of the original software developer, but the fallacy in that argument is to regard the Internet as something that stops at the enterprise firewall.

Larger enterprises in particular have Internet infrastructures within their internal networks that are at least as robust as anything that exists outside. Their data centers are just as much a part of the shared global Internet infrastructure, even if some of the resources housed in those data centers are not available beyond the firewall. The key characteristic of net-native software is that it performs interchangeably in either environment, rather than needing to be permanently shielded from the Internet like conventional client-server software.

Of course, there is a loss of economies of scale when enterprises host their own licensed copies of software that is also available as an online service. A service provider can pool infrastructure resources into a single shared-server instance, and has none of the additional support costs involved in catering to multiple instances installed in a variety of environments. Against this, however, must be set the higher security and redundancy costs of defending and maintaining a service that represents a high-risk single point of failure. There may also be a performance penalty when using an external service to provide infrastructure that, at least in the early days, serves users who are mostly based within the enterprise network.

One of the most telling giveaways of the ASP era was the reluctance of many leading practitioners to outsource any of their own infrastructure to outside providers. When performance is at a premium, infrastructure is best kept in-house, and most providers of ASP infrastructure software quickly abandoned any thoughts of offering their software as a service. They nevertheless maintained a significant advantage over vendors who attempted to repurpose client-server infrastructure products for use in ASP environments, of which those vendors had no real understanding whatsoever.

Flamenco has taken the right route by first learning the particular requirements of web service networking by running its software as a service under its own control. As a result, it can now productize that experience for customers who prefer to keep their Internet infrastructure assets behind their own firewall. Software that has been carefully designed from the ground up to be net-resident will feel equally at home in either environment.
posted by Phil Wainewright 6:38 AM (GMT)
