Tim Bray has yet another must-read piece that apparently emerges from the collision of his deep understanding of XML concepts with realities he experiences at Sun. According to Tim, today's web service best practice is built on the foundations of asynchronous XML messages, produced with the assistance of programmer-friendly tools that hide the arcana of SOAP and XML (but not their basic principles). The grand theory expressed in the WS-* stack of proto-standards consists of interesting but unproven concepts including composable message architectures, dynamic service discovery, and declarative application building powered by metadata rather than procedural code. That theory is not well-grounded, however: 'This essay is about theory and practice, and we ought to have learned by now that standards are terrific when applied to proven industry practice but high-risk in the domain of theory and science. ... Here at Sun I've talked to a lot of people internally and worried out loud about the WS-* stack: whether it will actually work, whether it will interoperate, whether the complexity will be tractable, whether it will be secure, whether it's going to have patent lock-ins. Mostly, they say yep, yep, yep, we worry about those too.'
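To make the "arcana" concrete, here is a minimal sketch of what those programmer-friendly tools generate under the hood: a bare SOAP 1.1 envelope built with Python's standard library. The `GetQuote` operation and its `urn:example:stock` namespace are hypothetical placeholders, not part of any real service.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

# A minimal SOAP 1.1 envelope: Envelope wrapping a Body that
# carries one application-defined operation element.
envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
operation = ET.SubElement(body, "{urn:example:stock}GetQuote")
ET.SubElement(operation, "symbol").text = "SUNW"

print(ET.tostring(envelope, encoding="unicode"))
```

The point of the sketch is how little "plain XML over HTTP" this is once the envelope, namespaces, and WSDL-described operation names pile on; the WS-* specs layer headers for addressing, security, and reliability on top of this same structure.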
Of course, the mainstream XML community lives in a glass house and should be careful about throwing stones at the web services people. Tim mentions the XML Schema spec as a victim of Design by Committee, and XQuery is often mentioned as another example; of the thousands of pages of XML specs coming out of the W3C in the last few years, how many are really based on best practice? The Semantic Web specs are an interesting case; they contain the best practice that has survived over several generations of theory (LISP parentheses mutating to XML angle brackets along the way!), but I have no idea how truly field-tested they are. I'm actually becoming somewhat optimistic that we're seeing the inverse of what Bray talks about there -- people are discovering that they have been stumbling around the concepts that are captured and formalized in OWL, and are taking a look at how to apply them to domains that nobody thought of as needing an "ontology". We may be seeing one of those rare cases where academic theory successfully does precede practice.
The metaphor I like here is "intellectual capital." The internet and the Web looked like overnight successes (and committees fairly quickly established the core Web standards) during the 1990s, but they drew on at least 20 years of academic and government R&D. The industry seems to have learned the wrong lesson, believing that it was the committees writing the specs that created the intellectual capital upon which the Web was built, and hoping that more committees would keep the momentum going during this decade. I suspect that the capital formation occurred in a lot of research labs and email lists and conferences over a decade or two, and the standards committees just refined and refactored out what was relatively common.
The community of web services developers should certainly keep doing what they're doing to lay down the next layer of intellectual capital, and not be deterred by the skeptics and fundamentalists who think that XML+HTTP will always suffice. Consumers of this stuff, however, should be extremely wary, realizing that WS-* is about the future, not about today. People with real projects to plan simply have to judge for themselves which of these approaches and technologies are Best Practice (ideally with a solid theory behind them), and which are Best Guesses of self-appointed experts. My take on today's reality is very much in line with Tim's -- "XML, HTTP, URIs, SOAP, WSDL, and that's about it."