More WS-* specs, more questions about architectural viability
It's as regular as the seasons: as the leaves start to fall from the trees here in Michigan, more web services specifications flutter down from WS-IvoryTower, and more hunters take up their rhetorical shotguns to blast at them. This year it is WS-Transfer and WS-Enumeration; Microsoft has a new web services architecture vision document to accommodate them, and a number of people (Tim Bray is perhaps the most visible) have already wondered why we need Yet Another web services spec that appears to do something the Web already does.
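(To make the "the Web already does this" objection concrete: fetching a resource's current representation on the plain Web is a one-line HTTP GET, while the rough WS-Transfer equivalent wraps the same request in a SOAP envelope with WS-Addressing headers. The sketch below is illustrative only -- element names are trimmed from the September 2004 draft, and the endpoint URI is hypothetical.)

```xml
<!-- Plain Web: "GET http://example.org/orders/42" returns the XML representation. -->
<!-- WS-Transfer (sketch): the same "give me the current representation" request,
     expressed as a SOAP message. -->
<s:Envelope xmlns:s="http://www.w3.org/2003/05/soap-envelope"
            xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/08/addressing">
  <s:Header>
    <wsa:Action>http://schemas.xmlsoap.org/ws/2004/09/transfer/Get</wsa:Action>
    <!-- hypothetical endpoint -->
    <wsa:To>http://example.org/orders/42</wsa:To>
    <wsa:MessageID>uuid:...</wsa:MessageID>
  </s:Header>
  <!-- a WS-Transfer Get carries an empty body -->
  <s:Body/>
</s:Envelope>
```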
I'm in somewhat the same quandary that Tim is in, since there are an awful lot of smart people on each side of the fence here. (I recall a W3C Web Services Architecture working dinner at which I was the only person at the rather long table without a Ph.D.!) Likewise, I agree that the WS-* stack is monumentally difficult to keep track of, that there are some very visible success stories for "web services" that don't use WS-*, and that "design by committee" is not likely to be successful.
But also like Tim, I get the same sorts of feedback from Day Job colleagues explaining how enterprise developers really *do* need more than the raw XML and HTTP technologies to build enterprise-class distributed applications.
But one thing Tim says really helped me to understand why there are such diverging opinions among smart and well-informed people: "I'm deeply suspicious of multiple layers of abstraction that try to get between me and the messages full of angle-bracketed text that I push around to get work done." Ahh, I think I see it now -- the "XML and the Web suffice" people push around XML to get their work done, and don't have the pain that WS-* tries to cure. The "WS-* is needed" folks, on the other hand, are working on behalf of people to whom this is more or less unthinkable. "Real" people building enterprise applications of the sort that the WS-* specs target work with objects, databases, transaction monitors, and reliable message queueing systems ... which they need to make more accessible via HTTP *gateways*. They don't have the option of simply exposing these systems as stateless Web resources with which one exchanges XML representations, without a massive amount of code rewriting or adapter building, and there is nobody writing books or selling tools to assist them. There are, however, reams of whitepapers, articles, and books explaining how to apply the WS-* approach to the problem, and a considerable amount of success to show for all this.
On the other side, I don't think that very many of the people pushing simple XML over HTTP really have to solve nasty enterprise integration problems to earn a living. (Sean McGrath is a notable exception, by the way, and I recall -- though I can't find the link -- that he is on record as noting that the problems WS-* addresses are real, even if solutions built by committee are not likely to actually work.) So, I'm hypothesizing that the WS-* bashers are those who find XML and Web technologies the sharpest tools in their toolbox, and think of the problem of linking them with enterprise transaction systems as an implementation detail; the proponents are those who have to expose enterprise systems over the web, and think of HTTP as just another transport and XML as just another serialization. It's not a matter of one side being right and the other wrong; it's a matter of which tools are needed for which jobs. Tim Bray's point about Amazon, eBay, etc. not needing the WS-* stuff to get their job done is well taken, but it's also quite clear that these were built from the ground up to work with the Web, whereas the fertile ground for WS-* is the enterprise systems that were not designed with the web in mind.
What I'd really like to see here is the application of a well-known conflict management technique in which each side has to state the other side's position, to the other's satisfaction, before discussing the disagreement. For example, the new Microsoft web services architecture document praises and aligns itself with the Web in its introduction: "Web services take many of the ideas and principles of the Web and apply them to computer/computer interactions. Like the World Wide Web, Web services communicate using a set of foundation protocols that share a common architecture and are meant to be realized in a variety of independently developed and deployed systems. Like the World Wide Web, Web services protocols owe much to the text-based heritage of the Internet and are designed to layer as cleanly as possible without undue dependencies within the protocol stack."
So, why don't the Web standards suffice for computer/computer interactions? People have been talking past each other on this topic for years now. How about it? Maybe Mark Baker could re-state his understanding of why Don Box thinks they don't ... and vice versa ... before wrapping this permathread around the blogosphere one more time.
At a minimum, I'd like to see documents such as the Microsoft one address the well-known critique that the Web specs already hit the 80/20 point, and be much more specific on why they think these can't be simply extended to handle computer/computer interactions. I'd also like to see the RESTifarians explain and give examples of how they think goals such as "autonomous services" and "managed transparency" can be achieved without building something like the WS-* framework.