When and why are interoperability fests useful?
The world of Web Services has thrown up a range of interoperability workshops, aka plugfests; not to mention a whole organisation dedicated to interoperability. You might get the impression that, because Web Services are about interoperability as much as internet-scale computing, such events have not been of interest in other distributed systems such as JEE or CORBA. But interoperability events do occur elsewhere. However, it is true that the approach to interoperability we're seeing now is markedly different from what we saw in the past: for most of the key players, interoperability is at the forefront of specification and implementation development. Look at CORBA: it took the OMG seven years or so to address this shortcoming, and things are still not perfect; and true heterogeneous JEE-to-JEE interoperability is a thing of the future.
CORBA, JEE, DCE and (implicitly) COM/DCOM were all dominated by vendors keen to maintain vendor lock-in. Fortunately (or unfortunately, depending on your perspective) this couldn't continue, and even before the rise of Web Services we were beginning to see change: the "norm" of sites running homogeneous environments was eroding, with companies growing by acquisition or wanting to do real vendor-to-vendor (business-to-business) interactions. No longer was the argument "take our XYZ product now and we'll work with you to achieve eventual interoperability with ABC" sufficient. Many large deals have fallen through because of a lack of interoperability.
However, interoperability was still considered of secondary importance. To a degree that is understandable: you can't worry about interoperability until you have a product. Nevertheless, I believe strongly that interoperability testing should be considered as important as standard unit testing and QA: it shouldn't be an afterthought.
To explain why, I'll use the Web Services interoperability fests as an example, but as I said before, this isn't (or shouldn't be) technology specific. If you've ever looked at Web Services specifications such as WS-CAF, WS-SX or WS-Addressing, you'll know they're not exactly easy reading material (JEE and CORBA specifications are similar). Understanding why something is intended to work the way it does is often as difficult as understanding how, and is certainly as important. Alongside product development, I've been working on standards and specifications for over 10 years, and two people can read the same specification and come away from it with completely different perspectives. In most cases that indicates a problem (read: bug) with the specification that should be caught early on. This is where previous standards efforts, such as JEE and the OMG, fell down: more often than not, specifications were developed months or years before implementations. For example, in order to ratify a specification, the OMG only requires companies to say they will eventually use it, not that they have implemented it.
Although the same is true for Web Services (e.g., OASIS requires at least three committee member organisations to say they are using a specification, though not necessarily in a product), the whole "Web Services are for interoperability" mantra has really taken hold. Before any Web Services specification is standardised, the various committees hold at least one workshop where they work through interoperability between heterogeneous implementations and feed the results back into the specification. Depending on those results, this can be an iterative process, but the end result is usually something that offers better out-of-the-box interoperability. From a specification development perspective, then, these workshops are incredibly important, and they would be beneficial in other arenas too.
Obviously not everyone can be present at these official interoperability fests, and in any case many implementations arise after standardisation has occurred. Moreover, standards are still not perfect and can often be deliberately ambiguous, leaving room for non-interoperable implementations. That's why it is so important to do interoperability testing during development: it irons out bugs in the specification, or in your understanding of what the original authors meant. If it is left until after the product has shipped, it may be difficult or impossible to make changes without causing problems for end-users. That's another route back to vendor lock-in and bridging protocols. Furthermore, feeding bugs back to the standards bodies benefits the next version and other users: try arguing with one vendor that your view of an ambiguous specification is correct if they've come to a completely different (and incompatible) conclusion!
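To make the idea of developer-driven interoperability testing a little more concrete, here is a minimal sketch of the kind of check involved. Everything in it is hypothetical: the vendor responses, the `urn:example` namespace and the message shape are invented for illustration, and real harnesses would exercise live endpoints over the wire. The point it illustrates is that compliant implementations may legitimately differ in formatting and attribute order, so responses must be compared semantically, not byte-for-byte:

```python
# Hypothetical interoperability check: compare responses from several
# implementations of the same (invented) spec for semantic equivalence.
import xml.etree.ElementTree as ET

def canonical(xml_text):
    """Reduce an XML document to a comparable form: tag name, sorted
    attributes, stripped text, and recursively canonicalised children."""
    def walk(elem):
        return (elem.tag,
                tuple(sorted(elem.attrib.items())),
                (elem.text or "").strip(),
                tuple(walk(child) for child in elem))
    return walk(ET.fromstring(xml_text))

def interoperable(responses):
    """True if every implementation's response is semantically identical."""
    forms = [canonical(r) for r in responses]
    return all(f == forms[0] for f in forms)

# Two hypothetical vendor responses to the same request: different
# whitespace and attribute order, but the same meaning.
vendor_a = '<Reply xmlns="urn:example" code="0" status="ok"><Msg>done</Msg></Reply>'
vendor_b = '''<Reply status="ok" code="0" xmlns="urn:example">
                  <Msg>done</Msg>
              </Reply>'''

print(interoperable([vendor_a, vendor_b]))  # prints: True
```

A real plugfest harness would run checks like this in both directions (each implementation as client and as server); where the comparison fails, the disagreement is exactly the kind of specification ambiguity worth reporting back to the standards body.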