2010-01-11

Tempered Sensationalism.

While looking through my pictures I came across a photo of Håkon Wium Lie in my flickr feed. It was snapped at the Go Open conference a while back, at a time when a fierce debate was taking place about OOXML being fast-tracked through ISO to become a standard.

Since I liked the picture, and since I do not particularly care for standards that are too big to be really useful (more on that later), I decided to post it to Reddit. I have to say the response was a bit greater than I had anticipated.

I'd like to clarify a few things though.

Lars Marius Garshol brought to my attention that the picture of the "printout" of the standard that I added in the comments on the flickr page is in fact a fake. Those are blank pages that have been put in binders.

It looks like about 6000 sheets: each binder appears to hold the equivalent of two packs of printing paper, and one pack holds 500 sheets, so each binder would contain 1000 sheets. Six binders would then indeed be 6000 sheets, but who in their right mind would print this single-sided? A real printout would most likely be double-sided, which would cut the stack roughly in half.
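
For anyone who wants to redo the arithmetic, here is a trivial sketch in Python. The pack size and binder capacity are my eyeball estimates from the photo, not measurements.

    # Back-of-the-envelope check of the "printout" in the photo.
    # All numbers are eyeball estimates, not measurements.
    SHEETS_PER_PACK = 500    # a standard ream of printing paper
    PACKS_PER_BINDER = 2     # each binder looks like roughly two reams
    BINDERS = 6

    pages = BINDERS * PACKS_PER_BINDER * SHEETS_PER_PACK
    print("single-sided:", pages, "sheets")        # 6000
    print("double-sided:", pages // 2, "sheets")   # 3000, about half the stack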

So yes, the printout can be said to be an exaggeration.

Lars and others sent me a link to photos of an actual printout that seem to confirm this: http://xmlguru.cz/2007/07/czech-comments-ooxml

Another thing I'd like to clarify is why I think OOXML is a waste of time.

If you are going to create a standard for something, your primary focus should be to address the problem domain in the best possible way, not merely to document a particular set of legacy technologies. This is especially true when documenting the legacy forces excessive complexity on implementors, complexity that, strictly speaking, is neither beneficial nor particularly useful.

If I were to design a standard for a file transfer protocol I would not, under any circumstances, start by treating FTP as a baseline the standard must contain. It is, indeed, a widely used protocol; it is old; and it has, in a sense, an aura of "authority". But if you have ever tried to create a reasonably complete implementation of FTP, you have no doubt discovered that it is a rather painful exercise.
Of course, you can blame this on the way it is documented and argue that better documentation would fix things, but wouldn't it be a lot better to start over? Do you really want users to be forced to think about active versus passive connection modes, or to endure software so dumb that it forces those concepts on them? Also, people have had good ideas about file transfer since FTP, and file sizes have grown to the point where transfer errors have become more of an issue.
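
To make the active/passive point concrete, here is a minimal sketch using Python's standard ftplib; the host and file names are made-up placeholders. The library defaults to passive mode precisely because active mode, where the server opens the data connection back to the client, tends to fail behind NAT and firewalls. The checksum at the end is there because, as noted above, large transfers make errors a real concern and FTP itself offers no integrity check.

    import hashlib
    from ftplib import FTP

    # ftp.example.org and somefile.iso are placeholder names.
    ftp = FTP("ftp.example.org")
    ftp.login()  # anonymous login

    # Passive mode (the default): the client opens the data connection,
    # which usually survives NAT and firewalls.
    ftp.set_pasv(True)
    # Active mode would be ftp.set_pasv(False): the server connects back
    # to the client, the very detail no user should have to think about.

    sha = hashlib.sha256()
    with open("somefile.iso", "wb") as f:
        def chunk(data):
            sha.update(data)  # checksum as we go
            f.write(data)
        ftp.retrbinary("RETR somefile.iso", chunk)
    ftp.quit()

    print("sha256:", sha.hexdigest())  # compare against a published checksum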

Then of course there is the sheer size of the standard.

This one is really simple: the likelihood of a thriving ecosystem of complete and reasonably correct implementations decreases sharply with the size and complexity of a standard.

There is absolutely no way around this. Sorry.

The very reason the Internet protocols were so successful was that they were simple and driven by implementation. Not only that: it was possible to have multiple implementations of the same protocols without it costing a trillion billion dollars. The reason X.400, the OSI stacks and SGML are not mainstream technologies today (or in many cases even viable) is that they were too complex.

What good is a standard if the only manageable way to use it is to implement a subset of it?

To me OOXML is a big, chunky standard that is too preoccupied with the past. I hope I will never have to deal with it.
