Is programming...analysis done backward?

Posted by timboudreau on February 11, 2009 at 8:46 PM PST

(This is another in my emptying of my drafts box of proto-blogs - this one from four or more years ago. Some of it is dated, and I gave it an edit or two, but I think there might be a thought or two in here worth reading).

Software is what architecture would be if building-materials were free. Everybody would have a beautiful, twisting crystal palace in the air, and nobody would be able to find the bathroom - Jesse Glick

Ever stop to think how we do software? When analyzing the real world, people start with existing phenomena, then backtrack to one or more models that describe it - each is an abstraction of the physical phenomena. Then, applying Occam's razor, the simpler explanation wins. That is why you can explain the entire solar system as everything orbiting the earth, and the math will work - but you don't do that in practice, because it's much more complex that way. We call that science.

Then there's this thing called computer science. Interestingly, it doesn't work the same way.

This work is done by people, and human beings, in the sciences, humanities, or in economies, build status hierarchies and compete for places in them. We're primates. It's just what we do. So that complicates things.

On fuzzier subjects (say, Shakespeare), the Occam's razor approach doesn't offer much benefit. You're starting with an artifact of culture and attempting to analyze it in the context of other, well, artifacts of culture (Edward O. Wilson crystallizes the problem brilliantly in Consilience). But there is a desire to understand the subject, and to analyze it. Thus we have the field of literary theory (a subject I wasted too much of my youth on). Because you can never get Shakespeare down to something as simple as observations of the planets, you can never know if your analytical approach is useful or not (if you could, we would have IT thesis projects like "Write a program that will write Shakespeare's next play, after all the ones he really wrote" - and they would all produce the same play).

Any study of fuzzy subjects tends to have a lot more fads, politics and schools of thought that come and go, simply because it can and there's no place else to go - nothing's provable. There is a discipline called computer science, but it falls far, far short of the things people need to get done some way or other in today's IT world (say the phrase "halting problem" in a room full of people trying to write the next YouTube - you will be an annoyance and they are right, you aren't). If you studied literature any time between 1978 and 1997, think structuralism, semiotics, post-structuralism, post-modernism (if you don't know these terms, count yourself lucky and don't waste your time). A school of thought becomes popular; people compete for status by either using it, or carrying it to further extremes; this provokes others to carry it to still further extremes to try to gain status; eventually it collapses under the weight of its own absurdity and the next fad catches on, and the process repeats. The ebb and flow of religious fundamentalism throughout history has similar patterns.

All of this is possible because in the humanities, nobody is ever going to prove they're right anyway - what better environment to compete for status in, than one where nobody will ever prove you don't have status!

(Note that the study of business is a soft, not a hard science. There is just as much unresolvable complexity in an economy as in a Shakespeare play - it's just as fuzzy, maybe more so. But it does involve some numbers, so it's easier to pass it off as a hard science, and parts of it, when isolated to the point of not being able to predict anything useful, are hard sciences - math is math).

The proven best practice for figuring something out is to look around, see what's happening, and start putting things in boxes. The boxes are abstractions. Eventually you've got a pile of abstractions in some kind of organization, and you can either try to predict the future with it, or try it out on similar things and see if the results map to reality. If it works, you earn some money or status or something along those lines (and if you're lucky, it happens within your lifetime). If you really do a good job, it might just raise the quality of life for more people than just you, because everybody's got a new tool that will work consistently.

Then there's software. Yes, there are cases where you write software that is a 1:1 mapping between the world and the software, and it's totally testable. But that's usually the exception, not the rule. And usually a 1:1 mapping between the problem and the software that solves the problem is completely unusable - some of the most avid users of the NetBeans Platform are people who adopted the Naked Objects approach to design, with entirely predictable and disastrous results. The sane description of a problem to a computer is very different from the sane description of a problem to a person - and confusing the two is a seductive trap.

Very, very often in software, we start with the abstraction! We aren't observing the universe and discovering principles, we are god, creating the universe. Before you have an implementation, define your interfaces. This is the polar opposite of scientific analysis, where you start with a complex world, and find abstractions that consistently describe it. In software, you start with abstractions, and then fill in the complexities. You may derive the abstractions from a mental model based on observation, but you're still starting with the abstractions. And if the complexities - what the software really has to do to fulfill its purpose - don't fit the abstractions well, you get performance problems and other nastiness - because in the end, there is a reality, and it's the work the computer actually does when your software runs. But you're not starting from what the computer is going to do - unless you can afford a lifetime to cogitate on one problem so that you really can think it all through and work backwards from the root problem - and outside of well funded foundations, most of us don't have that luxury.
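To make "start with the abstraction" concrete, here is a minimal, hypothetical Java sketch (the names `Orbit` and `KeplerianOrbit` are invented for illustration): the interface is written first, as an act of definition rather than observation, and the messy reality is filled in behind it afterward.

```java
// The abstraction comes first: we declare what an orbit *is*
// before any implementation exists. This is "creating the universe."
interface Orbit {
    String bodyName();
    double periodInDays();
}

// Only later do we fill in the complexity behind the abstraction.
// If the real computation doesn't fit this shape well, the interface
// fights us - exactly the mismatch the essay describes.
class KeplerianOrbit implements Orbit {
    private final String name;
    private final double periodDays;

    KeplerianOrbit(String name, double periodDays) {
        this.name = name;
        this.periodDays = periodDays;
    }

    public String bodyName() { return name; }
    public double periodInDays() { return periodDays; }
}

public class Main {
    public static void main(String[] args) {
        Orbit earth = new KeplerianOrbit("Earth", 365.25);
        System.out.println(earth.bodyName() + " orbits in "
                + earth.periodInDays() + " days");
    }
}
```

The design choice being illustrated: callers depend only on `Orbit`, so the interface is fixed before anyone knows whether the implementation behind it will be simple or nasty - the opposite of deriving the abstraction from the phenomenon.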

But isn't it all a little weird? In software, often you're not starting with the world - you're creating a world for other people to live in. People building frameworks (like Java) or operating systems compete on whose world is better (never mind that better for the programmer and better for the customer can be two different things). And that competition is more often defined by religiously held views than by sober analysis. So everybody goes out and tries to get people to religiously hold views favorable to them. If you've created a world, you've got to get people to go live in it, particularly if you hope to make them pay for the privilege, which is the typical way you manage to create a world and feed your family at the same time.

The reason for all the weirdness is, it's not a thing that has a template. Human beings haven't been in the business of creating worlds for very long. As the last vestige of the Roman empire, the Catholic church is the closest thing we have to a reliable template for effective abstraction creation (please don't think me a conspiracy theory believer or anything similar because of that metaphor - sometimes a cigar really just is a cigar!). We're not good at it yet. We don't know the best practices for world creation, if there even are any (and anybody who knows them is either long dead or not about to share them if they aren't). Witness the huge number of web application frameworks out there. Which one is the best? Well, the best at what? It's not quite as unsolvable as proving Shakespeare is better than Hawthorne. But it's close.

So it's a bit like every computer programmer has been given the task "Create the Catholic church, from scratch, and prove it will still be around in a thousand years before you start, if you want to be funded." Fertile ground for charlatans, but somewhere in there, some folks are probably going to do some stuff that does work, and will be observed to work and copied. But it doesn't happen fast and we started from zero within the last 50 years.

The real situation is, we're finding the rules as we go (with businesses that bet wrong as the casualties - but this is how market economies build infrastructure - through failure - consider the U.S. railroads, most of which were built by companies that went bankrupt [but left behind railroads], or ATM [bankomat] machines built by competing banks that consolidated).

The closest analogy might really be competition between various religions - and in fact, you get identifiably religious behavior: things like Microsoft bashing, obsession with not breaking object encapsulation, Free Software or Extreme Programming are the faiths. That still isn't a situation human beings have traditionally been in - Extreme Programming doesn't tell you if you should use Windows or something else; object- or aspect-oriented programming doesn't say do or do not use Extreme Programming here. There is no One True Way. So the complexity of the available worlds to live in approaches the complexity of the real world, except they don't map 1:1. Is it any wonder this industry is a bit chaotic? Is it any wonder this creates a market for freakish "software methodologies" (I believe there is one software methodology and it has three steps - hire smart people, give them a problem, and get out of their way while they solve it - but try to sell that to a $7/hr HR recruiting agency...)?

The result of all of it is that software has all the benefits of science, and all the handicaps of the humanities. There is an inevitable time delay between creating an abstraction and having it prove itself effective or ineffective - you have to ship the software, maybe several times, before you have real evidence. In that gap, people will behave as they do in the humanities - nothing is provably good or bad, and people will be faddish, place their bets and behave accordingly. Creating a thing and proving a thing are not the same. Right now, the software business emphasizes creating over proving, because there are not enough things that really are proven, and it's a business - you have to place your bets somewhere.

There never was any other way to go.
