The End of the Beginning?
I love Java. I love writing Java code. I've even
written a Java book. I've used zillions of
programming languages and Java is the one I like the best. But there's a question that's
been nagging at me lately: Does Java, or any programming language, really matter any more?
Having been in the computer business for a very long time - the first computer I ever worked with used
punched cards and was as big as my first apartment - I've seen a lot of changes in the nature
and public perception of computing. And I can't shake this feeling that we're in the midst
of a profound, evolutionary change.
In his now famous article IT Doesn't Matter, Nicholas G. Carr makes the
argument that the IT industry is maturing and has become effectively irrelevant to a company's
competitive strategy. In and of itself, Information Technology no longer has any intrinsic
competitive advantage. It's simply one of the essential ingredients needed to run a modern
corporation - like electricity, telephones and foosball tables.
But my nagging feeling is not
so much about business trends as it is about similar evolutionary forces acting on the technology of
software itself. Carr refers to earlier technologies that have transformed industry (steam engine,
railroad, telephone, etc). Throughout their lifecycles, each has initially transformed - or
disrupted - their environment and then themselves been transformed because of the changes they
caused. I believe software development is entering such a mid-life, transformative
stage - it's changing because of the changes it's caused in business and society.
Computers are no longer exotic, science-fictional wonders; they're a part of our everyday lives. And, as
the off-shoring trend of recent years has made excruciatingly clear, programming is no longer a
rarefied skill exclusive to a gifted few. To a large extent it's become a routine trade that
almost anyone - anywhere - can perform. A commodity.
But the shifting economics of earning a
living as a programmer is not what I'm talking about either. That's a political and
sociological discussion. The thing that's nagging at me has to do with the programming itself.
The way we as humans communicate what we want a computer (or a system of computers) to do.
Programming has historically been inextricably linked to computer hardware design. Assembler languages gave us a
one-to-one translation of mnemonic symbols to the numeric codes that a CPU understands. Higher level
languages brought a more semantically rich set of symbols, and a more complex compiler to do the
translation, but we're still essentially expressing ourselves in terms a CPU can understand.
Object technology helps us conceptualize things a little better. We don't feel so constrained to the
lock-step, linear instruction execution model. But when you look inside an object method, it's
still made up of those fine-grained marching orders for a CPU. We still seem to be tied to the
paradigm of writing text that's translated to the CPU's native language.
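To make that point concrete, here's a minimal sketch (the class name `Add` is invented for illustration): even an abstract, high-level Java method compiles down to a linear sequence of CPU-style instructions, which you can inspect with the JDK's `javap -c` tool.

```java
// A trivial Java method. Even at this level of abstraction, the compiled
// form (viewable with `javap -c Add`) is still a linear stream of
// fine-grained marching orders for a (virtual) CPU:
//   iload_0, iload_1, iadd, ireturn
public class Add {
    static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        System.out.println(add(2, 3)); // prints 5
    }
}
```

The names change (bytecodes instead of opcodes, a virtual machine instead of silicon), but the shape of the expression is the same lock-step instruction model.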
I've been wondering lately if this notion, abstracted though it may be, of people writing text that's then
translated into opcodes isn't something we've outgrown. It's the 21st Century but the
way we as programmers communicate instructions to computers hasn't changed fundamentally since
Grace Hopper started doing it nearly 60 years ago.
I saw a demo at SD West 2003 of the work
Intentional Software (http://www.intentionalsoftware.com/) is doing and have read a little
about the Ace and Jackpot (http://research.sun.com/projects/jackpot/) projects at Sun, all of which are
exploring alternate ways of representing code. But I've actually been intrigued by this concept
since I decided to switch to IntelliJ IDEA (http://www.intellij.com/) as my Java development environment.
IntelliJ was the first IDE I used that was designed around the concept of language understanding rather than
text editing (it may not have been the first; it doesn't matter; it was the first one I
encountered). IntelliJ is such a great tool to use because it understands the Java language. It
instantly checks code as you type it both syntactically (is it well-formed) and semantically (does
this make sense in the context in which you're using it). It's the latter that's really
valuable - manipulating text is not what I care about, it's getting the code right that matters.
As I got up to speed with IntelliJ, I noticed a subtle but profound shift in my thought process. I
began to think of coding less as a text editing activity and more as an object crafting activity.
Because the tool understood the language I could trust it to do the mundane editing tasks while I
concentrated on the higher-level meaning of the code. This has far-reaching implications.
Rather than thinking of the code as lines of text, I now conceptualize it as chunks of functionality that
can be split, combined or reshaped as needed. The chore of making changes to the text files is
handled by the tool. I'm able to devote my concentration to the more important aspects of
programming: responsibilities, relationships, cohesion, coupling, patterns, and all the other stuff
that makes for well engineered software. And when I see that something needs to be changed I
don't hesitate because the tool can make the change easily - even if that change may affect a
hundred source files. This makes for better code and it makes me a better coder.
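As an illustration of what "chunks of functionality that can be split, combined or reshaped" means in practice, here's a hedged sketch (the `Invoice` class and its methods are invented for this example) of the result of an extract-method refactoring, the kind of structural edit a language-aware IDE performs mechanically and safely, even across many files:

```java
// After an "extract method" refactoring: the summing logic, once inlined
// in totalWithTax, is now its own named chunk that the tool can rename,
// move, or inline again on request -- no manual text munging required.
public class Invoice {
    double[] lineItems = {10.0, 20.0, 12.5};

    // Extracted chunk: one responsibility, independently reusable.
    double subtotal() {
        double sum = 0;
        for (double item : lineItems) sum += item;
        return sum;
    }

    double totalWithTax(double taxRate) {
        return subtotal() * (1 + taxRate);
    }

    public static void main(String[] args) {
        System.out.println(new Invoice().totalWithTax(0.5)); // prints 63.75
    }
}
```

The point isn't the arithmetic; it's that the unit of thought is the named chunk and its responsibility, not the characters that happen to spell it.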
Tools like this use a sophisticated internal model to represent the code. Where there's a model,
there's usually a view of that model. And if there's one view, why can't there be
others? When a tool can present your code to you as UML diagrams, color selectors, tables, live GUI
widgets or whatever, the textual view we're all so familiar with starts to feel rather quaint.
So this raises the question: If tools can liberate us from mundane text editing
and we code so much better when interacting with a more holistic model, why do we bother to keep
the text file representations at all? In today's world, it certainly seems as though building
programs by manually munging text files is one of the least efficient - and most expensive - ways to
go about it.
I don't know what the next great innovation in programming will be. It may be
here already for all I know. But I do know that the labor-intensive, text-based programming method
we've been using for the last half century will wane - it's simply not going to be
economically viable much longer.
A friend of mine once pointed out that the majority of
business programming is done not in Java, C/C++, Cobol or even Visual Basic, but in Microsoft Excel.
I wouldn't consider a spreadsheet a programming language, but it certainly is an effective way
for humans to communicate to a computer what they want done.
This, I think, is where the
future lies. Finding the most appropriate way to communicate for any given problem domain. The era
of humans translating their thoughts into sequential steps for the CPU to execute is ending. Just as high level
languages supplanted assembler because they were more expressive and hid more of the details of the
execution environment, I believe the next phase will be to move beyond the textual representation of
programming logic for most applications.
Of course, conventional programming languages are not
going away overnight. I suppose what I'm really talking about here is a new sedimentary layer
being laid down atop what's come before. The future is always built on the foundation of the
past. The programming languages of today will beget the next generation of programming tools.
I believe we've reached the end of the beginning of the computer revolution. But we're nowhere
near the end. Just as printing technology evolved from letterpress to linotype to offset to laser
printer, the nature of programming must also evolve. It's going to be interesting to see which
mutations survive in this brave new Darwinian world. In any case, I'll be sad when I finally
have to say goodbye to my old friend, Java.