Nicholas Carr’s article (and now book) “Does IT Matter?” caused a stir when it was first published. Some aspects of Carr’s argument — that information technology is a more mature industry than it once was — made sense; other points — that somehow innovation is dead, there’s nothing new under the sun, and all the technology industry faces today is an unending vista of cost-cutting and cutthroat commoditization — were at best unprovable and more likely dead wrong. (Chad Dickerson had some good commentary on Carr here and here.)
Carr wrote a perfectly reasonable op-ed for the New York Times last week about Microsoft’s humongous dividend give-back as an indication of the company’s middle age. I agree with much of the piece, but one passage caused my jaw to drop ground-ward:
“Software never decays. Machinery breaks down, parts wear out, supplies get depleted. But software code remains unchanged by time or use. In stark contrast to other industrial products, software has no natural repurchase cycle.”
Software never decays? Carr is a Harvard Business Review veteran, and I assume he works with computer software every day, as most of us do — but I can’t imagine such a sentence being written by anyone who uses a personal computer or runs a software-dependent business (which means virtually any business today) for any extended period of time.
In the abstract, of course software doesn’t decay the way a pair of garden shears grows dull from use or an automobile engine loses compression over time. Abstract code shows no frictional wear. But the notion of “decay-free software” is as divorced from everyday reality as the notion of a “friction-free market”: Both exist only in the vacuum-space of the professional economist.
In truth, while well-written software can often lead an extraordinarily long and fruitful life (I am storing the fruits of two years of book research in a 2 megabyte Ecco Pro file, in outlines composed in a program that has not been upgraded or modified since around 1997), most software today begins to rot from the moment of first use.
And the most notorious piece of decay-prone software is the one Microsoft’s billions are founded on. Windows begins to accumulate barnacles of cruft in the registry the moment you first crank it up and try to use it to do anything. If you are a typical user, after two or three years of regular use your operating system will be grinding to a halt, crushed by the weight of the junk your various applications and Windows have together conspired to scatter across your directories. I know plenty of people who choose to buy a new computer not because they necessarily need some new hardware feature or upgrade but because they have given up on trying to save an ailing Windows installation — and reinstalling Windows is enough to send most people screaming toward the nearest Dell ad.
So while Carr may be technically correct — that software code does not “decay” the way a blade dulls — he is, by any pragmatic view, dead wrong. From the user’s perspective, software almost always decays — it stops doing what you want it to do, or you try to do something it is supposed to do and find that you can’t. The more you use it, the more likely it is to break, because the more likely you, as a cantankerous and unpredictable human being, are to do something the programmers haven’t imagined you would do.
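Any programmer will recognize the pattern. Here is a toy sketch (hypothetical code, not from Carr or anyone else) of how software “decays” without a single byte changing: the code works perfectly for the inputs its author imagined, and breaks the first time a real user wanders off that path.

```python
# A "working" config parser that quietly assumes every line
# looks exactly like "name=value".
def parse_config(text):
    settings = {}
    for line in text.splitlines():
        if not line.strip():
            continue
        key, value = line.split("=")  # assumes exactly one "=" per line
        settings[key.strip()] = value.strip()
    return settings

# Fine for the inputs the programmer imagined:
print(parse_config("color=blue\nsize=large"))
# {'color': 'blue', 'size': 'large'}

# Then a cantankerous, unpredictable human puts an "=" inside a value,
# and the untouched, "undecayed" code falls over:
try:
    parse_config("greeting=hello=world")
except ValueError as err:
    print("crash:", err)
```

The code itself is exactly as it was on day one; what changed is what people asked of it. That gap between the imagined user and the actual user is where the rot lives.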
Tim O’Reilly’s writings about software as a service outline this basic truth: Most software today “has people inside.” Software decay is so universal that every piece of software needs a corps of developers to keep the rot in check — whether you’re talking about a Web service like Amazon or Google, where the programmers deliver upgrades through the Web site; or a custom business application, where the programmers work for the software company or the client company; or a consumer application, where the programmers provide users with a constant stream of patches and upgrades to keep the bugs at bay.
This is a bad thing if you are dreaming of a world of perfect software. But it certainly keeps a lot of programmers employed. And it’s a more natural model for a world in which we don’t expect perfection, but hope for steady improvement.