Everyone knows that Moore’s Law says chips will double in speed and/or halve in price every 18 months — right? But as with so many things that everyone knows, this is at best a wild oversimplification, and really just wrong. As Gordon Moore originally expressed his famous idea about the exponential progress of the semiconductor industry, the notion was that our ability to cram transistors into the same on-chip real estate would double at a regular interval (over time he tinkered with the length of that interval, from one year to two; the popular 18-month figure was never actually his).
I was delighted recently to come across this piece in ExtremeTech — in which two analysts from Gartner discuss whether Moore’s Law has been, on balance, a blessing or a curse for the computer industry — because, among other things, the article gives us all one more reminder about the actual meaning of the concept.
This isn’t splitting hairs. The distinction is important because, if all Moore really said was that you were going to be able to make denser and denser chips, that left the industry with an existential choice of whether to use that capability to drive prices down or performance up. Moore’s principle did not map a straight upward path for his industry; instead, it laid out a crossroads — or rather a sequence of crossroads, one after another.
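The arithmetic behind that crossroads is simple enough to sketch. The numbers below (a 24-month doubling interval, a $100 chip, a million transistors) are hypothetical stand-ins of my own; only the doubling math itself comes from Moore:

```python
# A minimal sketch of the two ways to spend the Moore's Law "bounty".
# All starting figures are illustrative assumptions, not industry data.

def doublings(years: float, period_months: int = 24) -> int:
    """How many density doublings fit in a span of years."""
    return int(years * 12 // period_months)

def cost_if_performance_held(start_cost: float, years: float) -> float:
    """Road one: build the same chip, and let its cost halve per doubling."""
    return start_cost / (2 ** doublings(years))

def transistors_if_price_held(start_count: int, years: float) -> int:
    """Road two: hold the price flat, and let the transistor count double."""
    return start_count * (2 ** doublings(years))

# Ten years at a 24-month interval is 5 doublings: a $100 chip could
# fall to $3.125, or a 1M-transistor chip could grow to 32M transistors
# at the same price.
print(cost_if_performance_held(100.0, 10))       # 3.125
print(transistors_if_price_held(1_000_000, 10))  # 32000000
```

The point of the sketch is that the exponent is the same in both functions; the industry's choice is only about which denominator or numerator it lands in.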
The ExtremeTech piece is worth reading for its discussion of these choices of how to expend the Moore’s Law “bounty” (to borrow the fine word that Charles Simonyi applies to it).
One of the Gartner analysts, Claunch, added: “If we were to make a PC run at the same speed as the original 8086 PCs, they’d probably cost about 10 cents to make. But nobody could afford to be in the business of selling PCs at 10 cents each. So instead, we have to use a different strategy. That different strategy is this: Pump twice as much stuff into the box, and if you do that, you can at least hold your price flat.”
But mostly I’m just grateful for the article’s refusal to oversimplify. Over the years I have had any number of arguments with editors so eager to apply Occam’s Razor and explain Moore’s Law “so anyone can understand it” that they’re willing to be inaccurate. (I worked hard to get it right in Dreaming in Code, which was largely about why software has had such a hard time keeping up with hardware — and thus why, even though we have chips unfathomably faster than those we used two decades ago, we rarely get our work done much faster.)
Sometimes, in my more cynical moments, I think that if there were a Moore’s Law for journalism, it would go like this: Every couple of years, our collective ability to maintain subtle distinctions and fine gradations of meaning collapses by half.