Wordyard

Hand-forged posts since 2002

Big software: time to give up?

January 22, 2005 by Scott Rosenberg

On today’s New York Times op-ed page, Nicholas Carr of “Does IT Matter?” puts the FBI software meltdown in the context of other recent enterprise-scale software train wrecks like McDonald’s Innovate and Ford’s Everest (he could have dragged in the IRS, too). As everyone does who addresses this topic, he references the Standish Group “Chaos” report from 1994, with its dire statistics about software failure.

As with his previous arguments on the topic, Carr gets things just about half right: Of course the record of large-scale software projects, particularly those meant to replace existing systems that are functional but graying, has been awful, and the complexity of these systems remains daunting. Carr concludes that the complexity is so overwhelming we should give up on innovating in software and just concentrate on doing the same things we currently do more efficiently: “When it comes to developing software today, innovation should be a last resort, not a first instinct.”

He’s forgetting that, in the world of software, innovation is the primary way to add value. We move existing “off-line” systems and processes into software not only to make them more efficient, but to give them capabilities the physical world can’t provide. Thus online publishing isn’t just about delivering text and images more cheaply; it’s about connecting publisher and information consumer in new ways that change the whole relationship. Manufacturers implement inventory control systems not just to save money but to transform their businesses so they can build products when customers ask for them, rather than trying to guess what the market needs. If software isn’t providing new capabilities, why bother? It seems obvious that we’re a long way away from exhausting the possible new wrinkles software can offer business, government and society.

Carr is right that large institutions get into trouble when they try to replace big old systems and introduce complex new features at the same time. But his advice — give up on those new features, be happy with what you’ve got — is needlessly ostrich-like. The answer is not to abandon change but to structure change so that it’s not a big bang but an evolutionary process. The failures in so many of these software disasters don’t stem from ambition but from impatience and bad planning.

Filed Under: Dreaming in Code, Software

Rhymes with Mombasa

January 18, 2005 by Scott Rosenberg

Picasa, my favorite Windows photo-organizing software, has a great new upgrade from Google, which acquired the company a while back.

The funny thing here is that, though you will find Picasa referred to here and there as a “service,” it’s not really that; it’s an old-fashioned, standalone desktop application with a bit of sharing coated on top. I’m not stating that as a criticism — I love Picasa, and it’s helped me keep track of the absurd quantities of photos of my kids I’ve taken over the past five years. Much as I love Flickr, there’s no way I’m going to upload that volume of photos across the Net.

It’s just odd to think of Google, the locus classicus of the new world of distributed web-based computing, doing this sort of product, and just giving it away. John Battelle has more here, noting that there is no business model of any kind behind Picasa. That worries me, only because I’d really like to keep using this software for a long, long time.

Filed Under: Software, Technology

More core

January 13, 2005 by Scott Rosenberg

Thanks for all the thoughtful comments on my post about software keeping up with multi-core processing.

Jon Udell posts some more on the subject in response to a Register piece by Shahin Khan and a comment by Patrick Logan.

Systems with 7000 CPUs? Do we then need 7000 heatsinks? Or do we consider it a feature and throw away our heaters? (I know, he’s talking about a “miniaturized big-iron system” for users to share — but the hardware advances that go into the servers first usually end up on the desktop a couple of years later.)

Filed Under: Software, Technology

Multi-core competency

January 10, 2005 by Scott Rosenberg

Fascinating piece by Herb Sutter, The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software, says that, with processor clock speeds plateauing short of 4 GHz, and the processor universe moving to “multi-core” designs to squeeze better performance from chips, software developers are going to have to learn a whole new ballgame.

Predictions that Moore’s Law is going to hit a wall have regularly proven mistaken over the past decade or two, but that doesn’t mean that this time they’re wrong too, and the news from Intel et al. over the past year suggests that the stall in processor-speed increases is real. So the hardware firms’ “multi-core” plan means that the next generation of processor speedsters will try to gain their oomph not by running one processor’s queue of instructions faster — that’s become tough as higher speeds have meant more heat, more power use, and more energy leakage (all, obviously, connected phenomena) — but rather by running multiple queues.

In layman’s terms: If your corner store experiences huge growth in customer volume, it can keep its one cashier working harder and faster, but only up to a point. Once that person hits his limit, the only way you can move more customers out the door faster is by adding a second register. (Unless you completely change the rules, by, say, asking the customers to check themselves out — in this comparison, the technology equivalent of “invent a new processor paradigm” to bust open the Moore’s Law logjam once more.)
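The second-register move can be sketched in a few lines of Python (the store, registers, and customer counts here are invented for illustration): the standard library’s thread-safe queue plays the single line, each thread plays a register, and adding registers means adding threads rather than changing the checkout logic.

```python
import queue
import threading

def checkout(num_registers, num_customers):
    """One shared line of customers, drained by num_registers cashier threads."""
    line = queue.Queue()  # queue.Queue is safe to share across threads
    for customer in range(num_customers):
        line.put(customer)

    receipts = [[] for _ in range(num_registers)]  # one tally per register

    def cashier(register):
        while True:
            try:
                customer = line.get_nowait()
            except queue.Empty:
                return  # line is empty; this register closes
            receipts[register].append(customer)  # "ring up" the customer

    threads = [threading.Thread(target=cashier, args=(i,))
               for i in range(num_registers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return receipts

print(sum(len(r) for r in checkout(2, 8)))  # 8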

In my everyday example, the “coordination cost” is fairly low — you just have to assume that the customers will figure out how to organize themselves into two separate lines. Or maybe if your store’s set up the right way you can have one line feed both registers. To adapt software to the multicore universe, though, Sutter’s analysis suggests, the costs are more complex, and programmers need to get good at thinking about a new set of problems — otherwise software won’t be able to take advantage of the new chips, and programs designed by developers who don’t really understand the new world will fall into new kinds of traps like “races” and “deadlocks.” Sutter writes that “The vast majority of programmers today don’t grok concurrency, just as the vast majority of programmers 15 years ago didn’t yet grok objects.” So maybe there’ll be work for programmers after all!
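The “race” trap Sutter names can be shown in a minimal sketch, assuming plain Python threads (the counter and the counts are invented here): incrementing a shared total is really read-add-write, and without coordination two threads can interleave those steps and silently lose updates; the lock serializes them.

```python
import threading

def count_with_lock(n_threads=4, bumps=50_000):
    """Increment a shared counter from several threads, safely."""
    total = 0
    lock = threading.Lock()

    def worker():
        nonlocal total
        for _ in range(bumps):
            with lock:  # serialize the read-modify-write on total
                total += 1

    threads = [threading.Thread(target=worker) for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total

print(count_with_lock(4, 50_000))  # 200000
```

Drop the `with lock:` line and the same program can come up short of the expected count; and if two such locks are ever acquired in opposite orders by two threads, each can wait forever on the other, which is Sutter’s “deadlock.”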

Meanwhile, when Intel decides that multi-core is what the public must buy, look for it to push software vendors to rewrite popular applications in new versions marketed under whatever ad-friendly moniker the new multi-core architecture is festooned with. (We went through this with MMX in the mid-’90s and again, on a smaller scale, with Centrino.) “Multi-core” and “hyperthreading” are sexless technical terms, so we can expect trademarks like “Maxium” or “CoreSwarm” and slogans like “Two is better than one!” or “The Power of Many” (no, wait, that’s taken).

The typical user will say, “Why do I need this stuff? My word processor is fast enough and my Web pages load fine.” But within three years the new architecture will be standard anyway, and within ten years the world will actually find something to do with the new processor power — like, say, distribute the work of 23 million video mashup artists simultaneously to your desktop, then catalog them and re-edit them according to your preferences on the fly! And the Silicon Valley cycle will grind forward.

Filed Under: Software, Technology

Ecco unchained

December 14, 2004 by Scott Rosenberg

Ecco Pro — the outliner/PIM that I have written about periodically and am still using today, despite the fact that it has been orphaned by its owners and not modified since 1997 or so — looks like it may be released as open source. (Thanks to Andrew Brown for the link.) Whether this means that the heart of Ecco will be transplanted by enterprising programmers into some newer, modern body — or just that Ecco devotees will have an opportunity to tweak and debug the trusty application — it’s wonderful news, if it actually happens.

Filed Under: Software, Technology

Spolsky in Salon

December 9, 2004 by Scott Rosenberg

I’ve been an admirer of Joel Spolsky’s writing on software since I started reading it several years ago. Last month when I was in New York I sat down with Joel and had a good long talk about software development, partly for the purpose of my book research and partly because I knew he’d be entertaining and thoughtful. Today’s Salon features a write-up of the interview, pegged in part to the publication of a book collection of Spolsky’s essays.

Filed Under: Dreaming in Code, Personal, Salon, Software, Technology

Yahoo: Please fix MusicMatch!

September 15, 2004 by Scott Rosenberg

Now that Yahoo has acquired MusicMatch, maybe they can fix the software.

I am a long-term user: I’ve paid for the product several times over the years, its basic interface works better for me than iTunes, and I’m used to it; I don’t want to change. But: In trying to turn a good music client into a boffo music store, MusicMatch has repeatedly broken its software. Most recently, the thing crashes whenever I try to copy a CD to my hard drive. This same bug existed for months last spring, then MusicMatch finally fixed it — now it has reappeared in the latest update. (The “volume leveling” feature, which would be highly useful if it worked, has also always crashed.)

Frankly, MusicMatch, I don’t care about your store. I just want your software, once the best of its kind, to work.

Filed Under: Software, Technology

Fun with Flickr

September 1, 2004 by Scott Rosenberg

At the O’Reilly Emerging Technologies conference earlier this year I was lucky enough to get a demo of Flickr, the photo-sharing software and service from Ludicorp. (The company’s president, Stewart Butterfield, is married to Caterina Fake, who did great design work here at Salon several years ago.) At the time I thought it was a neat little photo-sharing tool, but it seemed a little heavy on the Flash, which sometimes makes my head ache, and life got busy and I never got around to exploring it further. Since then Flickr has won much acclaim, and when I needed to figure out a simple way to share photos from a recent family trip, I thought I’d give it another spin last night. Turns out it has evolved beautifully since my introduction to it, and I ended up playing with it for hours, so let me now belatedly add my enthusiasm to the chorus.

It’s an exquisitely well designed Web application, certainly one of the best I’ve ever seen, full of smart interface choices and nice little finishing touches that let you know that the developers who’ve built it are also heavy users of their own handiwork.

Tiny example: I noticed Flickr was dating the photos based on the date I uploaded them, so I went in to change a bunch of dates to reflect when the photos were taken. The page contained this helpful message: “The date posted is the date & time you physically published your photo on Flickr, not the date the photo was taken. We are currently storing the date that your photo was taken in the database, so rest assured you won’t need to modify every photo later… There will soon be a way to sort your photos based on the date the photo was taken. Stay tuned!” So I didn’t waste my time. That’s what I call a considerate piece of software. And along the way you learn that Flickr is respectfully storing each photo’s metadata (date, type of camera used, all that EXIF stuff that you almost never need to look at, except when you do).
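That two-timestamp distinction is easy to see in a toy sketch (the photo records below are invented for illustration): sorting the same photos by date posted and by date taken yields different orders, which is why keeping both in the database matters.

```python
from datetime import datetime

# Hypothetical photo records, each carrying both timestamps
photos = [
    {"title": "beach", "taken": datetime(2004, 8, 14), "posted": datetime(2004, 8, 31)},
    {"title": "cabin", "taken": datetime(2004, 8, 12), "posted": datetime(2004, 9, 1)},
    {"title": "hike",  "taken": datetime(2004, 8, 13), "posted": datetime(2004, 8, 30)},
]

by_posted = [p["title"] for p in sorted(photos, key=lambda p: p["posted"])]
by_taken  = [p["title"] for p in sorted(photos, key=lambda p: p["taken"])]

print(by_posted)  # ['hike', 'beach', 'cabin']
print(by_taken)   # ['cabin', 'hike', 'beach']
```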

It’s easy to get started with Flickr, and then when you want to push it and do more with it, it leads you gently into its depths. It has a whole layer of social software — profiles, groups, and so forth — but since its primary function is photo sharing, that social software actually has a raison d’être, so you don’t just sit there (as with so many other ventures in this area) and wonder “Now that we’re here and we know each other’s hobbies and marital status, what exactly do we do?”

I am generally distrustful of using Web applications as anything more than conveniences for away-from-home access. I want my data close at hand, and most Web interfaces are still too clunky to allow for fast and complex organizing of serious quantities of stuff. But I’m seriously thinking about making Flickr my photo home base — it’s that good. And if Flickr’s speedy evolution in a mere six months is any indication, the thing is going to improve — and grow — at an intense rate.

Filed Under: Software, Technology

Software rot

July 27, 2004 by Scott Rosenberg

Nicholas Carr’s article (and now book) “Does IT Matter?” caused a stir when it was first published. Some aspects of Carr’s argument — that information technology is a more mature industry than it once was — made sense; other points — that somehow innovation is dead, there’s nothing new under the sun, and all the technology industry faces today is an unending vista of cost-cutting and cutthroat commoditization — were at best unprovable and more likely dead wrong. (Chad Dickerson had some good commentary on Carr here and here.)

Carr wrote a perfectly reasonable op-ed for the New York Times last week about Microsoft’s humongous dividend give-back as an indication of the company’s middle-age. I agree with much of the piece, but one passage caused my jaw to drop ground-ward:

  Software never decays. Machinery breaks down, parts wear out, supplies get depleted. But software code remains unchanged by time or use. In stark contrast to other industrial products, software has no natural repurchase cycle.

Software never decays? Carr is a Harvard Business Review veteran, and I assume he works with computer software every day, as most of us do — but I can’t imagine such a sentence being written by anyone who uses a personal computer or runs a software-dependent business (which means virtually any business today) for any extended period of time.

In the abstract, of course software doesn’t decay the way a pair of garden shears grows dull from use or an automobile engine loses compression over time. Abstract code shows no frictional wear. But the notion of “decay-free software” is as divorced from everyday reality as the notion of a “friction-free market”: Both exist only in the vacuum-space of the professional economist.

In truth, while well-written software can often lead an extraordinarily long and fruitful life (I am storing the fruits of two years of book research in a 2 megabyte Ecco Pro file, in outlines composed in a program that has not been upgraded or modified since around 1997), most software today begins to rot from the moment of first use.

And the most notorious piece of decay-prone software is the one Microsoft’s billions are founded on. Windows begins to accumulate barnacles of cruft in the registry the moment you first crank it up and try to use it to do anything. If you are a typical user, after two or three years of regular use your operating system will be grinding to a halt, crushed by the weight of the junk your various applications and Windows have together conspired to scatter across your directories. I know plenty of people who choose to buy a new computer not because they necessarily need some new hardware feature or upgrade but because they have given up on trying to save an ailing Windows installation — and reinstalling Windows is enough to send most people screaming toward the nearest Dell ad.

So while Carr may be technically correct — that software code does not “decay” the way a blade dulls — he is, by any pragmatic view, dead wrong. From the user’s perspective, software almost always decays — it stops doing what you want it to do, or you try to do something it is supposed to do and find that you can’t. The more you use it, the more likely it is to break, because the more likely you, as a cantankerous and unpredictable human being, are to do something the programmers haven’t imagined you would do.

Tim O’Reilly’s writings about software as a service outline this basic truth: Most software today “has people inside.” Software decay is so universal that every piece of software needs a corps of developers to keep the rot in check — whether you’re talking about a Web service like Amazon or Google, where the programmers deliver upgrades through the Web site; or a custom business application, where the programmers work for the software company or the client company; or a consumer application, where the programmers provide users with a constant stream of patches and upgrades to keep the bugs at bay.

This is a bad thing if you are dreaming of a world of perfect software. But it certainly keeps a lot of programmers employed. And it’s a more natural model for a world in which we don’t expect perfection, but hope for steady improvement.

Filed Under: Dreaming in Code, Software

The robot heart of software

July 19, 2004 by Scott Rosenberg

Isaac Asimov was one of the science fiction authors whose works I avidly consumed when I was in my early adolescence, and though even then I could tell that his writing lacked a certain level of nuance and style, I loved it for its cleverness and its imagination. Standing at the podium at science fiction conventions, expounding on any subject under the sun, he was like a polymath Woody Allen with the neurosis circuits disabled, and his optimistic rationalism — even in the 1970s, an era during which optimism was hard to make credible — was infectious. (Read Cory Doctorow’s appreciation of Asimov in Wired for more.)

So I don’t think I’ll be able to bear going to see the new movie “inspired by” his “I, Robot” stories — those inventive chestnuts about what happens when robots programmed with “the three laws of robotics” tangle with the chaos of human affairs. (Chris Suellentrop in Slate offers an overview of how the movie betrays Asimov that makes me feel my decision is completely logical.) But I was glad to read this editorial in the Sunday New York Times, which thoughtfully nailed exactly what made these stories such fun:

  Each of the stories in “I, Robot” works out a problem in the application of these laws, usually caused by an unforeseen implication or contradiction. Asimov’s robots are perfectly logical, and therefore all the real problems are caused by humans, who are shockingly unaware of the way their intentions and emotions run counter to logic. What look like manufacturing flaws in the robots nearly always turn out to be faults in the way a command was articulated. Humans, it turns out, are mainly good at bossing other humans around. Our computers remind us of this every day.

The “I, Robot” stories, in other words, are exercises in logical debugging that happen to take the form of miniature mysteries.

Saying “the real problems are caused by humans” is, of course, awfully close to saying, “It’s the user’s fault!” — an excuse that conscientious software developers and designers shun. Yet, as I dig deeper into work on my book about software, I’m learning a lot more about exactly how hard it is to make the absolute logic of computing serve the messy ambiguities of human desire, when all the pressure of the undertaking is to make things work the other way — to force us human beings to conform to the rigorous precision of machines. Asimov’s wonderful stories pre-imagined this dilemma for us. Maybe someday he’ll find a filmmaker who can do his particular imagination justice.

Filed Under: Culture, Dreaming in Code, Software
