Whenever there is an article about software failure, there is a quotation from the venerable CHAOS Report — a survey by a Massachusetts-based consultancy called the Standish Group, first conducted in 1994 and regularly updated since. The CHAOS Report presented dire statistics about the high failure rate of software projects: 31.1 percent of projects cancelled, 52.7 percent “challenged” (completed only way over budget and/or behind schedule), and only 16.2 percent deemed a success.
There aren’t a whole lot of other statistics out there on this topic, so the numbers from Standish get big play. I used them myself in my book proposal, and returned to the report as I researched the book, interested in finding out more about the methodology the researchers used — and also curious about what “CHAOS” actually stood for: Combinatorial Heuristic Algorithm for the Observation of Software? Combine Honnete Ober Advancer…?
Nope. As far as I could tell, CHAOS is an acronym for nothing at all. I tried to contact Standish for more information by telephone and e-mail, but they never responded. It wasn’t essential to my work — there are only a handful of sentences on the subject in Dreaming in Code — so I didn’t push hard. I figured maybe this was the sort of consultancy that was only interested in paying customers.
In his Communications of the ACM column, Robert Glass describes running into the same wall:

Several researchers, interested in pursuing the origins of this key data, have contacted Standish and asked for a description of their research process, a summary of their latest findings, and in general a scholarly discussion of the validity of the findings. They raise those issues because most research studies conducted by academic and industry researchers arrive at data largely inconsistent with the Standish findings….
Repeatedly, those researchers who have queried Standish have been rebuffed in their quest….
Glass is a widely known and respected authority on the software development process (I read, and can recommend, his book Facts and Fallacies of Software Engineering as part of my book research), and the Communications of the ACM is the centerpiece journal of the computing field’s main professional organization. So maybe the Standish group will respond to the plea with which Glass closes his column:
[I]t is important to note that all attempts to contact Standish about this issue, to get to the heart of this critical matter, have been unsuccessful. Here, in this column, I would like to renew that line of inquiry. Standish, please tell us whether the data we have all been quoting for more than a decade really means what some have been saying it means. It is too important a topic to have such a high degree of uncertainty associated with it.
Indeed. The Standish numbers are precisely the sort of statistic that journalists in need of background “facts” — and scholars, too, for that matter — will quote in an endless loop of repetition, like Newsweek’s infamous stats showing that thirtysomething women were more likely to be killed by terrorists than to find husbands. The loop keeps repeating until someone provides a definitive debunking — and even then it doesn’t always stop.
So it’s ironic but hardly surprising to find the same magazine that contains Glass’s complaint also featuring a cover story on “The Changing Software Engineering Paradigm” that parrots the Standish numbers for the umpteenth time.