
Standish’s CHAOS Report and the software crisis

August 2, 2006 by Scott Rosenberg

Whenever there is an article about software failure, there is a quotation from the venerable CHAOS Report — a survey by a Massachusetts-based consultancy called the Standish Group, first conducted in 1994 and regularly updated since. The CHAOS Report presented dire statistics about the high failure rate of software projects: 31.1 percent of projects cancelled, 52.7 percent “challenged” (completed, but only way over budget and/or behind schedule), and a mere 16.2 percent deemed a success.
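To give those percentages some tangible scale, here is a minimal sketch in Python (the 1,000-project portfolio is my own illustrative assumption, not a Standish figure) that applies the 1994 breakdown and checks that the three categories account for essentially all projects:

    # Illustrative only: applies the 1994 CHAOS Report percentages to a
    # hypothetical portfolio of 1,000 projects (the portfolio size is assumed).
    chaos_1994 = {"cancelled": 31.1, "challenged": 52.7, "successful": 16.2}

    portfolio = 1000
    for outcome, pct in chaos_1994.items():
        print(f"{outcome:>10}: {pct:5.1f}% -> ~{round(portfolio * pct / 100)} projects")

    # Sanity check: the three reported categories sum to (essentially) 100 percent.
    assert abs(sum(chaos_1994.values()) - 100.0) < 0.1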

There aren’t a whole lot of other statistics out there on this topic, so the numbers from Standish get big play. I used them myself in my book proposal, and returned to the report as I researched the book, interested in finding out more about the methodology the researchers used — and also curious about what “CHAOS” actually stood for: Combinatorial Heuristic Algorithm for the Observation of Software? Combine Honnete Ober Advancer…?

Nope. As far as I could tell, CHAOS is an acronym for nothing at all. I tried to contact Standish for more information by telephone and e-mail, but they never responded. It wasn’t essential to my work — there’s only a handful of sentences on the subject in Dreaming in Code — so I didn’t push hard. I thought maybe this was the sort of consultancy that was only interested in paying customers.

In the August issue of Communications of the ACM, Robert Glass has a column about Standish and the CHAOS Report that suggests my failure to get a response from this organization was hardly unique.

Several researchers, interested in pursuing the origins of this key data, have contacted Standish and asked for a description of their research process, a summary of their latest findings, and in general a scholarly discussion of the validity of the findings. They raise those issues because most research studies conducted by academic and industry researchers arrive at data largely inconsistent with the Standish findings….
Repeatedly, those researchers who have queried Standish have been rebuffed in their quest….

Glass is a widely known and respected authority on the software development process (I read, and can recommend, his book Facts and Fallacies of Software Engineering as part of my book research), and the Communications of the ACM is the centerpiece journal of the computing field’s main professional organization. So maybe the Standish group will respond to the plea with which Glass closes his column:

it is important to note that all attempts to contact Standish about this issue, to get to the heart of this critical matter, have been unsuccessful. Here, in this column, I would like to renew that line of inquiry. Standish, please tell us whether the data we have all been quoting for more than a decade really means what some have been saying it means. It is too important a topic to have such a high degree of uncertainty associated with it.

Indeed. The Standish numbers are precisely the sort of statistic that journalists in need of background “facts” — and scholars, too, for that matter — will quote in an endless loop of repetition, like Newsweek’s infamous stats showing that thirtysomething women were more likely to be killed by terrorists than to find husbands. The loop keeps repeating until someone provides a definitive debunking — and even then it doesn’t always stop.

So it’s ironic but hardly surprising to find the same magazine that contains Glass’s complaint also featuring a cover story on “The Changing Software Engineering Paradigm” that parrots the Standish numbers for the umpteenth time.


Filed Under: Dreaming in Code, Software, Technology

Comments

  1. Deborah Hartmann

    August 25, 2006 at 3:15 pm

Hello, I thought you might like to read an interview with the Standish Group’s Jim Johnson.

    I interviewed Jim this week, after reading the Robert Glass column in Communications of the ACM. I, too, wanted to know if I’d been misusing their stats :-) so I asked. Jim was quite forthcoming; I hope you enjoy the article.

    Interview: Jim Johnson of the Standish Group
    http://www.infoq.com/articles/Interview-Johnson-Standish-CHAOS

    ciao!
    deb

    Deborah Hartmann
    Agile Community Editor
    http://www.InfoQ.com

  2. Lee Fischman

    May 8, 2009 at 12:07 pm

    Since the CHAOS results are so controversial, I’ve decided a grassroots effort might be interesting. I created a single-question survey here:

    http://www.surveymonkey.com/s.aspx?sm=oOq7Hzgz6BCYZfgZoNC72w_3d_3d

    And will be posting results here:

    http://swprojectsurvey.blogspot.com/

Trackbacks

  1. 8 Project Success Factors Every Successful PM Knows - Project Manager News says:
    March 9, 2020 at 5:17 pm

    […] Chaos Report from 2006 says that 45% of features built are never used and that only 20% of features were used often or always. So, even though you didn’t get everything your customers wanted, ask if you have still met their requirements. If not, what didn’t you do? What features are critical? Assessing what stakeholders are and are not happy with helps measure overall success. […]
