I listened to this interview yesterday with BP director Robert Dudley on the News Hour:
ROBERT DUDLEY: …The blowout preventers are something that are used on oil and gas wells all over the world, every well. They just are designed not to fail with multiple failsafe systems. That has failed. So, we have a crisis.
…JEFFREY BROWN: Excuse me, but the — the technology — the unexpected happened. And so the question that you keep hearing over and over again is, why wasn’t there a plan for a worst-case scenario, which appears to have happened?
ROBERT DUDLEY: Blowout preventers are designed not to fail. They have connections with the rig that can close them. When there’s a disconnection with the rig, they close, and they’re also designed to be able to manually go down with robots and intervene and close them. Those three steps, for whatever reason, failed in this case. It’s unprecedented. We need to understand why and how that happened.
The failsafe failed. It always does. “Designed not to fail” can never mean “certain not to fail.” There is no such thing as “failsafe” — just different degrees of risk management, different choices about how much money to spend to reduce the likelihood of disaster, which can never entirely be eliminated.
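To see why "reduced" can never become "eliminated," here's a toy calculation (my numbers, purely illustrative, nothing to do with BP's actual equipment): stack three independent failsafe layers and the odds of total failure shrink multiplicatively, but they never reach zero.

```python
# Toy risk arithmetic: assume each of three independent failsafe
# layers fails with probability 0.01 (a made-up figure).
layer_failure = 0.01
layers = 3

# Disaster requires every layer to fail; with independence
# that probability is layer_failure raised to the number of layers.
p_disaster = layer_failure ** layers
print(p_disaster)  # roughly one in a million: small, but not zero
```

And real layers are rarely independent: a shared cause (a single power loss, one bad maintenance practice) can take them all out at once, so the true risk is higher than this optimistic estimate suggests.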
Two different social attitudes conspire to lead us to disasters like the Gulf spill. On the one hand, there is the understandable but naive demand on the part of the public and its proxies in the media for certainty: How can we be sure that this never happens again? Sorry, we can’t. If we want to drill for oil we should assume that there will be spills. If we don’t like spills, we should figure out other ways to supply our energy.
On the other side, there is what I’d call the arrogance of the engineering mindset: the willingness to push limits — to drill deeper, to dam higher — with a certain reckless confidence that our imperfect minds and hands can handle whatever failures they cause.
Put these two together and you have, rather than any sort of “failsafe,” a dynamic of guaranteed failure. The public demands the impossibility of “failsafe” systems; the engineers claim to provide them; and everything is great until the inevitable failure. Each new failure inspires the engineers to redouble their efforts to achieve the elusive failsafe solution, which lulls the public into thinking that there will never be another disaster, until there is.
I wrote about these issues as they relate to software in Dreaming in Code. But at some point the need to understand this cycle demands a more artistic response.
May I suggest you give a listen to Frank Black’s “St. Francis Dam Disaster,” a great modern folk song about a colossal engineering failure of a different era.