A Catastrophic Accident of Normal Proportions
On June 8th, Houston Representative John Culberson released an open letter to President Obama, criticizing the administration’s six-month moratorium on deep water oil drilling. “I am concerned that the decision to impose the moratorium is based more on emotion than fact,” he wrote. “The Deepwater Horizon incident was a terrible human tragedy with devastating environmental consequences, but it must be viewed in the proper historical context as a statistical anomaly. The government’s own records show that since 1985, more than 7 billion barrels of oil have been produced in federal offshore waters with less than 0.001 percent spilled – a 99.999% record for clean operations. That 25-year record of safety should not be ignored in the haste to respond to public discord.”
He’s certainly right about the public discord. After intense pressure, BP has just yielded to demands it postpone paying corporate dividends. This, after a very public summons of company leaders to the White House—which has taken its own hits from both left and right. An unofficial BP public relations Twitter feed has been savaging the company for weeks (after BP asked its anonymous author to make it clearer this was not an official BP feed, the site’s bio, updated daily, now reads “We are not at all associated with Beyond Petroleum, the oil company that has been destroying the Gulf of Mexico for 53 days”). And as for investors, shares of BP have plummeted—last Wednesday, they were trading at a low of $29 a share, less than half their value from before the spill.
But what about Culberson’s statistics? Through what historical context should we view the spill? Is it a “statistical anomaly”? Put another way, how do we define the risks associated with deep offshore oil drilling?
Over the past few weeks observing the Gulf spill, I’ve been thinking a lot about Charles Perrow’s 1984 classic _Normal Accidents_. Perrow argues that despite the usual efforts to implement safety regimes, certain kinds of “high-risk technologies” create “a form of accident that is inevitable” (3). He calls these accidents “normal,” in the sense that they are intrinsic to the nature of complex technological systems like nuclear power plants, genetic engineering, and the shipping of toxic materials. No matter how carefully a system is designed, and no matter how intensively its workers are trained, Perrow argues, certain kinds of very complex technological systems will always have accidents, because no one can possibly foresee all the ways the parts of the system interact. For Perrow, normal or systems accidents “involve the unanticipated interaction of multiple failures” (70).
Thus Perrow’s argument contrasts the conventional way of understanding large accidents with a more novel one. According to Perrow, “[c]onventional explanations for accidents use notions such as operator error; faulty design or equipment; lack of attention to safety features; lack of operating experience; inadequately trained personnel; failure to use the most advanced technology; systems that are too big, underfinanced, or poorly run” (63). (With regard to the Gulf spill, watch for all of these explanations over the coming months.) _Normal Accidents_ posits instead that the very nature of large, complex technological systems makes accidents inevitable.
What makes normal accidents in these systems dangerous is the combination of two factors. First, there is their complexity. Simple, or what Perrow calls linear, systems are easy to understand—each part does one thing, and it’s easy to imagine what would happen should any part of the system stop working. Complex systems, on the other hand, have non-linear interactions—parts serve multiple purposes, and the way one part relates to another (or another and another) is not always clear and, indeed, for Perrow, can never be known absolutely.
Second, there’s the degree of coupling—how much one part of the system influences another part. In loosely coupled systems, there’s lots of time between changes in one part of the system and the effects of that change on another (and thus time to think about what’s going on); the order in which things happen is more variable; the design is more flexible; and problems can be addressed without bringing down the whole system. In contrast, in a tightly coupled system, there’s little time between changes in one part of the system and the effects of that change on another (and thus little time to think about what’s going on); the order of operations is largely fixed; the design itself is constrained; and problems can shut the whole system down.
The crux of the argument is that in complex, tightly coupled systems, accidents are inevitable, and there’s little time or ability to address them as they happen. And some of those systems—like nuclear power, and I think like deep offshore oil drilling—may have very good rates of reliability by any conventional measure (as suggested by Rep. Culberson), but the consequences of a system failure are so huge, we should be wary about relying on them in the first place. As Perrow summarizes, “[s]ystem accidents are uncommon, even rare; yet this is not all that reassuring, if they can produce catastrophes” (5). As the Washington Post’s David Weigel, who reported on Culberson’s letter, put it on his blog, “That 0.001% has been a bit of a problem, hasn’t it?”
As Perrow notes, by understanding normal accidents “we might stop blaming the wrong people and the wrong factors, and stop trying to fix the systems in ways that only make them riskier” (4). In the case of deep water drilling, that might mean rejecting solutions that involve larger, more complex rigs—rigs with more safeguards, more elaborate “blowout preventers,” or offset wells drilled in tandem with production wells. Such measures only create more opportunities for unforeseen interactions while offering the illusion of safety. Instead, we might seek to hasten a shift to a more diversified and flexible portfolio of energy sources while simultaneously devising ways to reduce demand.
Of course, this is easier said than done. But over the past forty years, too little has been done to address the problem of growing petroleum demand besides looking for oil in ever more remote places or using environmentally questionable methods of extracting additional oil or gas from long-abandoned fields. Perhaps the investments in those offshore rigs might be better spent catalyzing the research and development of wind, solar, and other alternative energy sources.
The President has exclaimed (serenely, as is his style) that since the spill began, he’s been talking with experts “so I know whose ass to kick.” While the President may soon find his leg getting tired, the framework of normal accidents suggests this narrow kind of accountability will ultimately be insufficient. The problem is in the nature of the system.
Peter Shulman is Assistant Professor of History at Case Western Reserve University. You can read more about the Gulf oil spill at his blog, http://www.crimesagainstclio.com/.
Charles Perrow, _Normal Accidents: Living with High-Risk Technologies_ (Princeton: Princeton University Press, 1984).