Deepwater Horizon, Chernobyl, San Bruno, Challenger, Three Mile Island. These names rarely need an explanation. It is common knowledge that they stand for catastrophic events that not only took lives and caused enormous environmental damage, but also changed entire industries.
They are called Rare Events.
Even though they are rare, these events have things in common. One is a false sense of security: because a disaster like this has never happened before, it is assumed it never will. Is that why they happened? That is certainly one way to look at it. They occurred even though they were statistically improbable, their timing could not have been predicted, and their scale lay beyond what anyone could imagine.
The Social Perspective
Jan Hayes, a professor at RMIT University in Melbourne, Australia, argues that rare events can always be traced to social and organizational circumstances. The failure to adequately apply knowledge that already exists, whether inadvertently, in good faith, or deliberately, is what leads to these rare events. In the San Bruno pipeline rupture of 2010, for example, investigators identified misguided resource allocation and poor priority-setting in the years before the incident. The result: an entire suburb destroyed, eight people killed, and dozens injured. The pattern resembles the Deepwater Horizon disaster of the same year, in which 11 workers died and roughly 4.9 million barrels of crude oil flowed into the Gulf of Mexico over a months-long spill. Official reports found that the spill resulted from human and technical failings, which were themselves the product of systemic causes such as cost-cutting decisions and inadequate safety systems.
Many theorists believe the problem is not technology but people: a chain of critical mistakes made by humans operating complex technological and organizational systems. Most rare events were profoundly surprising to the companies involved. Viewed from a broader perspective, however, they were no surprise at all, because they originated in well-known human, organizational, and regulatory failures.
Jan Hayes is certain: to prevent devastating catastrophes, one has to understand their human, structural, and organizational causes. Rare events carry information that, in most cases, we already possessed and could therefore use to keep such catastrophes from happening again.