
This slide, presented by Dr. Mark Fleming during his excellent presentation in Vancouver, piqued my interest, so I looked for a bit more information. I found this interesting observation in a paper by Gonzales and Sawicka:

The role of risk perception is particularly interesting. First, performance in both safety and security settings is well characterized by the “unrocked boat” metaphor: Organizations become accustomed to their apparently safe state, thus misperceiving risk and allowing themselves to drift into regions of greater vulnerability, until (near) accidents temporarily induce greater risk awareness. The resulting pattern is oscillatory, with varying amplitude and typically leading to disaster.
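The drift-and-correction pattern Gonzales and Sawicka describe can be sketched as a toy simulation. The dynamics and all parameters below are my own illustrative assumptions, not taken from their paper: awareness decays while nothing goes wrong, low awareness lets risk drift upward, and a near-miss temporarily restores wariness, producing the oscillation they mention.

```python
# Toy model of the "unrocked boat" dynamic. All parameters are
# invented for illustration; this is a sketch, not the authors' model.

def simulate(steps=200, decay=0.97, drift=0.02, threshold=0.8):
    awareness, risk = 1.0, 0.0
    history = []
    for _ in range(steps):
        awareness *= decay               # complacency: wariness erodes
        risk += drift * (1 - awareness)  # low awareness lets risk drift up
        if risk > threshold:             # a near-miss occurs
            awareness = 1.0              # wariness is temporarily restored
            risk *= 0.5                  # but some vulnerability remains
        history.append((awareness, risk))
    return history

hist = simulate()
```

Run over 200 steps with these assumed parameters, the trace shows several cycles of slow drift punctuated by resets, the oscillatory pattern the quote describes; because risk is only halved at each near-miss, the boat never returns fully to its starting state.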

The above quote seems to describe the situation on the Deepwater Horizon. Perhaps there was a sense of invulnerability among some employees (including managers), and finishing the job took precedence over safety. As Mark Fleming remarked in his presentation, offshore workers know their employer is in business to produce barrels of oil, not barrels of safety. Concerns about production (or, in this case, the timely suspension of the well) can easily supersede concerns about safety.

A very important paper by James Reason, who created the "Unrocked Boat" diagram, had this to say:

The same cultural drivers (time pressure, cost-cutting, indifference to hazards and the blinkered pursuit of commercial advantage) act to propel different people down the same error-provoking pathways to suffer the same kinds of accidents. Each organization gets the repeated accidents it deserves. Unless these drivers are changed and the local traps removed, the same accidents will continue to happen.

Reason goes on to recommend a data collection program that is currently absent, at least on an industry-wide basis:

In the absence of sufficient accidents to steer by, the only way to sustain a level of intelligent and respectful wariness is by creating a safety information system that collects, analyzes, and disseminates the knowledge gained from accidents, near misses, and other sources of ‘free lessons.’

I would suggest that another way to sustain wariness is to present information on past accidents and why they can happen again. How many industry employees know what happened at Santa Barbara, Bay Marchand, Main Pass 41, Ixtoc, the Alexander Kielland, Ocean Ranger, Brent B, South Pass 60 B, and even Piper Alpha?

Finally, Reason reaches this critically important and completely relevant conclusion (keep in mind that this paper is 12 years old):

It need not be necessary to suffer a corporate near-death experience before acknowledging the threat of operational dangers, though that does appear to have been the norm in the past. If we understand what comprises an informed culture, we can socially engineer its development. Achieving a safe culture does not have to be akin to a religious conversion, as it is sometimes represented. There is nothing mystical about it. It can be acquired through the day-to-day application of practical down-to-earth measures. Nor is safety culture a single entity. It is made up of a number of interacting elements, or ways of doing, thinking and managing, that have enhanced resistance to operational dangers as their natural by-product.
