The Ethical Lessons of Deepwater
For engineers, playing it safe is never the easy way out. In early December 2010, two groups tasked with deconstructing the deadly and devastating Deepwater Horizon spill in the Gulf of Mexico released their reports.
The National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling, appointed by President Obama, and the Deepwater Horizon Study Group (DHSG), formed by members of the Center for Catastrophic Risk Management (CCRM) at UC Berkeley, pointed toward many of the same failures.
The DHSG’s 60 university professors, accident investigators, petroleum engineers, social scientists, environmental advocates, and directors of research centers went one step further, however, by directly linking mismanagement by the well’s owner, BP, to its drive for profit. "Analysis of the available evidence indicates that when given the opportunity to save time and money—and make money—tradeoffs were made for the certain thing—production—because there were perceived to be no downsides associated with the uncertain thing—failure caused by the lack of sufficient protection."

With oil and gas development in the deep waters of the Gulf, Arctic, and other new frontier areas set to continue, the DHSG also contends that such exploration and production carry "likelihoods and consequences of catastrophic failures, that are several orders of magnitude greater than previously confronted."
Pondering the Worst Case
The prospects of failures far more severe are chilling. Yet Lehigh University Professor John Kenly Smith, a chemical engineer who specializes in the history of technology, believes that forcing stakeholders to ponder the absolute worst is the only way to grapple with what’s really at stake. "If you are going to work in an environment where it’s physically impossible to go down there and get your hands on the technology, you really have to think of the unthinkable and nobody wants to do that," says Smith. "I’ll bet every day on that platform there were engineers thinking, ‘If we have a blowout on this thing what will we do?’"
What have we learned in the months since the worst that could happen, in fact, did? Perhaps not much that’s new, says Smith, who believes some of the safety failures that led to the disaster stem from what’s predictably human and imperfect in all of us. What’s also clear is that engineers who design and maintain complex systems are in a tough spot. Here, Smith cites a few lessons of the spill:
1. Numbers can be deceiving. "There’s tremendous pressure in the corporate and scientific worlds to convert uncertainty to risk," says Smith. Take an uncertainty, assign it a probability, then run it through a model to estimate how likely a failure might be. The problem, though, says Smith, who during his career in industry investigated a number of serious job-related accidents, is that "999 times, people get away with doing unsafe things, and it’s only the 1,000th time that something horrible happens." (The arithmetic behind that intuition is sketched just after this list.)
2. Safety has to be hardwired into a firm’s SOP. Smith cites the success of companies like DuPont—the subject of a book he coauthored, Science and Corporate Strategy: Du Pont R&D, 1902–1980—in rewarding teams with the best safety records. "You have to really drill it into people and create counterincentives that make them stop and say ‘Will I cost everyone their prize if I get hurt?’" A hard-core safety-first stand can also relieve the tension between the line functions that bring in the money and the staff people (i.e., the engineers who raise the red flags). This is where ethics come in, says Smith: "The staff functions and engineers need to have the clout to make themselves heard."
3. Simplicity has its virtues (i.e., technical controls can create a false sense of security). The jury is still out on why the Deepwater Horizon blowout preventer failed. Even if the results of the investigations lead to future fixes, blind faith in technology can itself create a false sense of security.
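
To put rough numbers on Smith’s 999-in-1,000 observation, here is a minimal Python sketch of how a shortcut that "works" 99.9% of the time still fails almost surely when repeated. The per-operation failure probability and the operation counts are illustrative assumptions, not figures from the spill investigations.

```python
# Rough illustration of Smith's point that "999 times, people get away
# with doing unsafe things": a small per-operation failure probability
# compounds toward near-certain failure over many repetitions.
# The value of p is an assumed, illustrative number, not a figure
# taken from the Deepwater Horizon investigations.

p = 1 / 1000  # assumed chance that one unsafe shortcut ends in failure

for n in (100, 1_000, 10_000):
    # P(at least one failure in n independent operations) = 1 - (1 - p)^n
    p_any = 1 - (1 - p) ** n
    print(f"{n:>6} operations -> {p_any:.1%} chance of at least one failure")
```

Run as-is, this prints roughly 9.5% for 100 operations, 63.2% for 1,000, and effectively 100% for 10,000. The unsettling part is the first line: an organization can watch a hundred operations go off without incident and conclude, wrongly, that the shortcut is safe.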