The underwater camera that gave the world a picture window on the massive oil spill resulting from the failure of BP’s Deepwater Horizon rig no longer shows a black plume spreading into the sea. The oil leak that so dominated the news for months, it seems, has finally been stopped.
But the questions about this disaster—how it happened, how it was handled, what we can learn from it—figure to linger for years to come.
For more than 25 years, the Wharton Risk Management and Decision Processes Center has been studying low-probability, high-consequence events like the Gulf oil spill, and how we might do a better job managing them. Late this summer, its two co-directors sat down with Wharton Magazine to discuss the spill, as well as how BP and the U.S. government responded.
Howard Kunreuther is the Cecilia Yen Koo Professor of Decision Sciences and Business and Public Policy. Robert Meyer is the Gayfryd Steinberg Professor of Marketing. Both are co-directors of the Wharton Risk Center.
When a catastrophe like the oil spill in the Gulf of Mexico occurs, you are uniquely well prepared to think about how organizations perform. Given what you know, what do you think of the way BP handled this situation?
Robert Meyer: When a company is large and complex and dealing with a high-risk business such as oil drilling, negative events will occasionally happen. The key critique of BP in this case is not that the spill occurred, but why it occurred—that it was a foreseeable consequence of a culture that tolerated the tradeoff of safety for expedience in many of its operations. What made the disaster all the more puzzling was that it occurred while BP was still paying for a previous string of self-made disasters: a refinery explosion in Texas City in 2005, the 2006 Prudhoe Bay spill, and the Thunder Horse rig that almost went down in the Gulf in 2005. One would think that these events would have induced BP to change how it approached risk management in ways that might have precluded the Deepwater Horizon sinking, but apparently they did not. BP would not have been blamed nearly as much if the spill had been caused by external factors that could not reasonably have been anticipated, such as a novel terrorist attack. But that was not the case here; it was an event that BP brought upon itself in the face of recent evidence that something might be amiss in its operations.
Prof. Kunreuther, do you share Prof. Meyer’s concern that BP appears to have missed the lessons of its past?
Howard Kunreuther: I certainly share Bob’s view that there was evidence from the past that should have given cause for concern. I think what often happens with these low-probability events is that one tends to focus on the fact that BP had been successful drilling in the past, even though it had also had some disasters.
In many situations, organizations have a tendency to believe that if an event falls below a certain threshold level of concern, it is not going to happen to them. They also recognize that if they don’t drill, they may be losing out with respect to their own profitability, and this may dominate the firm’s thinking. I don’t have any definitive answers as to what BP actually did in this particular case, but I want to give you an analogy involving NASA, which demonstrates very similar behavior. If you look at the Challenger and the Columbia space shuttle disasters, in both cases there were warnings of potential problems before the launches. In the case of the Challenger, Morton-Thiokol had raised questions about problems with the O-rings and other dangers of flying the Challenger in cold weather. In the case of the Columbia disaster, Elizabeth Paté-Cornell, a risk analyst from Stanford, had undertaken a study indicating potential tile problems prior to the accident. In both instances, NASA ignored these warnings in the hope that a disaster would not happen, even though the risk analyses suggested it should have paid attention.
Meyer: This indicates how bad companies—and individuals as well—can be at learning from negative events. For example, BP’s Thunder Horse platform rig almost sank during Hurricane Dennis in 2005 in the Gulf of Mexico. It turned out afterwards that the reason for the near-disaster wasn’t the hurricane, but rather a small mistake BP made in a pipe fitting on the platform. The fact that the platform was able to survive both a hurricane and their own construction mistake may have taught BP the wrong lesson: that these platforms are pretty robust devices, capable of withstanding an awful lot of stress, hurricanes and whatever. That belief, in turn, may have encouraged them to accept more risk. Hence, ironically, the fact that the Thunder Horse platform survived may have increased the odds that the Deepwater Horizon would not.
How do companies manage around that? If we were to take precautions against every risk we saw, then many things would never be done.
Kunreuther: The key question you are posing is whether organizations can do this on their own, or whether we really need some public/private partnerships. If a certain company is the only one taking precautions, it will suffer when no disaster occurs, because it is incurring the costs of protective measures without getting any of the benefits. With respect to the financial crisis, many firms were aware that they were operating in a manner that could have severe financial repercussions. At the same time, they felt that if they were too cautious and did not participate, they might lose the opportunity to reap large potential profits. Well-enforced regulations that everyone understands can create a level playing field in these situations. In the case of the BP disaster, the regulatory agency, the Minerals Management Service, has been criticized for its lack of oversight of the oil industry. So one answer to your question is: enact regulations and make sure that everyone understands they are there for good reasons. The regulations need to be well enforced, and the penalties need to be sufficiently severe that the regulations are adhered to.
Meyer: I think it’s worth mentioning that in this case we’re not talking about an event that was unimaginable or unforeseeable, such as an asteroid hit. The Alaska spill, the Texas City refinery explosion and this incident are all situations where the mishap stemmed from the company’s failure to follow its own procedures, its decisions to cut costs, and so forth. That has been the basis for the fines that have been levied. In all of these cases, the company admitted to mistakes. In each instance, the company recognized very real risks and put procedures in place to mitigate them, yet chose to set aside those procedures and take on riskier behavior. Just as Howard said, it could be a “well, everybody’s doing it” type of thing, and if you don’t take risks you could fall behind.
Can you talk about cost-benefit analysis? How might a management team weigh those costs?
Kunreuther: It would be very interesting to know whether or not BP did any kind of cost-benefit analysis (CBA). The question we always struggle with at the Wharton Risk Center is how to convince companies to do a formal analysis such as CBA, and what should be included in it. It isn’t just the cost of developing a protective measure—they also have to think about the reputation of the company and other long-run considerations. The firm might want to address a question such as “If we happen to have an accident like this, what are the consequences likely to be?” And we mean not just the fines and lawsuits that Bob was referring to following BP’s other disasters, but also the reputation of the company—how people will feel about BP in the future. My guess is that BP did not do a very systematic cost-benefit analysis.
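To make that kind of comparison concrete, here is a minimal, purely illustrative sketch of the expected-cost arithmetic a formal CBA might rest on. The probabilities and dollar amounts below are hypothetical assumptions chosen for illustration, not BP’s actual figures.

```python
# Illustrative expected-cost comparison for a low-probability, high-consequence risk.
# All figures are hypothetical; a real CBA would also model uncertainty, discounting,
# and harder-to-quantify factors such as reputational damage.

def expected_cost(p_disaster, direct_losses, fines_and_lawsuits,
                  reputation_loss, mitigation_cost=0.0):
    """Up-front mitigation spending plus the probability-weighted
    consequences of a disaster."""
    consequences = direct_losses + fines_and_lawsuits + reputation_loss
    return mitigation_cost + p_disaster * consequences

# Without the protective measure: a higher chance of a catastrophic blowout.
no_action = expected_cost(p_disaster=0.002,
                          direct_losses=20e9,       # cleanup and containment
                          fines_and_lawsuits=15e9,
                          reputation_loss=10e9)

# With the protective measure: $50M spent up front cuts the probability tenfold.
with_action = expected_cost(p_disaster=0.0002,
                            direct_losses=20e9,
                            fines_and_lawsuits=15e9,
                            reputation_loss=10e9,
                            mitigation_cost=50e6)

print(f"Expected cost, no precautions:   ${no_action / 1e6:,.0f}M")
print(f"Expected cost, with precautions: ${with_action / 1e6:,.0f}M")
```

Even with these made-up numbers, the point of the sketch is that the expected benefit of a precaution looks modest precisely because the probability is small, which is why leaving reputation and other long-run consequences out of the analysis can tip the decision the wrong way.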
Meyer: We mentioned that a culture of safety comes from the top. There’s also an issue of who does the cost-benefit analysis. BP is a very large, highly decentralized company, with a CEO in London and the people who have to live with the consequences of poor decisions living thousands of miles away. There is always the danger that this remoteness makes the firm more willing to accept risk; one wonders whether BP would have had different policies if the Deepwater Horizon had been drilling in the Thames.
Is there a company you could point to that deals well with these issues?
Kunreuther: The Wharton Risk Center has done a lot of work with the chemical industry over the years. The chemical industry has been very concerned about accidents. DuPont is an example. Their motto is: Safety is our most important product. They were in the dynamite business, and they knew they were dealing with something dangerous. The industry has been quite concerned about developing systematic risk analyses to deal with particular problems and about taking steps to reduce the likelihood of catastrophes.
There have still been severe disasters, notably the Bhopal accident that occurred in 1984 in India. That crisis pushed the industry to take stock of its risk management strategies with respect to catastrophic accidents. We actually did several studies here at the Center to understand the decision processes of a chemical company with respect to this disaster. The firm had taken a systematic approach to safety before the Bhopal disaster, but they were even more concerned about what protective measures to take after the disaster occurred. They spent considerable time examining the likelihood of certain accidents occurring and ways to reduce these risks. It would be interesting to learn whether there is a similar culture in the oil industry.
What are the lessons that the oil industry could learn from the Deepwater spill?
Meyer: One hopes it will underscore the need for improved enterprise risk-management processes. If nothing else, we hope it teaches that seemingly small decisions to accept slightly elevated levels of risk can produce catastrophic outcomes that could threaten the viability of an entire company.
Kunreuther: I want to come back to the point that Bob made earlier: it is interesting that BP had these prior disasters and, from all indications, did not learn from them—or didn’t learn enough. This Gulf Coast oil spill is much more severe than previous spills, so I would certainly agree that BP and other companies are likely to take some steps to reduce risks in the future. But I think the question is how long their cautionary behavior will last. In other words, if there are no disasters for a while, there is a tendency to forget and become complacent. Firms and individuals tend to think, “We’re putting in all this money on safety and we’re not getting anything back—we’ve invested a lot, and it looks like we are safer than we need to be.”
We have discovered analogous behavior with respect to insurance. People often purchase insurance after a disaster—not before the event. Then they frequently cancel their policies three or four years later when they haven’t filed a claim, because they conclude it’s really a bad investment. “We haven’t collected at all, and we haven’t had a flood or a hurricane, so why should we continue to purchase this insurance?” You cannot convince them that the best return is no return at all. They should celebrate not having had a loss, but they don’t.
I think firms are more sophisticated than individuals about risk management, but there’s still a tendency to forget past disasters. Go back to the Challenger/Columbia example. The Challenger should have been a red flag for NASA, yet the Columbia accident occurred. More generally, there are warnings today suggesting we may have to construct a regulatory system or some kind of public involvement that can provide the appropriate protection for the long term.
It appears that the agency charged with regulating the oil industry will be restructured. What will it take to assure that regulations will be enforced effectively in the future?
Meyer: I think that there’s got to be an understanding within the organization that the regulatory system is not the enemy. The philosophy of the company can’t be, “What’s the least we can do to make sure that we escape an OSHA fine?” Companies must recognize that regulations are for their own benefit and not in any sense punitive.
Kunreuther: If we’re going to have these regulations, and claim that they will be well-enforced, the public sector agency has to mean it. The regulatory agency needs to have the personnel to undertake third-party inspections and do the necessary audits. Otherwise, there will be a tendency by firms to cut corners.
The analogy that I often use when talking about the importance of well-enforced regulations is the parking meter problem. Do you put your money in the parking meter if you know that there’s a relatively small chance that you’re going to be caught? You may decide that it’s worth taking your chances. Now, if the fine is large enough, or if you recognize that there will be severe repercussions—along the lines of what we’ve had following the Gulf Coast oil spill—that may be enough to spur action. A manager might say: “I’m going to take precautions because the consequences are sufficiently large.” But if you feel the penalties are not large, you might decide that it’s worth taking chances.
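As a back-of-the-envelope illustration of that parking-meter logic, here is a small sketch comparing the cost of compliance with the expected penalty for non-compliance. The fee, detection probability and fines are made-up numbers, not figures from the interview.

```python
# Hypothetical parking-meter arithmetic: comply only when the expected penalty
# for non-compliance exceeds the cost of compliance. All numbers are illustrative.

meter_fee = 2.00   # cost of feeding the meter
p_caught = 0.05    # chance of getting a ticket if you don't pay

for fine in (20, 100, 500):
    expected_penalty = p_caught * fine
    decision = "feed the meter" if expected_penalty > meter_fee else "take your chances"
    print(f"fine ${fine:>3}: expected penalty ${expected_penalty:6.2f} -> {decision}")
```

With a small fine, the expected penalty stays below the cost of compliance, so taking your chances wins; only when the fine or the chance of being caught is large enough does the calculation flip, which is the point about penalties needing to be sufficiently severe.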
What about the possibility of a legislative backlash in the form of too many regulations? If what we have now doesn’t work very well and a regulatory environment with good enforcement is needed to create a level playing field, how do we achieve that?
Meyer: I think the danger is banning or putting great restrictions on offshore drilling. For anybody who lives in the states around the Gulf of Mexico, such as Texas and Louisiana, that’s the fear, because oil represents such a major part of their economy. One would hope that the lesson from this, from the government side, is that not only do there have to be stringent regulations in place—and there already are some—but there also has to be strict enforcement.
Kunreuther: I agree with Bob that the big danger is that we respond by preventing some activities that may yield net benefits to society. There may be an opportunity here to do an analysis of what the benefits are if you have well-enforced regulations and take protective measures beforehand. What would the consequences be for the environment as well as the economy?
I think one way to view this disaster from a positive perspective is to ask if Congress and regulatory agencies and the Obama Administration can use this as an opportunity to rethink our long-term strategy with respect to dealing with problems associated with climate change. We can ask what will be the short- and long-term consequences of a variety of different strategies rather than reacting precipitously, as we often do in crises, with regulations that are too stringent.
Meyer: I would also like to say that within companies it really has to start at the top, because you could go ahead and set all sorts of regulations and fines, but unless the company really recognizes that safety and following regulations are a top priority, it’s just not going to happen. The prospect of a fine and the kind of consequences suffered at Texas City were insufficient for BP as a whole to adopt a culture that says, “We’re not going to cut corners. We’re going to follow regulations.” It has to start at the very top, within the leadership of the company. We cannot just hope that somehow or other the government will come in, pass regulations and make companies do things they don’t want to do.
But what needs to be done in order to create that culture within the top leadership? Is it a matter of incentives or do you just hope that principled leaders will rise, or what?
Kunreuther: We need to find role models: examples of what companies have done, or failed to do, to reduce risks from low-probability, high-consequence events. Over the next year the Wharton Risk Center will be addressing this question: What do we mean by good leadership with respect to dealing with extreme events? We are undertaking this study with our colleague Mike Useem, who heads the Center for Leadership and Change Management, and will interview managers in organizations that have dealt with catastrophic risk in an effective manner, as well as some that have not done a good job. One finding from past studies is that individuals tend to be short-term oriented. We focus on how to deal with the next period or the next year, rather than thinking about the long run. As Bob was saying, you have to start at the top. We need to get leaders to say, here is a strategy for the company—we’re in this for the long run. How do we develop appropriate incentive systems to encourage managers to behave in a manner that will satisfy these long-run objectives? Managers will probably need short-term rewards to encourage them to think long-term, while understanding that part of their compensation will be contingent on what happens over the next few years.