It’s been a while since I quoted from The Black Swan on this blog. I presented this book at the September 2007 First Friday Book Synopsis. It still pops up in the “let’s discuss” and even some “best-seller” lists. Here’s my description from my handout:
• A Black Swan is an event with three attributes:
• First, it is an outlier. (rarity)
• Second, it carries an extreme impact.
• Third, in spite of its outlier status, human nature makes us concoct explanations for its occurrence after the fact, making it explainable and predictable. (retrospective, though not prospective, predictability).
In the New York Times Magazine this past Sunday, we find this article: Spillonomics: Underestimating Risk by David Leonhardt. Here’s an excerpt:
When an event is difficult to imagine, we tend to underestimate its likelihood. This is the proverbial black swan. Most of the people running Deepwater Horizon probably never had a rig explode on them. So they assumed it would not happen, at least not to them.
Similarly, Ben Bernanke and Alan Greenspan liked to argue, not so long ago, that the national real estate market was not in a bubble because it had never been in one before. Wall Street traders took the same view and built mathematical models that did not allow for the possibility that house prices would decline. And many home buyers signed up for unaffordable mortgages, believing they could refinance or sell the house once its price rose. That’s what house prices did, it seemed.
The point of the article is fairly clear: if we can’t imagine that something will actually happen, then we think it won’t ever happen. It does not matter that someone somewhere has said, “this might really happen.” If it’s beyond what we think can happen/will happen, then we act as though it will never happen.
Big mistake. Big mistake!
Consider Condoleezza Rice’s words from her testimony before the 9/11 Commission:

“No one could have imagined them taking a plane, slamming it into the Pentagon, into the World Trade Center, using planes as a missile.”
As I said to you in the private session, I probably should have said, “I could not have imagined,” because within two days, people started to come to me and say, “Oh, but there were these reports in 1998 and 1999. The intelligence community did look at information about this.”
To the best of my knowledge, Mr. Chairman, this kind of analysis about the use of airplanes as weapons actually was never briefed to us.
I cannot tell you that there might not have been a report here or a report there that reached somebody in our midst.
I think that Ms. Rice was being very truthful – in spite of the fact that there had been plenty of “imaginings” of this kind of attack. Tom Clancy had imagined something similar in a best-selling novel. There had been some reports discussing the possibility of just such an attack. Whether Ms. Rice had read the novel, or heard about the reports, is beside the point. Since it had never happened, it “never could happen” – at least, in our thinking. This is the black swan problem.
The oil rig disaster and the subprime mortgage meltdown are events that should tell us all: “we’d best expect the next unimagined possibility to actually happen at some point.”
It sort of reminds me of the opening paragraphs of M. Scott Peck’s The Road Less Traveled:
Life is difficult.
This is a great truth, one of the greatest truths. It is a great truth because once we truly see this truth, we transcend it. Once we truly acknowledge that life is difficult – once we truly understand and accept it – then life is no longer difficult. Because once it is accepted, the fact that life is difficult no longer matters.
Most do not fully see this truth that life is difficult. Instead they moan more or less incessantly….
Discipline is the basic set of tools we require to solve life’s problems.
Without discipline we can solve nothing.
So, I think we need to learn to think this way:
The Black Swan, the bad event we find impossible to believe will actually happen, will probably happen. The worst case scenario is very likely to happen – if not this time, then soon. Let’s prepare for it; let’s be ready for it; let’s not be surprised when it happens – because it will happen. The black swan will visit our company, our project, our life – and when it does, we should not be surprised.
Update: I just read this article, Countervailing power: After the BP catastrophe and financial market collapse: Taking back the sway big business has over our government by Robert Kuttner, which adds a few elements to the conversation. I do not think it takes away from my point — but it does help us understand why the people who are saying “this could happen” are not listened to very well.
More than half a century ago, the late economist John Kenneth Galbraith coined an important concept: “countervailing power.” Big business, Galbraith observed, had immense economic influence. But countervailing forces such as the trade union movement or activist citizens groups could neutralize that economic power by harnessing government to keep business’s less savory tendencies from overpowering its benign ones.
But that was then. Despite a seemingly formidable environmentalist movement, the oil industry overwhelmed its regulators. Americans for Financial Reform, the coalition of consumer groups pushing for better banking regulation, is outspent by Wall Street lobbyists by at least 100 to one.
There has been a lot of commentary lately contending that we have a tendency to underestimate risk. Truly catastrophic events occur only rarely – they are “black swans.” In the meantime, a lot of money can be made by betting that disaster won’t occur, or that it will occur on somebody else’s watch.
But who, in this account, is “we”? In fact, plenty of voices in the wilderness were warning against the risk of a catastrophic oil blowout, or a financial one. These critics did not lack prescience or insight. What they lacked was political power.
It’s true that technologies, both financial and oleaginous, are becoming ever more complex; and this does create new kinds of risk. But the cure is less technical than political.
Citizens need to act more vigorously to restore Galbraith’s countervailing power. Otherwise, private business acting in its short run self interest will ruin us twice — once when private markets pay no heed to the risks they are imposing, and a second time when they corrupt our regulatory institutions.
So far, engineers have been unable to seal the leaks that were discovered after the April 20 explosion of the Deepwater Horizon rig. Eleven rig workers are missing and presumed dead. Crude is leaking into the Gulf from three breaches in a pipe called a riser that once ran to the rig from the well under almost a mile of water. (from the Houston Chronicle, here).
The offshore drilling rigs off the coast of the United States are not required to include an “acoustic switch,” which can be triggered even from a lifeboat…
This Remote Activated Device, called an acoustic switch, is considered a weapon of last resort when it comes to sealing off ruptured offshore wells, but isn’t required on platforms operating under U.S. laws.
The acoustic switch is used by other industrialized nations, but is not considered mandatory by U.S. regulations. (from Reuters, here).
So here’s the thing. One of these days, something will go wrong. Really wrong. And the bigger the damage that can be caused when it does go wrong, the more important it is that there be backup systems, and then backup systems for failed backup systems.
And the reason we are still arguing over government regulations (safety, financial, and other) is that many companies will try to get by with the least expense “required by law.” When they can keep some requirement out of the law, they can get by with the lesser expense.
Until that really big thing goes wrong.
So – we come to the great oil spill of April 2010, following the explosion that cost 11 people their lives. It may do little good to cast blame. But it does a lot of good to ask, “could the oil leakage have been stopped?” And the answer might be yes. Yes, I said might. But I suspect BP wishes it had installed that $500,000 acoustic switch to see whether it might have worked.
How much oil are we talking about making its way to the shoreline of our country? No one knows for sure, but the experts have already raised their estimate of the leak at least four times, and at the current estimate this spill could potentially match the 11 million gallons lost by the Exxon Valdez. The worst case scenario could be much, much higher if they cannot stop the leak soon.
So, here are some lessons for people making decisions in business. And though I am thinking of the big decisions, where the consequences of something going wrong can be massive, the lessons might be valuable for us all. (Have you ever seen a speaker unable to get his/her technology working properly? Have you ever been that speaker?)
1. Expect, and plan for, the worst case scenario. Because it really, really might happen. It only took one Exxon Valdez for the environment of Prince William Sound’s Bligh Reef, and the surrounding areas, to be damaged for a very, very long time.
2. Budget for every possible backup system. Redundancy in backup systems was a necessity for NASA. Regarding this explosion and its aftermath, I heard one expert describe how the backup systems all failed (yes, there was more than one) – a truly rare occurrence, according to this expert. But now that we know there was one more that could have been included in the construction, don’t we all wish (from BP on down to all of us) that they had spent the extra $500,000 to put it in place and give it a chance to work?
3. Maybe the biggest lesson: regarding those folks who try to persuade us to let them build big projects by saying “we are confident that nothing will go wrong” – don’t believe them! They mean well. They are not intentionally misleading us. But they really can’t promise that nothing will go wrong. We learned that one the hard way – again.
This is truly a tragedy, with the loss of human life, and a crisis of monumental proportions, with the threat to the environment, the food chain, and jobs along the coast. I hope we learn the big lessons.