A common element in several recent man-made or man-assisted disasters has been the failure of risk models to predict them. The subject of risk models is a bit complicated, so let’s start with a simple definition.
Many events are caused not by one thing, but by many things working together. The position of the earth, humidity, wind, levels of CO2 and airborne particulates, conditions on the ground and a host of other factors all affect whether it’s going to rain, and how much.
Essentially, a risk model tells you how likely something is to happen. You set up your model as a set of mathematical equations built around measurable factors. You then run little thought experiments: What if we raise the temperature? What if we assume that local factories and cars are spewing out more CO2? What if we assume that it doesn’t rain for five years? Changing each of those factors gives a different answer.
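To make that concrete, here is a minimal sketch in Python of a toy rain-risk model. Every factor name, weight and the logistic form are invented for illustration; a real model would be far more elaborate, but the mechanics are the same: measurable inputs go in, a probability comes out, and changing an input changes the answer.

```python
import math

# Toy "rain risk" model: the factors, weights and logistic form are
# all hypothetical, chosen only to show how a risk model turns
# measurable inputs into a probability.

def rain_probability(temperature_c, humidity_pct, co2_ppm):
    # A weighted score: higher humidity and CO2 push risk up,
    # higher temperature pushes it down (weights are made up).
    score = 0.08 * humidity_pct + 0.002 * co2_ppm - 0.05 * temperature_c - 5.0
    # Squash the score into a 0..1 probability with a logistic curve.
    return 1.0 / (1.0 + math.exp(-score))

# Baseline scenario using "most likely" values for each factor.
baseline = rain_probability(temperature_c=20, humidity_pct=60, co2_ppm=400)

# Thought experiments: change one factor at a time and compare.
hotter = rain_probability(temperature_c=30, humidity_pct=60, co2_ppm=400)
more_co2 = rain_probability(temperature_c=20, humidity_pct=60, co2_ppm=600)

print(f"baseline: {baseline:.2f}, hotter: {hotter:.2f}, more CO2: {more_co2:.2f}")
```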
Engineers, insurance companies, economists, businesses and scientists all use risk models to estimate how likely something is to happen based on the most likely or common values for each factor. They then predict future behavior by changing those variables: How many sales will we lose if we raise the price of this new product? What will happen to traffic patterns if we double the size of the store we’re building? How will putting a tax on sweetened drinks affect consumption and tax revenues? What is the likelihood that Iran will have an atom bomb by 2015? Will it cost more to fix the very minor flaw in this piece of equipment or to defend the small number of lawsuits we can expect if we don’t fix it?
But here are some recent huge risk model failures:
- A risk model was used to estimate the likelihood of a spill given the technologies, equipment and maintenance schedule BP employed on the oil rig that is currently pouring about 19,000 gallons of oil into the Gulf of Mexico every day. That model said such an accident was highly unlikely.
- Risk models were used to package, sell and buy the sophisticated synthetic investments that sent the economy into the toilet two years ago. These models said it was almost impossible for certain combinations of investments to fail; when those combinations did fail, large banks collapsed or nearly did.
- For years, risk models helped state and federal budgeters decide they could put off fixing the New Orleans levees for yet another year. Other risk models said the levees were strong enough to handle even the rarest of storms. Those models proved wrong when the rarest of storms, Ms. Katrina, paid a visit to the Gulf coast.
Why did these risk models fail? We of course have to entertain the possibility that some situations are just too complicated to predict, but in many cases it’s not the model, it’s the numbers people feed into the model. If you put overly optimistic assumptions in, you will get an overly optimistic result. We saw an example of optimistic prediction in western Pennsylvania a few years back, when a local university put its name on a study concluding that passage of a funding referendum, which included building new football and baseball stadiums, would lead to the creation of thousands of jobs. But if you dug deep into the study, its conclusions rested on the assumption that every new industrial park funded by the referendum would fill to capacity, in townships that had seen population and business losses for more than 20 years. The assumptions were optimistic, so the model’s prediction was bound to fail.
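That failure mode is easy to reproduce in miniature. The sketch below, with every number invented for illustration, runs the same trivial jobs model twice: once with the optimistic assumption that every industrial park fills to capacity, and once with a more sober occupancy rate. The model never changes; only the inputs do.

```python
# A trivial jobs-forecast model run with two sets of assumptions.
# The model is identical both times; only the inputs differ (GIGO).
# Every number here is invented for illustration.

def projected_jobs(num_parks, jobs_per_full_park, occupancy_rate):
    # Jobs created scale with how full the new industrial parks get.
    return num_parks * jobs_per_full_park * occupancy_rate

# Optimistic input: every park funded by the referendum fills to capacity.
optimistic = projected_jobs(num_parks=10, jobs_per_full_park=500,
                            occupancy_rate=1.0)

# Sober input: in townships that have lost population for 20 years,
# perhaps a third of the new space actually finds tenants.
sober = projected_jobs(num_parks=10, jobs_per_full_park=500,
                       occupancy_rate=0.33)

print(f"optimistic forecast: {optimistic:,.0f} jobs")  # 5,000
print(f"sober forecast: {sober:,.0f} jobs")            # 1,650
```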
There are plenty of indications that in the three recent man-made or man-assisted disasters I described, the people involved were too optimistic in their predictions. And why? Because by being optimistic, they were able to make or save money in the short term. Every day, engineers, economists and others who work with risk models get pressure from their clients to come up with the results the clients want. So they fudge their estimates.
There’s nothing wrong with risk models, but there are plenty of old expressions that cover what happens when they are misused. The acronym GIGO comes to mind: “garbage in, garbage out.” And there is the line often attributed to Mark Twain: “Figures never lie, but liars figure.”