Probably the most frequently used definition of risk is this one:
Risk = Probability of something happening × resulting Cost/Consequences
This definition is flawed for two fundamental reasons, which the formula itself suggests very eloquently:
1. Estimation of probabilities of future events is very difficult (while it is considerably easier when talking of past events). Rare events have very low probabilities, and these are extremely difficult to estimate because the sample of available data is very small (what is the probability of an event similar to 9/11?). Since this factor multiplies the “cost” in the above equation, it is of paramount importance.
2. Estimation of the costs/consequences of these events is equally difficult. Even after a catastrophic event, it is hard to estimate the total damage and cost.
However, the most important flaw is hidden and it is conceptual. Imagine the following example, which was discussed at a recent conference on the so-called Black Swans (i.e. very rare events with dramatic consequences). Suppose that on a certain portion of a motorway a radar is mounted with the intent of catching those who enjoy speeding. If you’re driving 20–30 km/h above the limit, you can expect a fine of around 100 Euros. It is known from statistics that on that portion of the motorway, on average, one driver out of ten gets caught. The above formula would suggest that anyone who is speeding is risking 1/10 × 100 = 10 Euros. This is of course senseless. If you get caught, you pay 100 Euros; if you don’t, you pay nothing. In other words, if you drive fast, you’re risking a 100-Euro note, not a 10-Euro one. Try arguing with a policeman that your fine should be 10 Euros! Suppose that drivers were obliged to carry on board regulatory capital for fines (something similar to Basel II or III). Would it be 10 Euros in this case?
In this simple example, the cost was easy to estimate: it was a full 100 Euros. However, the probability p = 1/10 is irrelevant. If YOU get caught, you get caught, independently of the probability suggested by past statistics. For you, statistics begin the moment you start the engine. The past is irrelevant. So, if you get caught by the radar, the cost is 100 Euros whether p = 0.1, 0.001 or even 0.0000001. The damage is the same regardless of the probability you may conjure up. Things either happen or they don’t.
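A minimal sketch in Python (using the illustrative numbers from the example above, p = 0.1 and a 100-Euro fine, which are assumptions of the example rather than real statistics) of the point being made: the formula’s output of 10 Euros is an expected value, and no single trip ever actually costs that amount.

```python
import random

P_CAUGHT = 0.1   # illustrative probability of being caught (one driver in ten)
FINE = 100.0     # illustrative fine in Euros

def cost_of_one_trip(rng: random.Random) -> float:
    """Realised cost of a single speeding trip: either the full fine or nothing."""
    return FINE if rng.random() < P_CAUGHT else 0.0

rng = random.Random(42)
outcomes = [cost_of_one_trip(rng) for _ in range(20)]

print("formula's 'risk' per trip:", P_CAUGHT * FINE)   # 10 Euros
print("actual outcomes:          ", outcomes)           # each one is 0.0 or 100.0
```

Every simulated outcome is either 0 or 100 Euros; the 10-Euro figure produced by the formula never appears as a cost anyone actually bears.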
In the case of more complex situations, the above equation is even more disputable. Consider earthquakes, flooding or terrorist attacks. How can one possibly make an a priori estimation of the costs (or consequences)? How many people will perish? How many families will lose their homes? What will the impact on the economy be? How much will stocks fall? How long will it take to recover? What was the probability of 9/11 a year before it actually happened (we now know that it was 100%!)? What have the consequences been? Can any of this be estimated a priori in a meaningful manner if, even after the fact, it is difficult to measure the costs of catastrophic events? What, then, is the real and practical value of such calculations?
As an example, consider Hurricane Katrina. According to Wikipedia:
The total damage from Katrina is estimated at $92.6 billion (2012 US dollars).
According to a NY Times article:
The “total damages/costs” of Katrina were $140 billion (2012 dollars).
These two estimates alone differ by roughly 50%.
Another eloquent example comes from a FORTUNE article (by Nomi Prins, October 28, 2008) with the title “The Risk Fallacy”. The author writes: “Wall Street thought it had risk all figured out. But the very systems the banks created to protect themselves are at the heart of the financial meltdown. If you visit Lehman Brothers’ website today, more than a month after the bank’s plunge into bankruptcy, you can find the following words: The effective management of risks is one of the core strengths that has made Lehman Brothers so successful.”
The tightrope performer in the image at the top of this blog is risking 100% of his life, not 1% or 1/10% of it.
You liked what you read? Leave a comment.
See more posts from Jacek Marczyk (Ontonixqcm) at ontonixqcm.blog
Ontonixqcm, Probability × Consequence offers a simple formula to prioritize business vulnerabilities. It typically does not make sense to fix everything, especially with a limited budget. A risk formula may be a quick solution for that purpose.
I think you are mistaking a sample from a distribution for an estimator of central tendency. Effectively, you are missing the other times that the speeder didn’t get caught; overall, the cost of speeding is distributed over multiple instances (tending to infinity).
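A small companion sketch (same illustrative p = 0.1 and 100-Euro fine as above) of the commenter’s point: averaged over many repeated trips, the cost per trip does converge to 10 Euros, even though each individual trip still costs either 0 or 100 Euros.

```python
import random

P_CAUGHT = 0.1   # illustrative probability of being caught
FINE = 100.0     # illustrative fine in Euros

rng = random.Random(0)

def average_cost(n_trips: int) -> float:
    """Average realised cost per trip over n_trips simulated speeding trips."""
    total = sum(FINE if rng.random() < P_CAUGHT else 0.0 for _ in range(n_trips))
    return total / n_trips

for n in (10, 1_000, 100_000):
    print(f"average over {n:>7} trips: {average_cost(n):6.2f} Euros")
# The long-run average approaches P_CAUGHT * FINE = 10 Euros,
# while any single trip still costs either 0 or 100 Euros.
```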
Awesome – this is a paradigm changer. How do we propose to “calculate” risks? Because, if we don’t know ‘what’ the risk is, how do we get to know ‘what we do in response’?
In our approach we do not “calculate” risks. What we do is identify weak spots (in a corporation, an assembly line, an industrial process, or in an IT network, etc.) and help our clients focus on these spots, which concentrate fragility. It is more or less like a doctor who says: “drink less and do some sport”, instead of saying “your probability of getting a heart attack is 1.275%”. This last statement is ridiculous and nobody would find it credible. So, the recipe is to try to be as resilient as you can – especially where one is weak and fragile – and this will help you better face any adverse effects and circumstances, of which there is an infinite number out there. If you are in good health (i.e. resilient) you can better face thousands of diseases, but you don’t take thousands of vaccines, which is what modern risk management does.
Thanks for your response. To a guy like me, we seem to have now travelled the full circle – from good governance to ‘measure & manage’ to ‘focus on the basics’. Strong Governance is preventive. For sure.