Frank Romeike | RiskNET – The Risk Management Network
“I know that I know nothing” is a timeless saying attributed to the Greek philosopher Socrates by the Roman politician and philosopher Marcus Tullius Cicero. He probably meant that he undoubtedly lacked wisdom and all-embracing knowledge. True human wisdom, however, lies in being aware of what one does not know but needs to know.
But why is “the state of not knowing” so difficult to put into words? Quite simply: it does not exist as such. It is ludicrous that ignorance is so often cited as the antonym of knowledge. In today’s world, we certainly have knowledge deficits – for example, whether the universe is infinite or transient, or how viruses and bacteria affect the environment, animals and our bodies, both positively and negatively. Yet at the same time, we are being flooded with a tsunami of useless information. The amount of information – though not the amount of useful information – grows by 2.5 quintillion bytes each and every day. As a result, people find it ever harder to separate relevant signals from all the other distractions. In the corporate world, making decisions and taking action under uncertainty is an everyday event. Managing directors, executives and even politicians must weigh uncertain scenarios and make decisions day in, day out.
What is essential here is to be aware of the knowledge deficits, weigh different scenarios from various perspectives, and then reach a decision. This requires an assessment culture rooted in interdisciplinary, constructive discourse. The other option is to focus on a (desired) scenario and invent alternative facts, which, unfortunately, is the rule and not the exception in politics today.
The way that most companies typically evaluate risks today illustrates just how carelessly uncertainty is often treated. “Experts” evaluate each risk with a single probability of occurrence and extent of damage, as if they had seen the future in a crystal ball. There is a presumption of knowledge that does not exist.
Stochastic statements, in contrast, deliver a range of potential scenarios. We simply do not know which surprises the future holds. Accordingly, risks should be evaluated in an interdisciplinary approach using a range of potential scenarios. Sound risk analysis shows a realistic range for future developments and avoids the presumed accuracy of single scenarios that does not exist. The simplest case evaluates three scenarios: worst-case, realistic and optimistic. The world of stochastics and probabilistic models gives us knowledge that is more multifaceted and versatile, but no more precise.
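As a sketch of the difference between a point estimate and a stochastic statement, the three scenarios can be turned into a full distribution of outcomes. The figures below are purely illustrative assumptions, and the triangular distribution serves here as the simplest stand-in for the more refined distributions discussed later:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative three-point estimate for some uncertain quantity,
# e.g. next year's sales volume (the numbers are assumptions):
worst, realistic, optimistic = 400.0, 1000.0, 1600.0

# A triangular distribution is the simplest way to turn three
# scenarios into a full range of outcomes instead of a single number.
samples = rng.triangular(worst, realistic, optimistic, size=100_000)

print(f"mean outcome: {samples.mean():.0f}")
print(f"90% of outcomes lie in "
      f"[{np.quantile(samples, 0.05):.0f}, {np.quantile(samples, 0.95):.0f}]")
```

The result is not one number but a band of outcomes, from which expected values and quantiles can be read off directly.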
In the real world, you will never have perfect information. Risk analysis, therefore, must cope with poor data and make optimal use of the information that is available. A prime example of a careless way of dealing with uncertainty is the ongoing discussion about evaluating the risk of SARS-CoV-2 (COVID-19). The actions defined were based on a monodisciplinary analysis by virologists, without consulting experts in other areas. The studies presented were pronounced gospel truth, although statisticians and “data ethicists” were appalled after taking a closer look. Over the past few months, various methodological deficits have become evident [e.g. in the use of regression analysis, the misjudgment of error rates in so-called polymerase chain reaction (PCR) tests, and incorrect quotients in calculations of the case fatality rate]. The communication of the “truth” about COVID-19 is, above all, an evidence fiasco. The model calculations of many studies, including one from Imperial College, used highly inaccurate input variables, which then led to the well-known GIGO (garbage in, garbage out) effect. Few, if any, references were ever made to the uncertainty of the parameterization.
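One of the quotient errors mentioned above can be made concrete with Bayes’ rule: at low prevalence, even a highly specific test produces many false positives, so the raw share of positive tests is a poor estimate of the infection rate. The sensitivity, specificity and prevalence values below are illustrative assumptions, not figures from any particular study:

```python
# Bayes' rule: probability of infection given a positive test result.
# All three inputs are illustrative assumptions, not study figures.
prevalence = 0.01    # assumed share of infected people in the tested group
sensitivity = 0.98   # P(test positive | infected)
specificity = 0.99   # P(test negative | not infected)

# Total probability of a positive test (true positives + false positives)
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Positive predictive value: P(infected | positive test)
ppv = sensitivity * prevalence / p_positive

print(f"P(infected | positive test) = {ppv:.1%}")
# With these inputs, roughly half of all positive results are false positives.
```

Ignoring this base-rate effect inflates every quotient built from raw test counts, including the case fatality rate.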
A serious approach would have been to communicate the results as ranges, as the Stanford University School of Medicine did with a lethality rate of 0.12% to 0.2% for Santa Clara County. The Heinsberg study reported a lethality rate of 0.37% for Gangelt in Heinsberg County. If a beta distribution is assumed, the actual lethality rate can be expected to lie between 0.15% and 0.69% at a 95% confidence level. Since the density function is positively skewed, the expected value (mean) of the lethality rate probably lies closer to 0.32%.
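Such a range statement can be reproduced with any statistics library. The beta parameters below are illustrative assumptions chosen to mimic a positively skewed lethality distribution of this order of magnitude; they are not the parameters of the Heinsberg study itself:

```python
from scipy import stats

# Illustrative beta distribution for a lethality rate of a few tenths
# of a percent (a and b are assumptions, not study parameters).
lethality = stats.beta(a=7, b=2000)

lo, hi = lethality.interval(0.95)   # central 95% interval
mean = lethality.mean()            # expected value of the rate

print(f"95% interval: {lo:.2%} to {hi:.2%}")
print(f"expected lethality: {mean:.2%}")
```

Because the density is positively skewed, the upper tail of the interval stretches further from the mean than the lower tail, exactly the asymmetry described above.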
All risks marked by a high degree of uncertainty can be anticipated using scenarios and ranges, without a “presumption of knowledge”. No IT expert should presume the ability to estimate the potential effects of a cyberattack as a point estimate of a probability of occurrence and an extent of damage (as many established international IT security standards suggest). Such an estimate would be a pure presumption of knowledge that we simply do not possess. To further back this statement, I encourage all readers to delve deeper into the effect mechanisms of the NotPetya attacks on the corporate group A.P. Møller-Mærsk.
A scenario analysis provides a way to seriously address the uncertainty of potential cyberrisks. How often should we reckon with a certain scenario? And if such a scenario should ever occur, which effects could it have on revenues, our reputation or the amount of work involved – evaluated respectively as worst-case, realistic and optimistic scenarios? The evaluation is made using suitable statistical distribution functions that correctly portray the uncertainty in the parameter assessment. In the case of cyberrisks, for example, the evaluation can be made using a compound distribution that intelligently links the estimated frequency (based on a Poisson distribution) with the estimated effect scenarios (based on a PERT or beta distribution). The evaluated scenarios lay the foundation for a stochastic simulation that anticipates the complete range of potential financial effects for the company. The simulation does not produce a single truth (in the sense of one scenario), but rather a complete range of scenarios (including expected values and stress scenarios). A sensitivity analysis can also be used to prioritize actions and analyze them based on their effects and efficiency.
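A minimal sketch of such a compound distribution, under illustrative assumptions: annual incident counts are drawn from a Poisson distribution, the loss per incident from a PERT distribution (a reshaped beta) built from the three scenario estimates, and both are combined in a Monte Carlo simulation. All parameter values below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(7)

def pert_sample(low, mode, high, size, rng):
    """Sample from a classic PERT distribution: a beta distribution
    reshaped to [low, high] with the given most likely value."""
    alpha = 1 + 4 * (mode - low) / (high - low)
    beta = 1 + 4 * (high - mode) / (high - low)
    return low + (high - low) * rng.beta(alpha, beta, size)

def simulate_annual_cyber_loss(freq, low, mode, high, n_years=50_000, rng=rng):
    """Compound distribution: Poisson incident count per simulated year,
    a PERT-distributed loss per incident, summed to an annual loss."""
    counts = rng.poisson(freq, n_years)
    losses = np.zeros(n_years)
    for i, n in enumerate(counts):
        if n > 0:
            losses[i] = pert_sample(low, mode, high, n, rng).sum()
    return losses

# Illustrative assumptions: 0.8 incidents per year on average; a loss per
# incident between 50,000 and 2,000,000 with a most likely value of 250,000.
losses = simulate_annual_cyber_loss(0.8, 50_000, 250_000, 2_000_000)

print(f"expected annual loss: {losses.mean():,.0f}")
print(f"95% quantile (stress scenario): {np.quantile(losses, 0.95):,.0f}")
```

The simulated years form exactly the “range of scenarios” described above: expected values, quantiles and stress scenarios can all be read off the same simulation output, and a sensitivity analysis only needs to rerun it with varied inputs.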
The stochastic scenario simulation intelligently combines expert knowledge (including intuition and gut instinct) with powerful statistical tools. Statistical thinking leads to greater competency in dealing with uncertainty. Understanding statistics is a necessary skill (not just for risk managers) for classifying and evaluating the world in which we live and for reaching decisions amid uncertainty. To paraphrase the Indian statistician C.R. Rao: secure knowledge ensues from a new way of thinking that combines uncertain knowledge with knowledge about the extent of the uncertainty. A risk manager should possess the same core competencies as a statistician.
Over the past few months, I have come to realize that an interdisciplinary, substantiated risk analysis would have been necessary as a foundation for decision-making, even during the “high risk situation” of the COVID-19 crisis. It still does not exist to this day, which amounts to nothing less than a complete disregard of the entire body of knowledge from risk research. Is the reason, perhaps, that many virologists, epidemiologists and politicians know too little about decision-making amid risk? They do not understand that stochastic statements – and therefore showing uncertainty – are not a sign of weakness but rather a strength.