Uncertainties, intelligence, and risk management: a few observations and recommendations on measuring and managing risk.

Author: Paté-Cornell, Elisabeth
  I. UNCERTAINTIES AS A CRITICAL PART OF INFORMED DECISIONS
  II. WHAT WE KNOW AND WHAT WE DON'T: GATHERING AND COMMUNICATING IMPERFECT INFORMATION
    A. Recognize and communicate uncertainties
    B. Don't present the most likely hypothesis as if you were sure of it
    C. Simplify messages when appropriate and avoid large numbers of complex scenarios
    D. Don't make "predictions" when you are not certain of what is ahead
  III. EVIDENCE MEANS MUCH MORE THAN STATISTICS
    A. To assess a risk, one often needs Bayesian probability
    B. Expert opinions are often essential to the assessment of uncertainty
    C. Don't truncate the evidence base to obscure the uncertainties
  IV. PROCESSING THE EVIDENCE: WHAT TO MAKE OF NEW INFORMATION
    A. Don't forget prior information: it may remain an important factor
    B. More information does not mean less uncertainty
    C. Don't assume independence of events and factors without checking it
  V. THE ROLE OF INTELLIGENCE IS TO SUPPORT RISK MANAGEMENT
    A. Effective warning systems are at the core of risk management strategies
    B. "Black swans" and "perfect storms" are often poor excuses for bad risk management
  VI. CONCLUSION: VALUE INTUITION ABOUT UNCERTAINTIES, BUT CHECK ITS LOGIC

  I. UNCERTAINTIES AS A CRITICAL PART OF INFORMED DECISIONS

    Classical intelligence refers both to data that governments and industries gather in their efforts to protect information critical to their security and competitiveness, and to information that they steal from others because it is useful to their objectives. But that information also includes what patients, parents, consumers--just about everybody--wants to know to support decisions, especially when it comes to safety for themselves and their loved ones. In the classic sense, the problem for industries and governments is to collect and analyze systematically the information that they need, to avoid missing the next big thing or the next big threat. But this has to be done with a sense of priority, to make the best use of limited resources. This paper focuses on a critical part of a wide class of risk management problems: how to report and communicate uncertainties. The objective is to make sure that the right message is passed along in time to decision-makers by an "analyst" (whoever that may be), whose job it is to report facts, not to pick options. The decisions themselves, whether made by the President or a hospital patient, include the preferences and risk attitudes of those who make them.

    Engineering risk analysts face the problem of assessing the safety or the reliability of systems (including people, operators, managers, or users) with which there is sometimes limited experience. What this author has learned in that domain is true for intelligence in many forms (observations, signals, warnings, tests, or common beliefs). (1) Obviously, specific confidential or classified intelligence cases are not discussed explicitly here, and one does not always know the details of what happened behind the scenes. Examples from other fields illustrate just as well some points that apply across the board. What follows are a few observations and recommendations that are not new, but perhaps worth restating, not only in the context of intelligence analysis but also in medicine, engineering, finance, or policymaking. (2) Reducing uncertainty is indeed useful whenever feasible. Thomas Fingar, for instance, titles his book on intelligence analysis Reducing Uncertainty. (3) Yet one of the key points of this paper is that the objective of the intelligence community is to provide the most accurate description--and generally, quantification--of current knowledge. Indeed, as discussed below, more information may be essential, even if it does not reduce uncertainty. (4)

  II. WHAT WE KNOW AND WHAT WE DON'T: GATHERING AND COMMUNICATING IMPERFECT INFORMATION

    A. Recognize and communicate uncertainties

      Whether intelligence involves national security, business, or what is happening in the neighborhood, the signal is often fuzzy and the information incomplete. It is easy to pick one possibility, jump to conclusions, and, for the reporter or the analyst, to make a definite but incomplete statement. (5) Indeed, an intelligence analyst may be pressed to "make the call" and to pick one possibility among others. But the world is seldom perfectly clear, and if the information is used to make an important decision, President Truman's proverbial "two-handed economist"--trying to communicate what is known and what is not--may have had a point. (6) The decision-maker needs an appropriate report of uncertainties, in a way that is comprehensible and useful.

      The analyst's mission is to assess a full range of possibilities and put the message in perspective, not to influence decisions unintentionally based on a biased or emotional interpretation of incomplete knowledge. It may be easier--and more rewarding--to provide a definitive message ("X will win the election"), in hopes of being proven right. But even if X is indeed elected, the statement before the fact was not right because it was incomplete and misleading. The analyst's problem in that respect is a common one. It is not very different from that of lawyers convincing a client to go to trial, believing that they will win; of medical doctors pressing a patient to go through a surgical procedure, overstating the chances of success; or of an engineer claiming that a new system is 100% reliable. In many cases, it may be tempting to simplify the message in order to look confident. Yet when uncertainties remain, the message is incorrect--even if in the end, reality matches the bet--because it does not provide full information at the time of the decision. This decision (about the level of risk to take) should be that of the principal, the patient, or the client, not the specialist or the analyst.

    B. Don't present the most likely hypothesis as if you were sure of it

      When there are several possibilities, it is dangerous to pick what seems most probable and report it as if it were certain. One assumption may be much more likely than the others, but especially when the alternatives imply extreme outcomes, it is essential that the decision-maker be informed about the risk. An analyst or a fair reporter has to present what is known, what is not (and if unclear, with a degree of belief), and what has changed. In the run-up to the war in Iraq in 2003, for instance, there was an intelligence failure to present complete and accurate information in the description of the aluminum tubes that were widely misidentified as uranium enrichment centrifuge components. (7) There were reasons to believe that Saddam Hussein had earlier tried to acquire centrifuges, but even though the probability that he had succeeded was high, there were other possibilities that were recognized by some but not given sufficient attention. (8)

      Clearly, as sometimes stated in medicine, when one hears hoofbeats, one should think of a horse before a zebra. Yet a zebra with extreme consequences should be considered, if nothing else, as a reason to seek additional information. Indeed, life is about choices, but, more often than not, choices under uncertainty. It is thus mostly about bets--generally with a spectrum of multidimensional outcomes--and bettors who may want to know what they are facing.
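      To make the horse-and-zebra bet concrete, here is a minimal numerical sketch in Python. All probabilities, losses, and costs are invented for illustration and are not the author's figures. It shows why, even when the benign hypothesis is far more likely, the rare hypothesis with extreme consequences can dominate the expected loss--and why that expected loss can justify paying for additional information.

        # Minimal sketch with hypothetical numbers: why the unlikely "zebra"
        # still matters to the decision-maker.
        hypotheses = {
            "horse (benign)":  {"p": 0.95, "loss_if_unprepared": 0.0},
            "zebra (extreme)": {"p": 0.05, "loss_if_unprepared": 1000.0},
        }

        # Expected loss of a report that mentions only the most likely
        # hypothesis, leaving the decision-maker unprepared for the zebra.
        expected_loss = sum(h["p"] * h["loss_if_unprepared"]
                            for h in hypotheses.values())
        print(f"Expected loss if the zebra is never reported: {expected_loss:.0f}")
        # 0.05 * 1000 = 50 loss units, driven entirely by the rare hypothesis.

        # A (hypothetical) extra test costing less than that expected loss
        # is worth running before betting on the horse.
        cost_of_more_information = 10.0
        if cost_of_more_information < expected_loss:
            print("Seek additional information before making the call.")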

    C. Simplify messages when appropriate and avoid large numbers of complex scenarios

      While it is important to present alternative hypotheses and their potential consequences, the message has to remain clear and concise. This may mean identifying and presenting a few classes of scenarios and their probabilities as opposed to a mass of detailed ones if those details are irrelevant to the decision. (9) One can expand scenarios ad infinitum by adding more factors. Yet even though doing so makes the story look more vivid and thus seem more likely, adding details to a scenario lowers its probability by definition, (10) and the details may not be relevant to risk management. (11) For example, the precise circumstances under which the breach in the hull of an oil tanker reached a given size may not be relevant in the assessment of the amount of oil spilled after the hull was breached (system states such as the breach size are called "pinch points"). (12) What matters may simply be how often and where a breach can happen (for instance the probability per voyage for a specific type of ship, and the chances of various locations given its path). In all cases, the results have to be presented in a useful form and the analyst has to decide where to stop when including factors and details. (13)
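      Both claims in the preceding paragraph can be checked with a small numerical sketch (Python; all figures are invented for illustration and do not come from the article). First, by the conjunction rule, P(A and B) = P(A)P(B|A) <= P(A), so every detail added to a scenario can only lower its probability. Second, once the "pinch point" (here, the breach size) is known, the detailed causal stories behind the breach can be marginalized away.

        # Hypothetical figures only. 1) Conjunction rule: a vivid, detailed
        # scenario is never more probable than its less detailed version.
        p_breach = 0.01                # P(A): hull breach on a given voyage
        p_cause_given_breach = 0.3     # P(B|A): one vivid causal mechanism
        p_vivid_scenario = p_breach * p_cause_given_breach
        assert p_vivid_scenario <= p_breach    # 0.003 <= 0.01

        # 2) Pinch point: the spill depends on the breach size, not on how
        # the breach occurred, so sum over sizes instead of causal stories.
        p_size = {"small": 0.70, "medium": 0.25, "large": 0.05}  # P(size | breach)
        spill_tons = {"small": 100, "medium": 2000, "large": 30000}

        expected_spill = p_breach * sum(p_size[s] * spill_tons[s] for s in p_size)
        print(f"Expected spill per voyage: {expected_spill:.1f} tons")
        # 0.01 * (0.7*100 + 0.25*2000 + 0.05*30000) = 20.7 tons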

      Could the "Arab Spring" have been anticipated? The instability of the Arab world and the unpopularity of some of its leaders were clear. But analysts could not have identified the timing and the exact mechanism of the start of the insurrection in Tunisia in 2010, when a street vendor, whose cart had been confiscated once again by the authorities, set himself on fire. The fact that the uprising finally occurred was not in and of itself a surprise, but the exact scenario that triggered it was not anticipated. One cannot expect analysts to reach that level of granularity.

      In that vein, "one-in-a-million" (or even "one-in-a-trillion"!) risk assessments are particularly unhelpful. They are clichés with which people may attempt to express either quasi-certainty or, in retrospect, the unpredictability of past mishaps--that nothing could have been done better--simply by adding details that are irrelevant to risk management. On the flip side, one can doubt a seat-of-the-pants assessment of a "90% chance of mission success" followed by an acknowledgment that such a statement would be contradicted by a logical quantitative risk analysis. (14) These kinds of statements may be considered "notional" and used--or ignored--as such. They simply reflect an unflinching optimism based on past experience with other situations and systems ... or a wet finger in the wind. Yet very low numbers can...
