Analyzing Uncertainty

The illusion of predictability and certainty is comforting: it helps us cope with a reality far more indecipherable than we assume. We also tend to overestimate our ability to estimate, especially in our areas of expertise and on seemingly trivial matters.

Richard Zeckhauser, the legendary Harvard professor, demonstrates this with a simple exercise in his classes. Students provide three estimates of quantities such as one country's area, another country's population, or the votes received by two candidates in an election: a) their best estimate, b) the lowest value they consider plausible, and c) the highest. The results consistently show that students give narrow ranges of possible values, overestimating their ability to estimate accurately.
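
A minimal sketch of how such a calibration exercise might be scored (the answers and true values below are invented, purely for illustration): for each question we check whether the true value falls inside the student's stated low-high range. Well-calibrated ranges should contain the truth most of the time; overconfident, narrow ranges do not.

```python
# Hypothetical illustration of scoring a Zeckhauser-style calibration exercise.
# Each answer holds a student's best guess plus the lowest and highest values
# they consider plausible; we check how often the true value falls in range.

from dataclasses import dataclass

@dataclass
class Answer:
    best: float   # best estimate
    low: float    # lowest value deemed plausible
    high: float   # highest value deemed plausible
    truth: float  # the actual quantity

def hit_rate(answers: list[Answer]) -> float:
    """Share of answers whose [low, high] range contains the true value."""
    hits = sum(1 for a in answers if a.low <= a.truth <= a.high)
    return hits / len(answers)

# Invented numbers for illustration only (an area, a population, a vote count).
answers = [
    Answer(best=500_000, low=400_000, high=600_000, truth=1_285_000),
    Answer(best=30_000_000, low=25_000_000, high=40_000_000, truth=33_000_000),
    Answer(best=8_000_000, low=7_500_000, high=8_500_000, truth=10_300_000),
]

print(f"Ranges containing the truth: {hit_rate(answers):.0%}")
# Narrow, overconfident ranges yield a hit rate far below what students intend.
```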

Similarly, when events catch us by surprise, we explain afterward why we should have seen them coming. This reaction is hindsight bias: "I knew it would happen!" we tell ourselves. And as a further reaction to uncertainty, we tend to extrapolate past conditions into the future.

It is easy to confuse financial models with reality in business, which is infinitely more complex. Organizational structures are naturally designed around predictability because it is efficient. Although companies have risk functions (which analyze and respond to events with known outcomes and known probabilities), the analysis of "uncertainty," that is, situations where the possible states of the world are known but their probabilities are not, is rarely systematized. Yet these are often the most consequential events: a pandemic, political shocks, technological disruptions. As a result, we lack the capability to identify our vulnerabilities preemptively; our strategies rest on hidden assumptions about how critical uncertainties will unfold. Generally, when disruption does not overwhelm us, we simply watch it pass by.
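
To make the distinction concrete, here is a minimal sketch with invented payoffs (the options, states of the world, and numbers are all hypothetical): under risk the probabilities are known, so an expected value can be computed; under uncertainty only the states are known, so the sketch falls back to comparing options by their worst case.

```python
# Invented payoffs for one strategic decision across three states of the world.
payoffs = {"expansion":  {"boom": 120, "stagnation": 20, "shock": -80},
           "status_quo": {"boom": 40,  "stagnation": 30, "shock": -10}}

# Risk: outcomes AND probabilities are known, so expected value applies.
probs = {"boom": 0.5, "stagnation": 0.4, "shock": 0.1}  # assumed known
expected = {option: sum(probs[s] * v for s, v in states.items())
            for option, states in payoffs.items()}

# Uncertainty: the states are known but their probabilities are not, so one
# fallback is to compare options by their worst-case (maximin) outcome.
worst_case = {option: min(states.values()) for option, states in payoffs.items()}

print("Expected value (risk):", expected)
print("Worst case (uncertainty):", worst_case)
# Under risk, "expansion" looks best; under uncertainty, "status_quo" is more
# robust — the ranking depends on which framing actually applies.
```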

In a time of systemic transformation and volatility, when events of this nature can have disproportionate impacts, organizations (private and public) must sharpen their ability to analyze uncertainty. Indeed, it would be more accurate to say that opportunities lie in uncertainty rather than in risk.

The importance of tapping into predictive models, enabled today by the wide availability of data and the processing power of artificial intelligence, is commonly highlighted. What remains underemphasized is the need to develop the more fundamental capacity to distinguish uncertainty from risk and to recognize that our strategies always carry implicit assumptions about the future. Deciphering patterns in historical series gives us a non-negligible degree of control, but only over a small slice of reality.

In addition to probabilistic thinking, organizations must systematize reflection on their critical uncertainties. Techniques such as scenario analysis, the development of disruptive hypotheses as an analytical exercise, or "backcasting" (defining a desired future and then planning backward from it) can help us incorporate the management of uncertainties into our strategy. These methods have been grouped into a discipline called "strategic foresight," whose aspiration is not to predict the future but to imagine possible futures in order to strengthen strategies. In the words of J. Peter Scoblic, a consultant with award-winning doctoral work in the field, strategic foresight "does not help us to know what to think about the future, but to know how to think about it."
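
As a rough illustration of the scenario-analysis step (the scenarios, strategies, and scores below are all invented), one can tabulate how each candidate strategy fares across a handful of plausible futures and flag those that fail badly in any of them, rather than optimizing for a single forecast. In practice this is a qualitative workshop exercise; the sketch only shows the underlying logic.

```python
# Invented scenarios and strategy scores, purely to illustrate the mechanics
# of a scenario robustness check.
scenarios = ["rapid_ai_adoption", "regulatory_tightening", "supply_shock"]

strategy_scores = {
    "aggressive_growth": {"rapid_ai_adoption": 9, "regulatory_tightening": 3, "supply_shock": 2},
    "diversified_core":  {"rapid_ai_adoption": 6, "regulatory_tightening": 7, "supply_shock": 6},
    "wait_and_see":      {"rapid_ai_adoption": 2, "regulatory_tightening": 6, "supply_shock": 5},
}

FAILURE_THRESHOLD = 4  # below this, the strategy is considered fragile in that scenario

for name, scores in strategy_scores.items():
    fragile_in = [s for s in scenarios if scores[s] < FAILURE_THRESHOLD]
    status = "robust" if not fragile_in else f"fragile in: {', '.join(fragile_in)}"
    print(f"{name:20s} min={min(scores.values())}  -> {status}")
# The point is not to pick a winner but to surface hidden assumptions:
# strategies that only work in one future reveal where critical uncertainties lie.
```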

Integrating strategic foresight into their decision-making processes will enable organizations to develop strategies that hold up across alternative futures and to improve their crisis planning. The fact that we cannot predict with certainty does not mean we are left to chance amid chaos; on the contrary, the human capacity to shape the future will always be the hallmark of our history.