The CEO’s Dilemma – Building Resilience in a Time of Uncertainty

Global disruptions and an increasingly complex macroeconomic outlook will be key elements of the strategic environment for the foreseeable future. For leaders, the only certainty is that waiting for clarity is a losing move. The best organizations know how to turn uncertainty into opportunity. Their playbook relies on two critical elements:

  • a shared and clear view of the world and the strategic challenges and opportunities it presents,
  • and a resilient, adaptable plan to win.

A view of the world

Today’s global disruptions (e.g., geopolitical tensions and supply chain strains) and macroeconomic headwinds (soaring inflation, rising interest rates, decelerating growth, and currency fluctuations) have created a complex, once-in-a-generation competitive environment, with significant variations across geographic areas and sectors.

Navigating this unprecedented complexity requires business leaders to develop a dynamic perspective not only on the most likely scenarios for how their operating and economic environments will evolve, but also on the distinct opportunities and risks these scenarios present for their organizations.

This research shows that “winners” in economic uncertainty do not just sit back and wait for recovery; instead, they are proactive and turn ambiguity into opportunity.

A plan to win

There is no “one size fits all” solution to today’s complex strategic challenges. But this research suggests that the best companies do two things well in crafting their unique plans to win:

  • First, they have a clear understanding of their strategic starting point, informed by nuanced, de-averaged perspectives on the economic and operational stability of the markets in which they operate, as well as on their own organization’s financial strength (e.g., profit volatility, free-cash-flow-to-debt ratio). These starting points ultimately fall into four high-level archetypes.
  • And second, they embed a “dynamic strategy” mindset into their planning, comprising three elements:
    • Sensing: Observing trends, defining and monitoring critical uncertainties, and outlining a set of scenarios against which to assess business decisions
    • Adapting: Building operational and financial stability by shaping and reshaping strategies based on market trends and data-driven forecasts
    • Thriving: Moving rapidly from assessment to action to seize growth opportunities and strengthen competitive advantage

Increasing uncertainty, driven by a set of global disruptions and exacerbated by macroeconomic headwinds, must be met head-on.

Dramatic shifts in inflation drivers vary across regions and countries, with energy emerging as one of the strongest drivers

Different sectors are affected differently by macro uncertainties

Sectors like agriculture are typically less vulnerable to business-cycle shifts, while others (e.g., media, tech, fashion) tend to be more affected. But this varies from recession to recession, depending on the underlying drivers.

Some sectors (e.g., retail), which were less vulnerable in the early 2000s recessions, are showing greater vulnerability in the current environment.

Top performers in economic uncertainty do not just wait for recovery; instead, they build competitive advantage and turn ambiguity into a source of opportunity.

Business leaders must balance contrasting priorities amid strong macroeconomic headwinds.

Understanding the “starting point” is critical to successfully navigating this uncertainty.

With the current disruptions and uncertainties, it is imperative for business leaders to reevaluate:

  1. The stability of their portfolio against economic downturns and market disruption
  2. Their internal financial stability to cope with uncertainty

Each business context is distinct, but four starting-point archetypes can help leaders understand the moves most relevant for their organizations.

How to navigate uncertainty: Enhance resilience and secure a clear pathway to sustained growth

The time to act is now

Take 3 key steps to navigate uncertainty and win in a downturn:

  1. Sensing macroeconomic and disruptive trends to shape (and reshape) future scenarios that guide strategic decisions
  2. Adapting business and functional strategies in response to new insights and to market, economic, and competitive developments
  3. Thriving by building competitive advantage to turn adversity into opportunity

Actions should be based on the specific business context.

Uncertainty Visualization

Uncertainty is inherent to most data and can enter the analysis pipeline during the measurement, modeling, and forecasting phases. Effectively communicating uncertainty is necessary for establishing scientific transparency. Further, people commonly assume that there is uncertainty in data analysis, and they need to know the nature of the uncertainty to make informed decisions.

However, understanding even the most conventional communications of uncertainty is highly challenging for novices and experts alike, due in part to the abstract nature of probability and to ineffective communication techniques. Reasoning with uncertainty is universally difficult, but researchers are revealing how some types of visualizations can improve decision-making in a variety of contexts,

  • from hazard forecasting,
  • to healthcare communication,
  • to everyday decisions about transit.

Scholars have distinguished different types of uncertainty, including

  • aleatoric (irreducible randomness inherent in a process),
  • epistemic (uncertainty from a lack of knowledge that could theoretically be reduced given more information),
  • and ontological uncertainty (uncertainty about how accurately the modeling describes reality, which can only be described subjectively).

The term risk is also used in some decision-making fields to refer to quantified forms of aleatoric and epistemic uncertainty, whereas uncertainty is reserved for potential error or bias that remains unquantified. Here we use the term uncertainty to refer to quantified uncertainty that can be visualized, most commonly as a probability distribution. This article begins with a brief overview of common uncertainty visualization techniques and then elaborates on the cognitive theories that describe how the approaches influence judgments. The goal is to provide readers with the theoretical infrastructure needed to critically evaluate the various visualization techniques in the context of their own audience and design constraints. Importantly, there is no one-size-fits-all uncertainty visualization approach guaranteed to improve decisions in all domains, nor is there even a guarantee that presenting uncertainty to readers will improve judgments or trust. Therefore, visualization designers must think carefully about each of their design choices or risk adding more confusion to an already difficult decision process.

Uncertainty Visualization Design Space

There are two broad categories of uncertainty visualization techniques. The first comprises graphical annotations that can be used to show properties of a distribution, such as the mean, confidence or credible intervals, and distributional moments.

Numerous visualization techniques use the composition of marks (i.e., geometric primitives, such as dots, lines, and icons) to display uncertainty directly, as in error bars depicting confidence or credible intervals. Other approaches use marks to display uncertainty implicitly as an inherent property of the visualization. For example, hypothetical outcome plots (HOPs) are random draws from a distribution that are presented in an animated sequence, allowing viewers to form an intuitive impression of the uncertainty as they watch.
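
For illustration only (not from the original article), the following Python sketch renders a minimal HOP with matplotlib's animation API: each frame shows a single random draw from an assumed normal distribution, so viewers experience the spread as frame-to-frame movement rather than as a static interval.

    # Minimal hypothetical outcome plot (HOP); the normal distribution
    # and its parameters are illustrative assumptions.
    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    rng = np.random.default_rng(seed=1)
    draws = rng.normal(loc=50, scale=10, size=40)  # hypothetical outcomes

    fig, ax = plt.subplots()
    ax.set_xlim(0, 100)
    ax.set_yticks([])
    bar = ax.barh([0], [draws[0]], height=0.3)

    def update(frame):
        # One draw per frame: uncertainty is conveyed by the bar's
        # movement across frames, not by a drawn interval.
        bar[0].set_width(draws[frame])
        return bar

    anim = FuncAnimation(fig, update, frames=len(draws), interval=400)
    plt.show()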

The second category of techniques focuses on mapping probability or confidence to a visual encoding channel. Visual encoding channels define the appearance of marks using controls such as color, position, and transparency. Techniques that use encoding channels have the added benefit of adjusting a mark that is already in use, such as making a mark more transparent if the uncertainty is high. Marks and encodings that both communicate uncertainty can be combined to create hybrid approaches, such as in contour box plots and probability density and interval plots.
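
As a concrete (illustrative, not from the source) example of an encoding-channel approach, this sketch fades an existing line mark where an assumed standard error is larger, mapping uncertainty to transparency:

    # Mapping uncertainty to the alpha (transparency) channel of a line
    # mark already in use; the data and error values are made up.
    import numpy as np
    import matplotlib.pyplot as plt

    x = np.arange(12)
    y = np.array([3, 4, 4.5, 5, 5.2, 6, 6.1, 6.5, 7, 7.8, 8, 8.4])
    stderr = np.linspace(0.2, 2.0, len(x))  # uncertainty grows over x

    fig, ax = plt.subplots()
    for i in range(len(x) - 1):
        # Higher uncertainty -> lower alpha, so uncertain segments fade.
        alpha = 1.0 - 0.8 * (stderr[i] / stderr.max())
        ax.plot(x[i:i + 2], y[i:i + 2], color="steelblue", alpha=alpha, lw=3)
    ax.set_xlabel("time")
    ax.set_ylabel("estimate")
    plt.show()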

More expressive visualizations provide a fuller picture of the data by depicting more properties, such as the nature of the distribution and outliers, which can be lost with intervals. Other work proposes that showing distributional information in a frequency format (e.g., 1 out of 10 rather than 10%) more naturally matches how people think about uncertainty and can improve performance.

Visualizations that represent frequencies tend to be highly effective communication tools, particularly for individuals with low numeracy (i.e., difficulty working with numbers), and can help people overcome various decision-making biases.
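
To make the frequency format concrete, here is a hedged sketch of a quantile dotplot: 20 equally likely quantiles of an assumed normal distribution are drawn as stacked dots, so a reader can count outcomes ("4 dots out of 20 fall below this value") rather than decode a percentage.

    # Minimal quantile dotplot; the normal distribution is an
    # illustrative assumption. Each dot is one of 20 equally likely
    # outcomes, supporting "k out of 20" frequency-style reading.
    import numpy as np
    from scipy import stats
    import matplotlib.pyplot as plt

    n_dots = 20
    probs = (np.arange(n_dots) + 0.5) / n_dots  # evenly spaced quantiles
    quantiles = stats.norm.ppf(probs, loc=50, scale=10)

    # Bin the quantiles and stack dots vertically within each bin.
    bins = np.round(quantiles / 5) * 5  # 5-unit-wide bins
    fig, ax = plt.subplots()
    for b in np.unique(bins):
        count = int((bins == b).sum())
        ax.scatter([b] * count, np.arange(count) + 0.5, s=200, color="gray")
    ax.set_xlabel("predicted value")
    ax.set_yticks([])
    plt.show()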

Researchers have dedicated a significant amount of work to examining which visual encodings are most appropriate for communicating uncertainty, notably in geographic information systems and cartography. One goal of these approaches is to evoke a sensation of uncertainty, for example, using fuzziness, fogginess, or blur.

Other work that examines uncertainty encodings also seeks to make looking up values more difficult when the uncertainty is high, as with value-suppressing color palettes.

Given that there is no one-size-fits-all technique, in the following sections, we detail the emerging cognitive theories that describe how and why each visualization technique functions.

Uncertainty Visualization Theories

The empirical evaluation of uncertainty visualizations is challenging. Many user-experience goals (e.g., memorability, engagement, and enjoyment) and performance metrics (e.g., speed, accuracy, and cognitive load) can be considered when evaluating uncertainty visualizations. Beyond identifying the metrics of evaluation, even the simplest tasks have countless configurations. As a result, it is hard for any single study to test the effects of a visualization thoroughly enough to ensure that it is appropriate in all cases. Visualization guidelines based on a single study or a small set of studies are potentially incomplete. Theories can help bridge the gap between visualization studies by identifying and synthesizing converging evidence, with the goal of helping scientists predict how a visualization will be used. Understanding foundational theoretical frameworks will empower designers to think critically about the design constraints in their work and to generate optimal solutions for their unique applications. The theories detailed in the next sections are only those with mounting support from numerous evidence-based studies in various contexts. As an overview, the table provides a summary of the dominant theories in uncertainty visualization, along with proposed visualization techniques.

General Discussion

There are no one-size-fits-all uncertainty visualization approaches, which is why visualization designers must think carefully about each of their design choices or risk adding more confusion to an already difficult decision process. This article overviews many of the common uncertainty visualization techniques and the cognitive theory that describes how and why they function, to help designers think critically about their design choices. We focused on the uncertainty visualization methods and cognitive theories that have received the most support from converging measures (e.g., the practice of testing hypotheses in multiple ways), but there are many approaches not covered in this article that will likely prove to be exceptional visualization techniques in the future.

There is no single visualization technique we endorse, but there are some that should be critically considered before employing them. Intervals, such as error bars and the Cone of Uncertainty, can be particularly challenging for viewers. If a designer needs to show an interval, we also recommend displaying information that is more representative, such as a scatterplot, violin plot, gradient plot, ensemble plot, quantile dotplot, or HOP. Just showing an interval alone could lead people to conceptualize the data as categorical. As alluded to in the prior paragraph, combining various uncertainty visualization approaches may be a way to overcome issues with one technique or get the best of both worlds. For example, each animated draw in a hypothetical outcome plot could leave a trace that slowly builds into a static display such as a gradient plot, or animated draws could be used to help explain the creation of a static technique such as a density plot, error bar, or quantile dotplot. Media outlets such as the New York Times have presented animated dots in a simulation to show inequalities in wealth distribution due to race. More research is needed to understand if and how various uncertainty visualization techniques function together. It is possible that combining techniques is useful in some cases, but new and undocumented issues may arise when approaches are combined.

In closing, we stress the importance of empirically testing each uncertainty visualization approach. As noted in numerous papers, the way that people reason with uncertainty is non-intuitive, which can be exacerbated when uncertainty information is communicated visually. Evaluating uncertainty visualizations can also be challenging, but it is necessary to ensure that people correctly interpret a display. A recent survey of uncertainty visualization evaluations offers practical guidance on how to test uncertainty visualization techniques.

Click here to access the entire article in the Handbook of Computational Statistics and Data Science

Banks sailing in uncertain waters

The apparent paradox of the decision-making process

Corporate decision-making processes are driven by seemingly opposing forces.

On the one hand, there is the human urge to have instruments at one’s disposal

  • to understand the context and set a specific direction,
  • and to implement the actions required to follow the plotted course.

On the other hand, there is the exhortation to keep the mind open

  • to an array of possible future scenarios,
  • to imagining and grasping the implications of the various possible trajectories,
  • and to plotting alternative courses according to the obstacles and opportunities encountered, courses that could lead to landing places other than those originally contemplated.

These needs are intertwined as never before whenever the decision-maker operates in a sector such as banking, which is characterised by extremely pervasive regulatory requirements concerning the

  • maintenance and use of capital,
  • liquidity management,
  • checks on lending and distribution policies,

and which is structurally exposed to the volatility of the macroeconomic context and of the financial markets, greatly increasing the range of possible scenarios.

Thus, it is hardly surprising that one of the most common questions CEOs ask the technical functions responsible for budgeting and risk planning is ‘what if?’ (‘what would happen if…?’). The problem is that, in the last few years, the ‘ifs’ at hand have rapidly multiplied, as there has been an exponential increase in the controlling variables for which feedback is required:

  • Net Interest Income (NII);
  • Cost/Income ratio (C/I);
  • Return on Equity (ROE);
  • Non-Performing Exposure (NPE) ratio;
  • Liquidity Coverage Ratio (LCR);
  • Expected Credit Loss (ECL);
  • Common Equity Tier 1 (CET1) ratio,

to cite but a few of the most widespread. Planning has turned into an interdisciplinary and convoluted exercise, a problem hard to solve for CFOs and CROs in particular (naturally, unless they operate in close cooperation).
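
To make the ‘what if’ exercise concrete, here is a deliberately simplified Python sketch; all balance-sheet figures, repricing betas and capital sensitivities are hypothetical illustrations, not taken from the paper. It recomputes two of the metrics listed above, NII and the CET1 ratio, under a parallel interest-rate shock:

    # Toy 'what-if' engine for two of the controlling variables above.
    # Every figure and sensitivity below is a hypothetical assumption.

    def what_if(rate_shock_bps: float) -> dict:
        """Recompute NII and the CET1 ratio under a parallel rate shock."""
        # Hypothetical baseline balance sheet (EUR millions).
        interest_earning_assets = 50_000.0
        interest_bearing_liabs = 42_000.0
        asset_yield, funding_cost = 0.035, 0.015
        cet1_capital, rwa = 4_200.0, 30_000.0

        shock = rate_shock_bps / 10_000.0
        # Assume assets reprice fully, liabilities at 60% (a made-up beta).
        nii = (interest_earning_assets * (asset_yield + shock)
               - interest_bearing_liabs * (funding_cost + 0.6 * shock))
        # Toy rule: higher rates erode capital via bond revaluation.
        cet1_capital -= 2.0 * rate_shock_bps / 100
        return {"NII": round(nii, 1), "CET1 ratio": round(cet1_capital / rwa, 4)}

    for bps in (0, 100, 200):
        print(f"+{bps} bps -> {what_if(bps)}")

Even a toy model like this makes the interdependence visible: the same shock moves NII and the CET1 ratio in opposite directions, which is exactly the cross-variable feedback CFOs and CROs are asked to provide.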

This greater complexity can progressively erode the quality of banks’ decision-making processes, which end up based more often than not on an incomplete information framework, whenever some controlling variables are unavailable, or even on an incorrect one, when the information, specialist expertise and/or instruments required to model events are actually lacking.

As a partial mitigating circumstance, such events, aside from being numerous, are interdependent in their impact on the bank’s results and particularly heterogeneous. They can in fact be exogenous (turbulence and interference along the way) or endogenous (the actions that the helmsman and the crew implement during navigation).

In the first case, the events are beyond the control of those responsible for the decision-making process, being determined by the evolution of market conditions and/or the choices of institutional actors. As such, they are often hard to predict in their likelihood of occurrence, intensity, timing and duration. By nature, such phenomena are characterised by complex interactions that make it crucial, albeit arduous, to comprehend the cause-effect mechanisms governing them. Lastly, their relevance is not absolute but relative, in that it depends on how reactive the bank’s business model and balance-sheet structure are to the single risk factors to which the market value of the bank’s assets is exposed.

Conversely, in the case of endogenous events, uncertainty is correlated more to the ability of the bank’s top management

  • to quantify the level of ambition of the business actions,
  • to assess their multiple implications,
  • and, specifically, to the bank’s actual ability to implement them within the required time frames and terms.

The taxonomy of banking strategic planning

Although these complexities are increasingly obvious, many banks still set out on their respective courses as if they were certain, exposing themselves to a range of risks that can restrict or irreversibly compromise the efficacy of their decision-making processes. The first mistake: some institutions are persuaded that the ‘expert-based’ approach that has always characterised their planning methodologies will continue to be sufficient and appropriate for steering the bank in the future.

History teaches us that things have not always worked out that way. These actors have yet to understand that it is now vital to evolve the planning process towards a model relying on analytical methodologies and highly sophisticated technological instruments (risk management, econometrics, statistics, financial engineering, …), making them available to those who have always considered experience, business knowledge and budgetary dynamics to be the privileged instruments for decision-making.

The second mistake: many banks believe uncertainty analysis to be wasteful and redundant for planning purposes since, ultimately, the allocation of objectives is (and will remain) based on assumptions and uniquely identified scenarios. Here the risk lies in failing to understand that a broader analysis of possible scenarios in fact contributes to better delineating the assigned objectives, by separating the external conditions from the contribution of internal actions. Moreover, testing various hypotheses and combinations of cases makes it easier to calibrate the ‘level of managerial ambition’, in line with the actual potential of the organisational structure and with the full involvement of the business functions responsible for attaining the corporate objectives.

The intersection of these two misreadings of the context determines the bank’s positioning among the four models below, each with its relative risks and opportunities.

Models

ILLUMINATED

The planning process is built upon analytical data and models developed with the contribution of subject-matter experts from different backgrounds, which makes it possible to consider the impacts of a specific scenario on the bank’s budget simultaneously and coherently. The bank not only improves the planning of specific items, but also has at its disposal the instruments needed to switch to a multi-scenario perspective, investigating the scenarios relevant to management and appraising the volatility of expected results. This transition is extremely delicate: it entails a change in the way prospective information is produced by the technical functions and subsequently channelled to top management and the board of directors. In this context, the bank is governed via the analysis of deterministic scenarios and the statistical analysis of the probability distributions of the variables of interest. Leveraging this set of information (much more abundant and articulated than the traditional one), targets, risk-propensity levels and the relative alert and tolerance thresholds are established; business owners are provided not only with the final objectives, but also with details concerning the key risk factors (endogenous and exogenous alike) that might prove critical or success factors, together with their respective probabilities of occurrence.
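
As a hedged sketch of this multi-scenario, distribution-based view (our illustration, reusing the assumed parameters and toy sensitivities from the earlier ‘what-if’ example), a small Monte Carlo simulation turns a single scenario into a probability distribution of a target variable, against which alert and tolerance thresholds can then be set:

    # Monte Carlo sketch: a probability distribution for a planning
    # variable (NII). The rate model and sensitivities are assumptions.
    import numpy as np

    rng = np.random.default_rng(seed=7)
    n_scenarios = 10_000
    rate_shocks = rng.normal(loc=0.0, scale=0.01, size=n_scenarios)  # sd = 100 bps

    # Same toy balance-sheet sensitivities as in the earlier sketch.
    nii = (50_000 * (0.035 + rate_shocks)
           - 42_000 * (0.015 + 0.6 * rate_shocks))

    p5, p50, p95 = np.percentile(nii, [5, 50, 95])
    alert_threshold = 1_000.0  # hypothetical management alert level
    print(f"NII median {p50:.0f}, 90% range [{p5:.0f}, {p95:.0f}]")
    print(f"P(NII below alert) = {(nii < alert_threshold).mean():.1%}")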

DELUDED

The budget planning process is characterised by the prevalence of an expert-based approach (with a limited capacity to integrate quantitative models and methodologies, in that not all budget items are developed with the necessary instruments and expertise) and is aimed at forecasting a single baseline scenario (the one under which the budget objectives are formalised and then articulated across organisational units and business combinations).

ENLIGHTENED

The budgetary planning process is very accurate and incorporates the specialist expertise (often cross-functional) required to understand and convey the interactions across managerial phenomena, so as to ensure a full grasp of the bank’s ongoing context. The emphasis is chiefly on the ability to explain the phenomena inside the bank, while the external baseline scenario is taken as ‘given’ by definition.

MISSING

The planning process attempts to consider the impact of alternative scenarios alongside the baseline scenario; however, it is implemented on the basis of imprecise or incomplete modelling, developed without the analytical foundations and instruments required to appraise the consistency and likelihood of these scenarios and thus to sustain such an analysis seriously. The focus remains on comparing the results produced under diverse conditions, while taking into account the approximations used.

Click here to access Prometeia’s white paper