Scottish economist Alec Cairncross once wrote that “there are no more dangerous men in government” than those who think that statistical figures are exact facts. The point Cairncross was driving at (“Economic Forecasting,” The Economic Journal, Dec., 1969) is that a politician who believes that he has exact knowledge of the future will try to shape that future with the same degree of precision as he believes his facts represent.
There is no such thing as exact knowledge of the future. Yet throughout human history, overconfidence in one’s ability to predict—and even more so shape—the future has led many a politician into endeavors with disastrous unintended consequences and incalculable loss of lives, liberty, and property.
This lesson of history is even more important in times of uncertainty.
Consequential decision making
When events happen with great disruptive repercussions, we are all hurled into uncertainty, and our confidence in our plans for the future is shaken or even lost. War is the worst disruptor of them all: it is man-made and therefore preventable, and it is meant to upset life as we know it. Other significant events can have an impact of comparable severity, such as terrorist attacks (9/11 comes to mind) or an economic depression.
Just as man-made disasters can be caused by hubris-minded politicians, those who respond to a disaster can make the mistake that Cairncross points to. In times of uncertainty, a rush to action often leads the decision maker to rely overconfidently on whatever scant information may be available.
To some degree, this is understandable. Leaders of governments and businesses have to make decisions even when the future is engulfed in uncertainty. While many of us have the privilege to postpone major decisions, such as buying a house or starting a new career, heads of government and large corporations must secure the continued operation of the organizations they are responsible for.
Since uncertainty makes errors easy, it is essential that our leaders understand the perils and merits of rational decision-making in times of uncertainty. The repercussions of even one erroneous decision can be catastrophic: a business can go under, and a government can throw its nation into economic and social strife. Wars can be provoked, won, and lost based on a single decision by one person.
Our history has been shaped by many such decisions, both for the better and for the worse. Few examples are as impressive as that of General Eisenhower, who was responsible for the decision to invade Normandy in June 1944. With the outcome of the invasion genuinely uncertain, and with a world-shaping difference between success and failure, Eisenhower is said to have been at that moment the loneliest man in the world.
The impact of uncertainty is often made more pressing by the fact that every generation of leaders, in politics as well as in business, encounters only one, perhaps two episodes of such pivotal importance during their careers. To take one comparatively mundane example: how many of Europe’s current influential legislators, or current heads of government, were in office during the Great Recession 13 years ago? The years 2009-2011 drastically reshaped the economic landscape of many European countries.
Fortunately, there are ways to navigate an uncertain world, both for us as individuals and for people with greater responsibilities. One of the best places to find guidance is in the old-school academic discipline of political economy. For the better part of the 20th century, its practitioners were dedicated to the concept of uncertainty in both theory and practice.
The literature they produced was increasingly marginalized when political economy gradually gave way to what we in modern days know as “economics” proper. For reasons outside the scope of this essay, economists have little to no interest in the concept of uncertainty, but their political-economy predecessors thought it essential to understand how uncertainty influenced human decision-making.
Information, risk, and uncertainty
When uncertainty reigns, making the future sufficiently predictable requires a great deal of humility on behalf of the decision maker (a character trait not always in adequate supply). It also requires information with some degree of reliability. To illustrate the difficulties in obtaining such information in disruptive times, consider as one small example what the European energy supply will look like if Russia wins or loses, respectively, the war in Ukraine.
Most decisions in politics, especially in times of uncertainty, are considerably more consequential than decisions made in isolated industries or markets in the economy. Therefore, reducing uncertainty is essential; the first step is to recognize that in times of uncertainty, information is always proximate and never exact.
Convincing political leaders that they cannot obtain exact information is a challenge on any day; on this point, my professional experience corroborates what Alec Cairncross explained above. Leaders in government tend to believe the opposite, refusing to consider advice unless they are told that the information they get is exact in nature. For this reason, Cairncross warns, by default we should assume that “all forecasts are ipso facto wrong.”
As uncertainty engulfs the future, exact information (such as but not limited to statistical facts) becomes increasingly unattainable. Yet decisions must be made; the forecaster as well as the decision maker must learn to rely on proximate information. This is done first and foremost by means of a distinction between risk and uncertainty.
In much of the academic literature, as well as in conventional wisdom, risk and uncertainty are seen as interchangeable, sometimes even indistinguishable. Economists are notorious for making this mistake, with John Hicks providing a classic example in his “Theory of Uncertainty and Profit” (Economica, May 1931).
Once again, practitioners of political economy come to the rescue, among them British economist George Shackle. In Epistemics and Economics (p. 155) he notes that exact information, or “assured knowledge” in his terminology, is often fragmented and consists of “disconnected” pieces. Much like stars in the night sky, they leave large swaths of uncertainty between them.
The process of filling that void, and thus facilitating decision making under uncertainty, is necessarily indirect and qualitative by default. A decision maker trying to build a confident—or rational—picture of the future, Shackle points out, cannot rely on “rational inference,” as that would be the same as having access to exact information.
So how do we build sufficiently confident information in times of uncertainty?
Parting with exact knowledge
At the heart of every forecast lies the need to predict what others will do, e.g., other government leaders or executives of competing businesses. However, that information is not always readily available, especially when everyone is faced with uncertainty. In those situations, past information cannot adequately inform the future, which—technically speaking—makes it a bad idea to rely on inference from past experience.
Instead of relying on past experience, the best source of information is intentions for the future. Unlike inference, which relies on exact information about the past, deducing from how others plan to act in the immediate future means that our decisions are based on secondary information.
For a thorough examination of this decision theory in the context of business forecasting, see Armen Alchian: “Uncertainty, Evolution and Economic Theory,” Journal of Political Economy, June 1950.
One of the best formulas for decision-making under uncertainty, A Treatise on Probability by John Maynard Keynes, was published in 1921. Originally written as a doctoral dissertation, the book explains probability not as a mathematical problem, but as a logical one. This allows Keynes to develop a concept for decision making that does not rely on quantifiable information. Instead, it is built on the distinction between the ontology and epistemology of knowledge.
Logic and probability
What sounds complicated really isn’t; the basis for Keynes’s reasoning is the classic logical expression “p→q,” in other words “if p, then q.” If I serve my French brother-in-law English wine, then he will be happy.
The statement p→q, Keynes notes, is a statement of certainty: there is no doubt that p→q is true. However, for purposes that will be apparent in a moment, he separates this certainty into two layers:
- The ontological layer—when p happens, q always follows; and
- The epistemological layer—I know that when p happens, q always follows.
To see the relevance in separating them, let us return to my French brother-in-law. It may be ontologically certain that he prefers English wine over French wine, but not knowing his wine preferences I am genuinely uncertain of his reaction. Having him and my sister over for dinner and contemplating whether to serve a bottle of my finest English wine, I can only conjecture his reaction.
In short, it is not epistemologically certain that my brother-in-law will prefer English wine over French wine.
I could solve the problem by calling my sister to ask which wine her husband would prefer. However, if my sister replies that he has never been offered English wine, the first-hand information route is closed. My brother-in-law’s wine preference is uncertain, both ontologically and epistemologically.
There is no way of knowing whether my serving English wine, p, will make my brother-in-law happy, q, or unhappy, ¬q.
The serving of English wine is, of course, a metaphor for an event so unusual that no decision maker can reliably infer a causality from past experience. Without exact knowledge, we cannot be certain whether p→q.
But it gets worse: we cannot even estimate the probability of p→q.
Suppose my French brother-in-law had been served English wine X number of times, and in Y instances he had been happy with it. In this case I could divide Y by X and get a quantitative estimate of the probability that p→q. But in the absence of any prior instances that are even remotely comparable to the present, the concept of probability ceases to be quantitative, or mathematical.
It becomes qualitative, or logical.
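The arithmetic above, and the point at which it breaks down, can be sketched in a few lines of Python. The function name and variables are illustrative, not from Keynes:

```python
from typing import Optional

def empirical_probability(happy_count: int, servings: int) -> Optional[float]:
    """Frequency estimate of the probability that p leads to q.

    happy_count is Y (instances where he was happy with the wine);
    servings is X (times he was served English wine). Returns None when
    there are no prior instances -- the case where a quantitative notion
    of probability ceases to apply and only qualitative, logical
    judgment remains.
    """
    if servings == 0:
        return None  # no comparable past experience: probability is not quantifiable
    return happy_count / servings

# With prior experience, the estimate is simple division of Y by X:
assert empirical_probability(3, 4) == 0.75

# Without any prior instances, no number can be produced at all:
assert empirical_probability(0, 0) is None
```

The `None` branch is the whole point: the formula Y/X is not merely imprecise when X is zero, it is undefined, which is why Keynes reaches for logic rather than mathematics.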
Keynes offers a solution by adding a variable to the causal “if, then” relation. This variable is, so to speak, a probability filter between cause and effect:
p → z → q
We read this as “if p, then q given probability z.”
Interestingly, z has the same two meanings that certainty has: ontological and epistemological. The ontological meaning is simple:
When p happens, q follows by a probability of z
The epistemological meaning is less obvious. It says that
I know that when p happens, the probability for q happening is z
The question is how we can establish this probability. As Alchian points out (reference above), the least unreliable information is evidence of other decision-makers’ plans. They are unlikely to have any more reliable information than we do, but when a head of government evaluates the plans of other heads of government, he can assess to what extent it is reasonable to copy or reject their plans.
By knowing their general character, he may deduce a plan of his own without any further knowledge. This may or may not be a successful path forward, but his reliance on indirect information is a starting point for estimating “z,” i.e., the probability that q will follow p. This set of secondary information allows the decision maker to build an image of the future, to which he adds his own individual qualities and preferences. Eventually, he can judge to what extent this image is a reliable platform for decisions about the future.
However we handle situations of uncertainty, if history and scholarship teach us anything, it is that when humility trumps hubris, the consequences of our decisions are always better. This is perhaps the most profound, but also the most difficult, lesson for humans to learn.