"No amount of sophistication is going to allay the fact that all of your knowledge is about the past and all your decisions are about the future." - Ian Wilson (former GE Exec)
Thank you for joining me for blog 20 (boom!) in the series, highlighting decision-making and the brain. This is my public exploration of what drives decision-making and how we can use that information to make better decisions, resulting in better outcomes.
Today's blog is a list of attributes that have continually come up in my research on forecasting. As an aside, if you want one book that gives you a quick, pithy insight into this area, "Superforecasting" is the one I would go for. It ignores famous forecasters and those with the most degrees on their CVs, and concentrates on people who have successfully made tough forecasts and kept score.
The way these superforecasters have honed their forecasting skills is by regularly making forecasts like:
“Will there be a violent incident in the South China Sea this year that kills at least one person?”
As you can tell from this question, there is not much room for waffle here. You can't talk about 'tensions' or good and evil; a clear prediction is required. I have written previously about vagueness: it robs both the maker and the recipient of a forecast of anything actionable. Superforecasters check whether their predictions were accurate and use that feedback loop to refine their skills. Given that superforecasters have taken the time and effort to test, score and refine their skills, we should heed what they have to say about their art.

Forecasts and Decision-making
From my research, I have concluded that making forecasts is a crucial part of decision-making. In order to compare various courses of action, you have to forecast the implications of each course of action. It is upon those forecasts that you make your decision.
It's probably worth illustrating this point with an example. When you compare two candidates for a job, although it might seem you are comparing one attribute with another, in a very dry and methodical way, there is an element of trying to forecast how this person with a set of distinct skills will fit into the organisation and what the result will be.
Many wise people (Warren Buffett among them) have been vocal about the fact that they do not rely on forecasts. This seems to be based on two premises: 1) you should get your comfort from how much margin of safety you have, rather than from how much weight you place on a long-range forecast; 2) most forecasts are produced by people who have incentives to stack the evidence in favour of a particular course of action. I'm not in the business of betting against Warren, so I think it's important to know when forecasts are useful. In investing, I too have found forecasts to be of limited use, but there are domains where they clearly help. An example would be a government providing an ageing population with the services it needs - having a sense of the capacity required is helpful.
I have split the tips into three distinct categories, a) Approaching forecasting, b) Making good forecasts and c) Improving forecasting.
a) Approaching forecasting:
1) Choose the right things to forecast:
Satellite paths can be predicted with great accuracy; making a satellite prediction is almost direct maths. More complex systems with many dimensions, like politics or the price of oil in two years' time, are very tough to predict. Specifically, trying to predict who will be President of the USA in three elections' time is a low-efficiency game: we have no idea of the political climate or the candidates. The further out and the more complex the system, the less likely your prediction is to be meaningful. More importantly, do not be fooled by forecasts that others make far out in complex systems - they are probably as valuable as a roll of the dice.
2) Break down problems into sub-problems
Break down problems into chunks and try to work out or estimate each part. A lonely man in London called Peter Backus once tried to work out how many potential female partners there were in London for him. It's a bit stalkerish, but a good example...
He started with the population of London (6 million) --> Then calculated the female share (6mn x 50% = 3mn) --> Then worked out the single population (50% x 3mn = 1.5mn) --> Then worked out those of the right age (20% x 1.5mn = 300k) --> Then worked out university graduates (26% x 300k = 78k).
He then applied some heroic assumptions to this estimate to work out those that would be compatible: 78k --> those that would find him attractive (78k x 5% = 3,900) --> those that he would find attractive (5% x 3,900 = 195) --> those with compatible personalities (195 x 10% = 20). So some population statistics and some heroic assumptions got him to 20 (you can decide whether lucky or unlucky) ladies.
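Backus's chain of multiplications is easy to sketch in code. The starting population and fractions below are the ones from his estimate above; they are his assumptions, not verified statistics:

```python
# Fermi estimation: start with a large, known-ish quantity and multiply
# through a chain of estimated fractions, printing each narrowing step.
estimate = 6_000_000  # population of London (Backus's starting figure)

filters = [
    ("female", 0.50),
    ("single", 0.50),
    ("right age", 0.20),
    ("university graduate", 0.26),
    ("would find him attractive", 0.05),
    ("he would find attractive", 0.05),
    ("compatible personality", 0.10),
]

for label, fraction in filters:
    estimate *= fraction
    print(f"after '{label}' filter: {estimate:,.0f}")
```

The value of writing the chain out is that each assumption is exposed for challenge individually - doubling the 5% "attractiveness" guess doubles the final answer, which is easy to see when the steps are explicit.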
3) Strike the right balance between inside and outside views
Let's say you were trying to estimate the length of the 2022 Russian invasion of Ukraine. Whilst it feels like a unique event, it is certainly not the first war of invasion we have seen, and therefore we can employ an outside view. The screenshot below is from Wikipedia, which is a fine place to start.

An outside view means finding a broad category of events similar to the one you are forecasting, and using that information to anchor your estimate. To answer the question of how long the Russian invasion would take, data like that shown above from Wikipedia is readily available. The idea that the Russians would be in and out in a short period of time is not supported by this data set. Most wars have resulted in border changes, weakening of opponents, losses and annexations, and unless there is a specific mandate to weaken an enemy and withdraw, invasions tend to last years.
Once we have a good anchor, we can start the process of adjustment i.e. answer the question, 'how should this conflict differ from the norm?'. Are there any factors that mean this is likely to be a longer or shorter war?
Longer war: in this case we have neighbouring countries, so supply chains are not an issue; we also have a heavily armed Russia facing a well-supplied Ukraine, so neither side looks likely to run out of armaments anytime soon.
Shorter war: Russia might wish to avoid being sanctioned by the West, and there are very important natural resources at stake, creating incentives to make this shorter.
We then, using some art and some science, adjust our estimate higher or lower depending on the weight of evidence.
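The anchor-and-adjust process can be sketched as a tiny calculation. The durations and adjustment factors below are invented placeholders for illustration, not figures from the Wikipedia table:

```python
from statistics import median

# Outside view: anchor on the base rate from a reference class of
# similar events (durations in years -- illustrative numbers only).
historical_durations = [1.5, 2, 3, 4, 5, 7, 9]
anchor = median(historical_durations)

# Inside view: multiplicative adjustments for case-specific factors
# (the factors are subjective judgments, the "art" in the process).
adjustments = {
    "both sides well supplied (longer)": 1.3,
    "sanctions pressure (shorter)": 0.8,
}

estimate = anchor
for reason, factor in adjustments.items():
    estimate *= factor
    print(f"{reason}: x{factor} -> {estimate:.2f} years")

print(f"anchor {anchor} years, adjusted estimate {estimate:.2f} years")
```

The point of the structure is discipline: the anchor stops you from treating the event as wholly unique, and the explicit adjustment list forces you to name each reason for departing from the base rate.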
4) Take ideology off the table
The hedgehogs are "more the big idea people, more decisive," while the foxes are more accepting of nuance, more open to using different approaches with different problems.
When forecasters' methods were observed, two clear styles of forecasting emerged. The first belonged to the 'hedgehogs', who had one big idea: they were confident and decisive, and viewed most things through the narrow lens of that idea. The second style belonged to the 'foxes': more nuanced, more open, more gradual. You won't be surprised to hear that the second group, the foxes, were the better forecasters. Foxes did not have to defend a particular approach they were known for, and were not bound by ideology.
In making practical decisions, or forecasts, ideology is harmful.
b) Making good forecasts:
5) Think in probabilities
Very few things are certain (100%) or impossible (0%), so it is important to have many shades of maybe rather than defaulting to a 50% 'maybe'. One thing many forecasters have learned to do is use their experience to differentiate between probabilities on a spectrum, e.g. the difference between 65% and 75%. These small differences become large when the outcomes are important (e.g. nuclear war) and problems have lots of components.
6) Get the best out of your team
Whilst group dynamics will be the feature of a later post, it is clear from the work done by superforecasters that well-structured teams produce better results. There is wisdom in the crowd when people who have done lots of prep work come together and debate their differences in approach within a non-hierarchical group (different from normal committee structures).
7) Have the right level of confidence
Don't be overconfident: when forecasting the probability of an event occurring, there are two offsetting ideas to bear in mind. We tend to be overconfident when we have performed a lot of calculations to create an anchor, but in reality we need to recognise that our assumptions have lots of room for error.
Don't be underconfident: people feel uncomfortable moving away from their anchor. A common error is not using the unique information of the situation boldly enough and adjusting too meekly.
c) Improving forecasting:
8) React appropriately to new evidence
No sacred cows. One of the biggest decision-making mistakes is failing to use new information to update your thesis. It takes a lot of effort to rethink ideas when disconfirming evidence comes to light, but it is often more damaging not to take the pain while there is still something you can do about it. Whether it is a loss of trust in management or market moves that have gone far further than anyone expected, we need to take in information that challenges our thesis as readily as we revel in information that confirms it.
9) Learn to better forecast by keeping score of previous forecasts
Without a written record of your decisions, the basis for them and the risks to them, you will not fully remember why you put on a position, what later happened and what you learnt from it. New information leaks into our memories of events and we get hindsight bias: something that was unlikely when you first made the decision, and slowly became more likely over time, will probably be remembered as something you were aware of at the time. Used well, this process turns failure into an incredible teacher.
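Keeping score can be made mechanical. A standard way to grade a record of probabilistic forecasts is the Brier score (the squared gap between your stated probability and the 0/1 outcome; lower is better), the metric used in forecasting tournaments. The forecasts below are invented examples:

```python
# A written forecast record: (question, stated probability, outcome).
forecasts = [
    ("Violent incident in the South China Sea this year", 0.20, False),
    ("Candidate X wins the election", 0.65, True),
    ("Oil above $100 by year end", 0.40, False),
]

def brier(prob: float, happened: bool) -> float:
    """Squared error between the forecast probability and the outcome."""
    return (prob - (1.0 if happened else 0.0)) ** 2

scores = [brier(p, outcome) for _, p, outcome in forecasts]
avg_brier = sum(scores) / len(scores)

for (question, p, _), score in zip(forecasts, scores):
    print(f"{question}: forecast {p:.0%}, Brier {score:.4f}")
print(f"average Brier score: {avg_brier:.4f}")
```

A perpetual 50% hedger scores 0.25 on every question, so beating that benchmark over many forecasts is evidence of genuine skill rather than vagueness.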
10) Have humility
Remember, if you care about making good decisions, check your laziness, arrogance, complacency and ego at the door. Your job is to find the right decision, not to defend a previous one or an ideology. Some of the best hedge fund managers have stories about starting a trade, realising it did not work, and then doing the complete opposite. That ability to turn on a dime, by not being attached to a previous decision, is why people trust them with decision-making.
So what?
How is this all relevant to decision-making? Here are three take-aways I want to leave you with before we pick it up next week:
1) When approaching forecasts, focus on decisions that rely on less complex systems, to have a shot at forecasting well. Decisions are best made free from ideology. More complex problems are best unpicked by breaking them down and using parallels from similar historical situations to ground your forecasts.
2) To make good forecasts, think hard about putting a probability on them. Work with teams effectively to get the best result possible from the raw materials around the table. Don't fall into overconfidence after having done lots of work, but allow yourself to adjust forecasts decisively for any differences between this situation and others.
3) To improve your forecasting, start with humility - understand you need to incorporate all available evidence and constantly keep score.
Thank you for joining. Next week is blog 21 and will be 'Highlights from 20 blogs about decision-making'.