Black Swans, and How to Avoid Them

Nassim Nicholas Taleb’s The Black Swan explores the nature of what we perceive as random events, and the logic pitfalls that cause us to miss out on the bigger picture. He calls these seemingly random events, which often have profound consequences for the individual, Black Swans. Black Swans are events thought to lie outside the realm of possibility, and which happen anyway. This concept has tremendous implications for decision making. It primarily exposes the shortcomings of our intuition in making predictions and the dishonest narratives that we tell ourselves regarding our own experience. This blog will elaborate on this idea by identifying two specific blindspots we all share, and simple ways we can resolve them.
By Jeff Dickson
No matter how many white swans we come across, the finding of just one black swan will lead to a surprising rejection of our universal belief that ‘all swans are white’.
— Nassim Nicholas Taleb

The Ludic Fallacy of Trusting our Intuition and Experience

We use our intuition to identify problems, diagnose cause and effect, and make predictions all the time. The problem is that we’re actually terrible at intuiting, and we’re only getting worse. As complexity and the pressure for speed in today’s non-routine work environment increase, we continue to place far too much confidence in our intuition and underestimate our ignorance. This over-reliance on past methods and over-confidence in our intuition and experience is called the Ludic Fallacy. It makes us think we are fully aware of all the risks and probabilities, as if reality were a game of cards or dice. Only in hindsight do we realize that reality is nothing like a game. Unfortunately, rather than making this admission, we kid ourselves by using hindsight to explain away our past failures and missed opportunities. The fact is, we never saw that black swan coming, and it changed our view of the world dramatically. It is just one more indication that we are pitiful at both predicting the future and establishing causes for the present. But can that change? Can we eliminate the Black Swan surprises that stem from over-confidence in our experience and intuition? I believe the answer is yes…mostly. Although we can’t predict everything, we can eliminate the two most glaring areas of blindness—blindness to the obvious (i.e., a lack of knowledge) and blindness to our blindness (i.e., bias).

“We are blind to the obvious and blind to our blindness”
— Daniel Kahneman
Daniel Kahneman, Israeli-American psychologist and laureate of the 2002 Nobel Memorial Prize in Economic Sciences

Blind to the Obvious

In regard to blindness to the obvious, the concept is simple. The more information you have, the less likely you are to be hit by a surprising Black Swan; and the more ignorant you are, the more you are at risk. Imagine making a bet on your favorite horse, Rocket. Because of Rocket’s build, her track record, the skill of the jockey, and the poor competition, you believe that Rocket is the safest bet and gamble everything you own on the horse to win.

Now imagine your surprise when the starting pistol is fired and Rocket not only doesn’t leave the gates but opts instead to simply lie down on the track. This would be a Black Swan event. Given the information you’d gathered, Rocket’s winning was a safe bet, yet you lost everything the instant the race began.

But this event is not a tragedy for everyone. For example, Rocket’s owner made a fortune by betting against his own horse. Unlike you, he had additional information, knowing that Rocket was going to go on strike to protest animal cruelty. Just that small amount of information saved him from having to suffer a Black Swan event.

Blind to Our Blindness

At the same time, even when we have information about how the world works, we tend to cling to it; and that’s not always good. We fall prey to the narrative fallacy, our tendency to weave meaning into unconnected bits of information, in order to organize the past. We use hindsight bias to reorganize events so that they support our beliefs today, and confirmation bias to maintain those beliefs in the future. These biases can be harmless in some instances but catastrophic in others. The Black Swan reminds us of the investor whose experience and statistics were limited to the period 1920–28, ending only one year before the greatest stock market crash in US history. Over that period he had observed a few small dips and peaks, but in general he noticed that the trend of the market was upward. So, thinking that this trend must continue, he spent his life savings on stocks. The next day, the market crashed and he lost everything.

Opening our Eyes and Preventing Black Swans in 3 Steps

1. Admit it. You’re biased

Simply knowing what you don’t know is a powerful first step in avoiding Black Swan moments. It is the best defense against falling into the cognitive traps and fallacies we use to make sense of the world. Knowing that you are subject to cognitive bias, like everyone else, makes it much easier to recognize when you’re only looking for information that confirms what you already believe to be true. Likewise, knowing that we humans like to organize everything into neat, causal narratives, and that this kind of approach oversimplifies the complexity of the world, makes you more likely to search for further information and gain a better view of the “big picture.”

2. Add a Framework

At ALLOY, when we recognize a good strategy or way of thinking being leveraged, we simply ask, “What is the framework you are using?” In contrast, if a framework is missing from the conversation, we go looking for one in order to add to our knowledge base and/or fight our natural bias. This is a great way to remove blindness from a conversation and learn new information at the same time. As a bonus, when there is a simple framework that can be leveraged, it is typically far easier to train others on it than to force the adoption of an individual’s personal preference.

Edward de Bono, Maltese physician, psychologist, philosopher, author, and inventor, best known for lateral thinking and the book Six Thinking Hats

No way of looking at things is too sacred to be reconsidered. No way of doing things is beyond improvement.
— Edward de Bono

3. Go wider, before you go deeper

Finally, rather than feeding our desire to see events in clear-cut cause and effect, or reducing our knowledge base to one tried-and-true framework, it’s better to consider a number of possibilities without being married to any single one (see our article on rapid option making). Edward de Bono calls this Lateral Thinking—a creative approach to problem solving that encourages multiple viewpoints rather than traditional vertical logic. Too often, when we rely on our intuition alone, we produce ideas that are similar to or derivative of previous ones. For this reason, we suggest using multiple frameworks as stimuli for a much broader level of creativity. This way, not only is less bias permitted, but the ideas are uniquely different, and many times they are better ideas altogether. For instance, imagine using one simple framework for setting business goals, such as SMART goals. While you may avoid intuitive errors like forgetting to make the goal specific and measurable, another framework, such as the Hedgehog Concept, could add a new viewpoint and lead to a much more creative and stronger goal.