The Black Swan explains why we are so bad at predicting the future, and how unlikely events dramatically change our lives if they do happen, as well as what you can do to become better at expecting the unexpected.
The only point of critique I have to make about Nassim Nicholas Taleb is that I wish he’d publish more books faster. The way he thinks is marvelous and there’s so much to learn from him. This book is the second volume of his 4-volume body of work on uncertainty, called “Incerto.” Antifragile is the fourth.
First of all, this has nothing to do with the (in my opinion good) movie Black Swan. This book is about statistics, probability, and how we often misuse them to estimate the likelihood of real-life events.
One of my friends studied economics in college. He didn’t like it. He always said: “All of those models don’t stand a chance in the real world. They’re too narrow, none of this stuff is actually realistic.” Yet, we use narrow models to predict reality all the time – which is why we’re so bad at it.
Here are 3 lessons from The Black Swan to help you get better at expecting what others don’t:
- Because Black Swans are always unexpected, they dramatically change the world of those who are not prepared for them.
- Never try to explain the future by looking at your past; it's a bad indicator.
- If you try to gauge real-world risk like you would in a game of cards, you’ll likely make bad decisions.
Are you prepared to find out what you don’t know? We’ll see!
Lesson 1: Black Swans dramatically change the reality of those who aren't aware that they're coming.
Nassim Taleb calls an event a “Black Swan” if it’s unpredictable not because it’s random, but because our outlook on what’s possible was too narrow. The name stems from the fact that up until 1697, mankind believed all swans were white. But when Dutch explorers finally saw black swans for the first time in Western Australia, the term morphed into describing an event that occurred in spite of seeming impossible.
As a logical consequence, those who are least aware that a Black Swan is coming suffer the most from its consequences, which are often extreme.
Imagine you’d known about the 9/11 attacks, the 2008 financial crisis, or Hurricane Katrina in advance. You wouldn’t have been shocked and surprised. In some cases, a Black Swan is only a tragedy for a single person.
For example, say John bets on his favorite horse, Onyx, at the racetrack because he knows Onyx is in great health, has a solid track record, a skilled jockey, and weak competition. When the race starts, Onyx doesn’t move an inch and instead lies down, and John is devastated to lose all his money.
But what’s a Black Swan for John can be the deal of a lifetime for Tony, the owner of the horse, who knew in advance that Onyx would protest and bet against his own horse.
However, Black Swans often affect entire societies, or even the whole world. Just think of Copernicus’s discovery that the earth revolves around the sun, not the other way around, or the moment Neil Armstrong set foot on the moon.
Lesson 2: Don’t use your past to explain the future.
One of our biggest errors is our tendency to predict the future by using our past as an explanation. Based on the only thing we can be certain of – what has happened in our lives so far – we weave a narrative that makes sense and expect that the future simply must unfold the same way.
But there are many unknown factors that could change it.
For example, imagine you’re a turkey. For years you live on a farm, roam free every day, and are fed great food by a farmer. Why would you expect anything to change? But if tomorrow is Thanksgiving, you’re just 24 hours away from being killed, stuffed, and roasted.
The very same thing happened to everyone who lost a lot of money in the financial crisis: people believed markets would go up forever, because that’s all they had done for years, and when they suddenly didn’t, everyone was shocked.
Lesson 3: Trying to assess real-world risk like you would in a game can lead you to make the wrong choices.
Another fallacy Taleb describes is the ludic fallacy. It explains, for example, why we do such a bad job of picking the right insurance policy.
When faced with the task of assessing risk in the real world, we usually imagine it like a game, with a fixed set of rules and probabilities we can determine up front, and then try to make the right decision based on them.
However, very often, this isn’t possible. You can’t just add all the probabilities for getting certain diseases or having a particular accident and then say: “Okay, based on this, I’ll get insured for X amount.”
For example, imagine you observe a coin-flip game where the dealer tells you the coin is fair (i.e., it lands on heads or tails 50:50), but it comes up heads 99 times in a row. Would you really believe the odds are still 50:50 on the next toss?
Statistically speaking, the odds haven’t changed, but any reasonable person would assume that the coin is rigged and bet heads. If unlikely events happen more often than they statistically should, you must question the assumptions of your model.
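The reasoning above can be sketched in a few lines of code. This is a minimal, hypothetical two-hypothesis model (the function name, prior, and rigged-coin probability are my own assumptions, not from the book): even if you start out 99.9% sure the dealer is honest, 99 heads in a row should all but destroy your belief that the coin is fair.

```python
def posterior_fair(n_heads, prior_fair=0.999, p_heads_rigged=0.99):
    """Probability the coin is fair after seeing n_heads heads in a row.

    Assumes just two hypotheses (an illustrative simplification):
    - a fair coin: P(heads) = 0.5
    - a rigged coin: P(heads) = p_heads_rigged
    """
    like_fair = 0.5 ** n_heads          # P(data | fair coin)
    like_rigged = p_heads_rigged ** n_heads  # P(data | rigged coin)
    # Bayes' rule: weight each likelihood by its prior, then normalize.
    evidence = prior_fair * like_fair + (1 - prior_fair) * like_rigged
    return prior_fair * like_fair / evidence

# Even with a 99.9% prior that the coin is fair, 99 straight heads
# leaves a vanishingly small posterior probability that it's fair.
print(posterior_fair(99))
```

The point isn't the exact numbers, which depend on the assumed alternatives, but that a reasonable observer updates the model itself when the data contradicts it, instead of clinging to the stated 50:50 odds.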
For example, casinos invest heavily in security against robberies and throw out suspicious players. But a much bigger risk might be that someone kidnaps the owner’s daughter to blackmail him, or that an employee forgets to file taxes, triggering a huge criminal investigation.
It’s hard for us to assess risk accurately in the real world, but oversimplifying it only makes it worse.
My personal take-aways
Wow, I could’ve shared a ton more lessons. This book is fantastic. It really makes you aware of how your brain fails you, and as always, that awareness is the first step. An absolute must-read!
Buy this book: https://amzn.to/2BGBQUQ