The Black Swan Summary


The Black Swan explains why we are so bad at predicting the future, how unlikely events dramatically change our lives when they do happen, and what you can do to become better at expecting the unexpected.

The only critique I have of Nassim Nicholas Taleb is that I wish he’d publish books faster. The way he thinks is marvelous, and there’s so much to learn from him. This book is the second volume of his 4-volume body of work on uncertainty, called “Incerto.” Antifragile is the fourth.

First of all, this has nothing to do with the (in my opinion good) movie Black Swan. This book is about statistics, probability, and how we often misuse them to estimate the likelihood of real-life events.

One of my friends studied economics in college. He didn’t like it. He always said: “All of those models don’t stand a chance in the real world. They’re too narrow, none of this stuff is actually realistic.” Yet, we use narrow models to predict reality all the time – which is why we’re so bad at it.

Here are 3 lessons from The Black Swan to help you get better at expecting what others don’t:

  • Because Black Swans are always unexpected, they dramatically change the world of those who are not prepared for them.
  • Never try to explain the future by looking at your past; it’s a bad indicator.
  • If you try to gauge real-world risk like you would in a game of cards, you’ll likely make bad decisions.

Are you prepared to find out what you don’t know? We’ll see!

Lesson 1: Black Swans dramatically change the reality of those who aren’t aware they’re coming.

Nassim Taleb calls an event a “Black Swan” if it’s unpredictable not because it’s random, but because our outlook on what’s possible was too narrow. The name stems from the fact that up until 1697, Europeans believed all swans were white. But when Dutch explorers saw black swans for the first time in Western Australia, the term morphed into describing an event that occurs in spite of seeming impossible.

As a logical consequence, those who are least aware of a Black Swan coming will suffer the most from its often extreme consequences.

Imagine you’d known about the 9/11 attacks, the 2008 financial crisis, or Hurricane Katrina in advance. You wouldn’t have been shocked and surprised. In some cases, a Black Swan is a tragedy for only a single person.

For example, if John bets on his favorite horse, Onyx, at the racetrack because he knows Onyx is in great health, has a solid track record, a skilled jockey, and poor competition, he’ll surely be devastated when he loses all his money – because when the race starts, Onyx doesn’t move an inch and lies down instead.

But what’s a Black Swan for John can be the deal of a lifetime for Tony, the owner of the horse, who knew in advance that Onyx would protest and bet against his own horse.

However, Black Swans often affect entire societies, or even the whole world. Just think of Copernicus’s discovery that the earth revolves around the sun, not the other way around, or Neil Armstrong setting foot on the moon.

Lesson 2: Don’t use your past to explain the future.

One of our biggest errors is our tendency to predict what will happen in the future by using our past as an explanation. Based on the only things we can be certain of – what has happened in our lives so far – we weave a narrative that makes sense and expect that the future simply must unfold this way.

But there are many unknown factors that could change it.

For example, imagine you’re a turkey: for years you live on a farm, get to roam free every day, and are fed great food by a farmer. Why would you expect anything to change? But if tomorrow is Thanksgiving, you’re just 24 hours away from getting killed, stuffed, and roasted.
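
If you like seeing this in numbers, here’s a toy sketch in Python – my own illustration with made-up figures, not something from the book – of how confident the turkey’s backward-looking model gets, right up until it fails:

```python
# The turkey's "model": estimate tomorrow purely from past frequency.
# The 1,000-day history is a made-up figure for illustration.

fed_history = [True] * 1000   # 1,000 straight days of being fed

def naive_confidence(history):
    """Estimated probability of being fed tomorrow, based only on the past."""
    return sum(history) / len(history)

print(naive_confidence(fed_history))   # 1.0 -- the model has never seen a Thanksgiving
fed_history.append(False)              # Thanksgiving arrives
print(naive_confidence(fed_history))   # ~0.999 -- but the turkey only dies once
```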

The very same thing happened to everyone who lost a lot of money during the financial crisis – people believed markets would go up forever, because that’s all they had done for years, and when they suddenly didn’t, people were surprised.

Lesson 3: Trying to assess real-world risk like you would in a game can lead you to make the wrong choices.

Another fallacy Taleb describes is the ludic fallacy. It explains, for example, why we do such a bad job of picking the right insurance policy.

When faced with the task of assessing risk in the real world, we usually imagine the risk like a game, where there’s a set of rules and probabilities we can determine up front in order to then make the right decision.

However, very often, this isn’t possible. You can’t just add all the probabilities for getting certain diseases or having a particular accident and then say: “Okay, based on this, I’ll get insured for X amount.”

For example, if you observe a coin flip game where the dealer tells you the coin is fair (i.e. lands on heads or tails 50:50), but it comes up heads 99 times in a row, would you really believe the odds are still 50:50 on the next toss?

Statistically speaking, the odds haven’t changed, but any reasonable person would assume that the coin is rigged and bet heads. If unlikely events happen more often than they statistically should, you must question the assumptions of your model.
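
Here’s a minimal sketch of that reasoning in Python, using Bayes’ rule. The 1% prior probability of a rigged coin is an assumption I picked for illustration, not a number from the book:

```python
# Bayes' rule applied to the dealer's coin: how likely is "rigged"
# after a streak of heads? The prior is a made-up assumption.

def posterior_rigged(heads_in_a_row, prior_rigged=0.01):
    """Probability the coin is rigged (always heads) given the streak."""
    p_streak_if_rigged = 1.0                    # a rigged coin always shows heads
    p_streak_if_fair = 0.5 ** heads_in_a_row    # a fair coin almost never does
    numerator = p_streak_if_rigged * prior_rigged
    denominator = numerator + p_streak_if_fair * (1 - prior_rigged)
    return numerator / denominator

for n in (1, 5, 10, 99):
    print(f"{n:>3} heads in a row -> P(rigged) = {posterior_rigged(n):.10f}")
# After 99 heads, P(rigged) is indistinguishable from 1. The lesson:
# question the "fair coin" assumption, not the streak.
```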

For example, casinos have a lot of security against robberies and throw out suspicious players. But maybe a much bigger risk is that someone kidnaps the owner’s daughter to blackmail him, or that an employee forgets to file taxes, leading to a huge criminal investigation.

It’s hard for us to assess risk accurately in the real world, but oversimplifying it only makes it worse.

My personal take-aways

Wow, I could’ve shared a ton more lessons. This book is fantastic. It really makes you aware of how your brain fails you, and as always, that awareness is the first step. An absolute must-read!

Buy this book: https://amzn.to/2BGBQUQ

Superfreakonomics Summary


Superfreakonomics reveals how you can find non-obvious solutions to tricky problems by focusing on raw, hard data and thinking like an economist, which will get you closer to the truth than everyone else.

SuperFreakonomics, published in 2009, is the follow-up to the insanely popular Freakonomics by Steven Levitt and Stephen J. Dubner. Like all of their co-authored books, it takes an economic approach to what’s going on in the real world, which means using statistics and hard data to find out what really drives human behavior.

This book tackles interesting topics like prostitution, terrorism and global warming. However, the lessons I want to share with you are on a more general level, because I think that will help you the most to really embrace the ideas of this book.

Here are 3 lessons to help you see the world as clearly as possible:

  • Incentives rarely work out as planned.
  • You can find simple solutions to tricky problems by zooming out.
  • There’s no such thing as too much data. Always collect as much as you can.

Do you think predicting human behavior is tough? Here’s a filter that’ll help you find the truth more often than ever!

Lesson 1: Even the best incentives don’t work out as planned and always come with side effects.

Even though we can’t see into people’s heads and find out exactly what makes them tick, we’re still driven by the power of incentives all the time. Governments, companies, schools, even just other people constantly try to get us to do things by dangling certain rewards in front of us.

The idea is simple enough: You promise someone reward B for performing action A and hope that everyone in your target group shows the desired behavior.

However, there’s a hidden force at play here: the law of unintended consequences. Dubner and Levitt use this phrase to describe the unplanned behaviors that show up after an incentive is introduced.

For example, in Germany, the government keeps trying to get people to produce less waste with fun ideas like picking up trash only once every three months, downsizing trash cans or introducing volume-based fees. So far, all of these ideas have backfired horribly, leading only to creative ideas on how to avoid the new systems, for example by dumping trash in the woods or flushing food down the toilet.

But even the incentives that do work won’t do so for everyone, and all of them will have some side effects. You might get your kids to do the dishes if you promise them $1 for every cleaned plate, but that might also lead them to clean plates that aren’t even dirty, or to expect money for other household chores.

Don’t forget: all incentives have intended and unintended consequences!

Lesson 2: Simple solutions to tricky problems are often hidden on a more generic level.

What do scientists do if they can’t find the solution to a problem? They collect data! Data always helps you find a solution, but it might not do so in the way that you think.

For especially difficult problems, the solution often lies one level above the realm of the issue itself. A single data point, an extreme value or outlier, or the data points you thought you’d collect, but didn’t, often tell a much more revealing story than the “normal” data.

This is related to a phenomenon called omitted variable bias, which means you’ve forgotten to even include one of the most important factors in your analysis. For example, there’s a correlation between a country’s chocolate consumption and its number of Nobel prize laureates, but that doesn’t mean eating chocolate makes you more likely to win a Nobel prize. It just so happens that wealthier countries with a higher level of education also spend more money on luxury foods.
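
To see how an omitted variable manufactures a correlation out of nothing, here’s a small simulation – my own illustration with invented numbers, not data from the book:

```python
# "Wealth" (the omitted variable) drives both chocolate consumption and
# Nobel counts, so the two correlate despite having no causal link.
import random

random.seed(42)
wealth = [random.uniform(0, 100) for _ in range(500)]        # hidden confounder
chocolate = [0.5 * w + random.gauss(0, 5) for w in wealth]   # driven by wealth
nobels = [0.1 * w + random.gauss(0, 2) for w in wealth]      # also driven by wealth

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

print(correlation(chocolate, nobels))   # strongly positive -- yet neither causes the other
```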

Similarly, it’s sometimes easier to find solutions that prevent a problem from occurring in the first place, rather than to solve it once it’s present.

For example, Ignaz Semmelweis discovered how to reduce maternal mortality after childbirth by comparing his hospital, which had an autopsy department, with another one that didn’t. Even before germ theory was developed, he figured something must happen during autopsies that gets young mothers infected, and thus advised doctors to wash their hands – which worked like a charm.

The car seatbelt, developed in the 1950s, follows the same logic: why not keep the head from being flung around in the first place, rather than trying to make it land softly upon collision?

So when you face a complex problem, zoom out, take a step back and look outside the realm of standard data.

Lesson 3: You can never have enough data, so always collect as much as you can.

Lastly, since the most valuable pieces of data are the ones you rarely observe, plus those you find missing from the normal picture, you need a lot of data to find them.

The more data you have, the more counter-intuitive findings will emerge, so always collect as much data as you can.

For example, with Four Minute Books I’ve built in plenty of places where people can leave feedback, such as the book suggestion form, the prompt to reply to my very first email, plus integrated surveys into email sequences, like Time 2 Read.

That way, I’m always collecting more data on autopilot, which I can then combine with specific questions at certain points in time, for example the survey I sent out before creating Time 2 Read in the first place.

Data never hurts – quite the contrary – so make sure you’re always getting more of it!

My personal take-aways

Statistics is a topic that really speaks to me in 2016. There’s something about reducing issues to numbers that forces you to think rationally and look at what’s actually going on, instead of guessing based on your own muddled feelings and intuition. In a hyper-sensitive, trash-news-driven world, whoever can do that best wins, and I really want you and me to win – so I’m recommending this book to you.

Skin In The Game Summary


Skin In The Game is an assessment of asymmetries in human interactions, aimed at helping you understand where and how gaps in uncertainty, risk, knowledge, and fairness emerge, and how to close them.

He must be among the top 10 most interesting accounts on Twitter: Nassim Nicholas Taleb. Besides lots of intellectual insights and witty tweets, he shares a lot of strong opinions, albeit weakly held. Having spent a great deal of his life examining how we can think more rationally, he is quick to argue for a cause, but not afraid to change his mind. Previous bestsellers from the former investment banker turned scholar and statistician include Fooled By Randomness, Antifragile and The Black Swan.

Taleb’s latest book, Skin In The Game, is a continuation of his Incerto series, which the past three books are part of, and thus a remix of past themes: rationality, uncertainty, statistics, economics, information, risk and morality. However, it is his most practical work to date, using anecdotes and analogies to highlight how gaps between these factors affect our everyday lives. By showing us hidden asymmetries and who’s got the most to lose in certain situations, Taleb helps us make better decisions and get the outcomes we want.

Here are 3 striking lessons from one of the clearest thinkers I know:

  • The minority often rules the majority.
  • How competent winners have to be depends on the industry we work in.
  • Rich people are easier to scam, because they have less to lose than the people selling to them.

If you’ve been scammed before, ended up with a low-quality product, or are fed up that everyone seems to get the better of you in negotiations, this summary is for you. Let’s see who really has skin in the game!

Lesson 1: It is not uncommon for the minority to rule the majority.

One of the many asymmetries Taleb has found lies in the consumption behavior of societies. It shows that often, whoever has the most skin in the game can win, even if the odds are stacked against them. This particular concept is called the minority rule, and it implies that societies will adjust demand for certain goods based on an inflexible minority, rather than on what the majority wants.

For example, 70% of the lamb meat the UK imports from New Zealand is processed according to halal standards. However, only 4% of the population is Muslim, which is the main group that demands halal meat. How can such a small segment account for such a large share of imports? Well, minority rule is usually a result of the majority being flexible or indifferent. In this case, non-Muslims don’t mind eating halal meat over non-halal meat as there is no difference in taste.

The majority doesn’t care, so the minority gets what it wants. Similarly, genetically modified food firms have a hard time advertising. Their products don’t have a clear advantage, and most people are happy to eat non-genetically modified food, so the small but hardcore group of anti-GMO activists gets the better of the situation and drives consumption.

Lesson 2: Successful people don’t always have to be competent, because how we evaluate winners is industry-dependent.

If we’re honest, we’ve all used this excuse before, or at least thought it: we claim someone got somewhere before us because they looked the part and we didn’t. While that may or may not have been true in your situation, Taleb claims the relationship between success and image isn’t the same in all industries. In professions where skin in the game is necessary to succeed, an awkward image can be a sign of someone who’s more prolific; where it isn’t, subjective factors decide who will win.

For example, a lawyer who is extremely sought after, but dresses very sloppily, is bound to have proven herself in court over and over again. If she didn’t win cases, people wouldn’t hire her. The same is true for surgeons, authors whose books sell like crazy, and elite soldiers: without stellar results, they’d never have made it.

On the other end of the spectrum are CEOs, politicians, and bankers. While being good at what you do won’t hurt your chances here, the main prerequisite for ranking high up the ladder is being perceived as capable, because many votes – and thus people’s opinions – are involved in getting the part.

Lesson 3: When selling to rich people, the risk-reward ratio of the transaction is off, which leads to more scams.

Every now and then I hear another story of an elderly person who was scammed by some door-to-door salesman. Either the product was inferior, they grossly overpaid, or the seller sold them a suite of features they didn’t need. Besides the fact that we think and react more slowly as we get older – a real disadvantage in such a surprise sales situation – the main reason such con artists target older people is that they tend to have more money.

This extends to rich people in general. The more money you have, the easier it is to part you from some of it, especially if the portion is small compared to your overall fortune. In this case, the risk-reward ratio is skewed in favor of the seller. They have a lot more skin in the game, because if they sell you a mansion for $4,000,000 rather than $1,000,000, they’ll get four times the commission. For someone worth $200,000,000, however, it doesn’t matter much whether they drop one million on a home or four.
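
As a back-of-the-envelope calculation with the numbers from this example (the 5% commission rate is my own assumption for illustration):

```python
# How much each side has at stake in the mansion sale.
COMMISSION_RATE = 0.05          # assumed; actual rates vary
BUYER_NET_WORTH = 200_000_000   # from the example above

for price in (1_000_000, 4_000_000):
    seller_stake = COMMISSION_RATE * price
    buyer_share = price / BUYER_NET_WORTH
    print(f"price ${price:,}: seller earns ${seller_stake:,.0f}, "
          f"buyer spends {buyer_share:.1%} of net worth")
# The seller's payoff quadruples; for the buyer it's 0.5% vs. 2% of
# net worth. That asymmetry is the seller's skin in the game.
```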

As we get richer, the risk of spending more money on individual purchases goes down, and we’d rather be safe than sorry. That’s why we throw more dollars after trivial items than we should. Even those of us who aren’t millionaires.

My personal take-aways

Taleb is out for truth. He reasons from the ground up, relies on the scientific method, yet paints colorful pictures with stories that are never boring to read. With Skin In The Game, he’s doing his part to help our generation tackle important problems we care about, instead of veering off into careers with lots of dollars, but little responsibility. I think that’s a good thing.

Buy this book: https://amzn.to/2SIKRY4

Freakonomics Summary


Freakonomics helps you make better decisions by showing you how your life is dominated by incentives, how to close information asymmetries between you and the experts that exploit you and how to really tell the difference between causation and correlation.

It never fails to baffle me how cheap learning has become. Get this: For $3.99 you could watch a 90-minute movie that details some of the best economic research of the 21st century with great animations, scenes and explanations. Ridiculous!

There are three main themes in the book. One is incentives: how we react to rewards and punishments. Another is information asymmetry: which consequences arise from various gaps in knowledge, and how we try to compensate for them. The last big theme is causation vs. correlation, and how we often try to explain things the wrong way.

I decided to pick one lesson from each sector, so you’ll get a good idea of what the book’s overall message is. Here are 3 lessons to help you make better decisions:

  • Three kinds of incentives dominate your life.
  • Experts are often incentivized to exploit the fact that they know more than you.
  • Just because two things happen simultaneously doesn’t mean that one causes the other.

Ready to adjust your choice barometer? Time for some freakonomics!

Lesson 1: There are three kinds of incentives that dominate your life’s choices.

Incentives have been dangled in front of your nose all of your life. From “if you finish your plate you’ll get some pudding” as a child to “if you sell 100 cars this quarter you’ll get a 25% bonus” all the way to “if you don’t stop harassing the cleaning lady, we’ll put you in a home, grandpa!”

An incentive is meant to get you to do more of a good thing or less of a bad thing, and is used by anyone who tries to influence your behavior.

Stephen Dubner and Steven Levitt say there are three kinds of incentives:

Economic – usually involving gain or loss of time and/or money.

Social – when you stand to look good in front of your peers, or risk being isolated from them.

Moral – appealing to your conscience and inner drive to do the right thing.

The more types of incentives are combined, the more powerful the incentive gets.

For example, the disincentive (a negative incentive – the stick from the carrot-and-stick approach) to commit a crime is pretty strong. You could lose your job, house, and personal freedom (economic), it’s one of the most morally reprehensible things you can do (moral), and of course, you’d lose your friends and your reputation would go down the drain (social).

Lesson 2: Experts are often incentivized to abuse the fact that they know more than you do.

In any transaction between humans, incentives are the driving force at play. So the moment you figure out what makes the person across the table act the way they do (and know the same for yourself), you can make better decisions.

Sadly, lots of systems incentivize us to cheat. Information asymmetry is one of these. We all need the help of an expert sometimes. When your knee hurts, you go to a specialist doctor, your hair is cut by a professional, and when you want to sell your house, you call a real estate agent.

For that last example, the system in place is a strong economic incentive – the agent gets a commission of the final sale price and should therefore try to maximize the selling price just as much as you, right?

Well, some simple math reveals that it’s often better for the agent to abuse the fact that she knows more than you and get you to sell quickly. If your agent can get you an offer for $100,000 within two weeks, and she gets a 10% commission, that’s $10,000 in two weeks. If it takes her another two weeks to get an offer for $120,000, that results in just $2,000 more for her – a 20% increase in income for a 100% increase in time spent, bringing her rate down to just $6,000 per two weeks.
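
The skew becomes obvious when you work out the agent’s earnings per week; this sketch just recomputes the numbers from the example above:

```python
# The agent's incentive, per week, for the two scenarios in the text.
COMMISSION = 0.10

quick_sale = COMMISSION * 100_000     # $10,000 after 2 weeks
patient_sale = COMMISSION * 120_000   # $12,000 after 4 weeks

print(quick_sale / 2)     # $5,000 per week if she pushes the quick sale
print(patient_sale / 4)   # $3,000 per week if she waits for the better offer
# Waiting earns her only $2,000 more for double the time,
# so her incentive is to get you to sell fast and cheap.
```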

Using the information she has that you don’t, she can get you to sell quicker and cheaper, so she can make more money. Studies have shown that when agents sell their own houses, they usually leave them up for sale a lot longer and also get higher prices.

Lesson 3: Just because two things happen simultaneously doesn’t mean that one causes the other.

When something like the above example happens, we usually think we’re smart and can put two and two together quickly. But more often than not, two and two ends up making five, because we’ve confused causation with correlation.

For example, if on the 31st of the month a car dealer offers you a great deal on a car, you’d probably suspect that he only does so to sell one more car, meet his quota, and get a fat bonus, no matter how crappy the car is, right?

But you can’t possibly know what causes him to offer you that deal, just because the deal correlates with the last day of the month.

Maybe he’s made a commitment to himself to improve his selling skills and double his sales that month, and this is the last car he needs. Maybe he’s made a promise to his wife to sell enough cars to be able to afford day care for his newborn son. Or maybe he’s just happy that his boss gave him some wiggle room with prices, because they want to increase customer satisfaction and loyalty.

A very popular example of this is money’s influence on election outcomes. We all think whoever spends the most gets the most votes. But actually, successful candidates could cut their budget in half and lose only 1% of votes (and vice versa, losing candidates could double their budget and gain only 1% more votes). In reality, voters simply want to influence a close match or back a clear favorite. Even though money is correlated with election outcomes, it isn’t their cause.

My personal take-aways

I could’ve written 3 lessons for every blink here. This was one of the most well-executed summaries I’ve read so far. If I were to do it all over again, I would read the summary, watch the movie, and then read the entire book. There’s so much to learn about making good decisions – this has the same life-changing potential The Paradox Of Choice does.

Doughnut Economics Summary


Doughnut Economics is a wake-up call to transform our capitalist worldview obsessed with growth into a more balanced, sustainable perspective that allows both humans and our planet to thrive.

Kate Raworth is an Oxford economist who suggests an update to economics for the 21st century, one that accounts not just for our well-being and prosperity, but for that of our planet as well. Here’s the gist of her ideas in 3 lessons:

  • Our economy isn’t a closed market system.
  • To cover both our needs and our planet’s, we can think of economics as a doughnut.
  • The first step to actually turning our economy into a circle is focusing on reusability.

Whether you’ve never studied economics or consider yourself a member of the school of rational thought, here’s a new, useful model you can understand in just a few minutes!


Lesson 1: The prevailing economic market model has 4 major flaws.

Among the thousands of economic diagrams humans have developed over the centuries, one has really cemented itself not just into textbooks, but into the backs of our brains: the circular flow of labor, capital, goods, and services. It usually pictures two parties, firms and households, who exchange work for wages, rent, and dividends. The money then flows back to the firms as consumer spending on goods and services.

According to Kate Raworth, there’s only one problem with it: it’s wrong. She names four major factors that this diagram, which dominates economic education, neglects:

Ecological context. Our economy is embedded in the environment. We draw on the planet’s resources, like sunlight and water, and turn them into pollution and waste.

Parenting. Ungodly numbers of hours go into helping our children, caring for our families, and maintaining our households. All of them add to the economy, but aren’t accounted for anywhere.

Unpaid work. We’re social. We like doing things for others, even for free. Look at Wikipedia. Or Reddit. Or your simple neighborhood quid-pro-quo. But all of that adds value too.

Inequality. GDP growth is hailed as the holy grail all nations should strive for, but so far, it’s failed to eliminate inequality and, in most cases, has widened the gap between the rich and the poor.

But if our economic theories are really that outdated, how can we update them? Of course, Kate doesn’t just point out the problem, but also suggests a solution.

Lesson 2: The Doughnut model can help us maintain our social foundation without breaking through our planet’s ecological ceiling.

To account for the missing factors in classic economic theories, Kate proposes a model she calls ‘the Doughnut of social and planetary boundaries.’ It looks like this:

[Image: the Doughnut model of social and planetary boundaries]

Inside the doughnut’s hole lies our social foundation, which consists of 12 basic human needs, like water, food, justice, or an education. Around the doughnut sit nine planetary boundaries, which represent our ecological ceiling. If we overshoot on things like ocean acidification, land conversion, air pollution, and climate change, we’re hurting the planet to the point where it won’t be able to sustain us in the future.

Therefore, the ideal space for our economies to be in is the ‘dough’ of the doughnut, the space right between the social foundation and the ecological ceiling. As long as we’re in that safe and just space, we’re satisfying our needs while maintaining earth’s health.
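
If it helps to think in code, the Doughnut boils down to a two-bound check per indicator. Here’s a minimal sketch where the metric names and bounds are hypothetical placeholders, not Raworth’s actual indicators:

```python
# An indicator is "in the dough" when it clears the social foundation
# (the floor) without breaching the ecological ceiling. Values are made up.

doughnut = {
    # metric: (social_foundation_floor, ecological_ceiling)
    "food_supply": (2500, 3500),    # e.g. kcal per person per day
    "air_quality": (0, 25),         # e.g. fine-particle concentration
}

def in_the_dough(metric, value):
    """True if value sits between the social floor and the ecological ceiling."""
    floor, ceiling = doughnut[metric]
    return floor <= value <= ceiling

print(in_the_dough("food_supply", 2800))   # True: needs met, ceiling respected
print(in_the_dough("air_quality", 40))     # False: ecological overshoot
```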

We’ve already overshot in several important dimensions, which means so far, we’re not doing enough!

Lesson 3: In order to make our economy truly circular, we need to focus on maximizing the reusability of goods and services.

When I was little, my grandma had a special trash can at her house, with a logo consisting of three cyclic arrows on it. It was the recycling bin, and only a small selection of waste and wrapping materials went in there. Today, recycling is a global movement and that’s a good thing, Kate says.

In order to make our economy sustainable, it has to be truly circular, not just in some dusty, theoretical model. Making products, goods, and services reusable is a great place to start. Here are some of the initiatives going on around the world:

In San Francisco, composting is mandatory.

Argentinians turn trash into art.

Germany has long been recycling bottles and companies like BMW recycle their own cars.

This little fellow has made $21,000 from recycling at just seven years old.

As you read this, scientists are working on extremely useful ways of using our waste, like turning plastic into fuel. But until they crack some of these incredibly hard problems, there are lots of small steps every one of us can take. And most of them don’t require knowing much about economics at all.

My personal take-aways

It’s hard for idealistic ideas like this one to find their way into the everyday workings of capitalism, but I think the environmental perspective is the right one to take. While they’re hard to find and take long to figure out, ideas that are not just ecologically reasonable but also highly profitable do exist. So whatever business project you tackle next, take a look at it through the lens of Doughnut Economics. Our planet might thank you for it.
