Everything Is Obvious shows you that common sense isn’t as reliable as you think it is, because it often fails us when we try to make predictions, and how you or your company can make better decisions with more scientific, statistically grounded methods.
Duncan J. Watts is a principal researcher at Microsoft Research, exploring social phenomena in an online world. He’s best known for replicating an experiment Stanley Milgram ran in the 1960s, called the small-world experiment.
Milgram sent 300 letters to random people in the United States, all with the goal of being forwarded to the same person in Boston. Participants received a description of who the letter was intended for and were prompted to pass it along. If they didn’t know the target person, they were asked to hand the letter to someone they thought might be a step in the right direction.
As it turned out, it only took six connections on average, indicating that most people in the US were connected through a chain of about six people. You might know this as “six degrees of separation.” Watts replicated this experiment in 2003 with 60,000 people from 166 countries and 24,000 email chains, shedding new light on the matter but confirming the six degrees theory.
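The “six degrees” finding can be reproduced in miniature. The sketch below builds a small-world network of the kind Watts studies (a Watts–Strogatz-style ring lattice with a few randomly rewired shortcuts — the parameters here are illustrative, not from the book) and measures the average shortest path between people with a breadth-first search. Even with thousands of nodes, a handful of random shortcuts keeps the average chain surprisingly short.

```python
import random
from collections import deque

def small_world(n, k, p, seed=42):
    """Ring lattice of n nodes, each tied to its k nearest neighbors,
    with every edge rewired to a random node with probability p."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(1, k // 2 + 1):
            adj[i].add((i + j) % n)
            adj[(i + j) % n].add(i)
    for i in range(n):
        for j in list(adj[i]):
            if j > i and rng.random() < p:  # rewire this edge to a shortcut
                new = rng.randrange(n)
                if new != i and new not in adj[i]:
                    adj[i].discard(j); adj[j].discard(i)
                    adj[i].add(new); adj[new].add(i)
    return adj

def avg_path_length(adj, samples=50, seed=1):
    """Mean shortest-path length (BFS) from a random sample of start nodes."""
    rng = random.Random(seed)
    total, pairs = 0, 0
    for s in rng.sample(list(adj), samples):
        dist, queue = {s: 0}, deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

network = small_world(n=2000, k=10, p=0.1)
print(avg_path_length(network))  # a small number, despite 2000 nodes
```

The point of the model: almost all ties are local, yet a few random long-range shortcuts are enough to make everyone a short chain away from everyone else.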
Would you have thought it only takes six people to reach anyone in the world? Not exactly common sense, is it? It is exactly these kinds of phenomena Watts explores in this book.
Here are 3 lessons from Everything Is Obvious:
- Common sense doesn’t account for cognitive biases, which makes it unreliable.
- Focus on the present and react, rather than trying to predict the future.
- Build uncommon sense using the scientific method to make better decisions.
Ready to find out why common sense is neither common nor helpful? Let’s rewire your decision-making software!
Lesson 1: If you rely on common sense, you’re not accounting for the cognitive biases at play.
I’m an organ donor. I have this little paper card in my wallet, on which I’ve checked some boxes and signed my name. In the case of my (medically confirmed) death, doctors can take out my liver, lungs or heart and give them to someone else, if it helps them.
This little card makes me an exception, because here in Germany, only about 12% of people are registered organ donors. Our Austrian neighbors, however, boast a staggering 99.9% organ donor rate. How the hell is that even possible?
Simple: In Austria, being an organ donor is the default. You don’t have to get a card and you don’t have to opt in. Only if you opt out are you not an organ donor, not the other way around.
Sticking with the default is one of the biases at play in our decision-making, which makes relying on common sense unreliable. Two other, similar biases are priming and anchoring.
Priming means exposing you to certain stimuli to influence your later decisions. For example, if you read a long text about old people, including words like “slow”, “frail”, “lethargic”, “stagnant” and “sluggish”, you’re likely to walk a lot slower afterwards. It’s like that joke: Get a friend to say “milk” ten times in a row and then ask them what the cow drinks (the answer is water, btw).
Anchoring is similar to the default bias: When I tell you that the “suggested” donation to my charity is $50, you’ll likely settle for an amount close to that, no matter whether you originally intended to donate just $10 or you know the average is a lot higher.
Lesson 2: Instead of trying to predict the future, stay in the present and work with what you’ve got.
As you can see, predicting human behavior is really tough, given there are not just the three above, but dozens of other biases at play. Therefore, relying on common sense is a bad strategy, especially when making important decisions, for example concerning business strategy.
Have you ever considered abandoning predicting altogether and just living, deciding and reacting to the present, based on your own observations?
Zara has. The Spanish clothing company uses an approach called “measure and react.” They look at what their customers are already wearing, create new styles based on that, and then test small samples of new items in various stores to get feedback. This allows them to quickly see what works and what doesn’t, after which they only have to drop the slow sellers and produce more of the bestsellers.
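The measure-and-react loop can be sketched as two phases: test everything in small batches, then spend the real production budget only on what actually sold. The item names, demand numbers, and thresholds below are entirely hypothetical, just to make the logic concrete.

```python
import random

def measure_and_react(demand, test_units=50, budget=1000, seed=0):
    """Hypothetical sketch of a 'measure and react' strategy:
    small test batches first, then restock only the proven sellers."""
    rng = random.Random(seed)
    # Phase 1 (measure): ship a small batch of every item, count real sales.
    sold = {item: sum(rng.random() < p for _ in range(test_units))
            for item, p in demand.items()}
    budget -= test_units * len(demand)
    # Phase 2 (react): drop slow sellers, split the rest of the budget
    # across the items that sold at least half their test batch.
    winners = [item for item, s in sold.items() if s >= test_units / 2]
    per_item = budget // max(len(winners), 1)
    return {item: per_item for item in winners}

# Made-up per-customer purchase probabilities for three candidate items.
demand = {"blue scarf": 0.8, "plaid vest": 0.2, "wool coat": 0.7}
print(measure_and_react(demand))
```

The design choice worth noting: no one predicts in advance which item will win; the small test batches generate the data, and the budget simply follows it.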
The next best source of feedback after customers is whoever else will be affected by your changes. For example, if you want to revamp your hiring process, guess who you should ask? The people who work in HR! They already know what’s working and what isn’t and can tell you what needs to change. This is called local knowledge, and you should use it whenever you can.
Lesson 3: Make better decisions by building uncommon sense, which relies on the scientific method.
If the two methods above kind of reminded you of lab experiments, then you’re on the right track. Both of them rely on something called the scientific method. Here’s how it works: you create a hypothesis (this blue scarf will sell well), then collect data to back up or disprove that hypothesis (track sales for a month in November) and finally adjust your hypothesis or draw general conclusions (blue scarves sell well in November).
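The hypothesis–data–conclusion loop from the scarf example can be written down directly. The sales figures below are invented, and the effect-size threshold is an illustrative convention (Cohen’s d above 0.8 is commonly read as a large effect), not something prescribed by the book.

```python
from statistics import mean, stdev

# 1. Hypothesis: blue scarves sell better in November than in a baseline month.
november = [14, 17, 15, 19, 16, 18, 20, 15, 17, 16]  # hypothetical units/day
baseline = [9, 11, 10, 8, 12, 10, 9, 11, 10, 9]

# 2. Collect data and compute a crude effect size (Cohen's d:
#    difference of means divided by the pooled standard deviation).
pooled_sd = ((stdev(november) ** 2 + stdev(baseline) ** 2) / 2) ** 0.5
effect = (mean(november) - mean(baseline)) / pooled_sd

# 3. Conclude: a large effect supports the hypothesis; otherwise revise it.
print("support" if effect > 0.8 else "revise")
```

The same three steps apply whatever you are testing: state the claim before looking at the numbers, let the data speak, then keep or revise the hypothesis.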
Especially when people are involved, common sense will always tell you that you “know” the answer. Most of the time, your common sense is wrong. This scientific approach goes against the grain, it’s very much uncommon sense, but it’ll get you far better results in the long run.
And thanks to the internet, you now have access to more data than ever. Companies like Facebook, Palantir, and Google work with vast amounts of it, and much of it is within anyone’s reach – all you have to do is use it!
My personal take-aways
This book is definitely a hidden champion. I believe cognitive biases are one of the most important topics to learn about, and combined with a more scientific, yet creative approach to making decisions, that knowledge is especially powerful. Everything Is Obvious teaches you about both of those things, so I highly recommend you check it out.