SuperFreakonomics reveals how you can find non-obvious solutions to tricky problems by focusing on raw, hard data and thinking like an economist, which will get you closer to the truth than most people ever get.
SuperFreakonomics is the follow-up book to the insanely popular Freakonomics, published in 2009, by Steven Levitt and Stephen J. Dubner. Like all of their co-authored books, it takes an economic approach to what’s going on in the real world, which means using statistics and hard data to find out what really drives human behavior.
This book tackles interesting topics like prostitution, terrorism and global warming. However, the lessons I want to share with you are on a more general level, because I think that will help you the most to really embrace the ideas of this book.
Here are 3 lessons to help you see the world as clearly as possible:
- Incentives rarely work out as planned.
- You can find simple solutions to tricky problems by zooming out.
- There’s no such thing as too much data. Always collect as much as you can.
Do you think predicting human behavior is tough? Here’s a filter that’ll help you find the truth more often than ever!
Lesson 1: Even the best incentives don’t work out as planned and always come with side effects.
Even though we can’t see inside people’s heads to find out exactly what makes them tick, we’re still driven by the power of incentives all the time. Governments, companies, schools, even just other people constantly try to get us to do things by dangling certain rewards in front of us.
The idea is simple enough: You promise someone reward B for performing action A and hope that everyone in your target group shows the desired behavior.
However, there’s a hidden force at play here: the law of unintended consequences. Dubner and Levitt use this phrase to describe the unplanned behaviors that emerge after an incentive is introduced.
For example, in Germany, the government keeps trying to get people to produce less waste with fun ideas like picking up trash only once every three months, downsizing trash cans or introducing volume-based fees. So far, all of these ideas have backfired horribly, leading only to creative ideas on how to avoid the new systems, for example by dumping trash in the woods or flushing food down the toilet.
But even the incentives that do work won’t do so for everyone, and all of them will have some side effects. You might get your kids to do their dishes if you promise them $1 for every cleaned plate, but that might also lead them to clean them even when they aren’t dirty or expect money for other household chores.
Don’t forget: all incentives have intended and unintended consequences!
Lesson 2: Simple solutions to tricky problems are often hidden on a more generic level.
What do scientists do if they can’t find the solution to a problem? They collect data! Data always helps you find a solution, but it might not do so in the way that you think.
For especially difficult problems, the solution often lies one level above the realm of the issue itself. A single data point, an extreme value or outlier, or the data points you thought you’d collect, but didn’t, often tell a much more revealing story than the “normal” data.
This is related to a phenomenon called omitted variable bias, which means you’ve forgotten to even include one of the most important factors in your analysis. For example, there’s a correlation between a country’s chocolate consumption and its number of Nobel prize laureates, but that doesn’t mean eating chocolate makes you more likely to win a Nobel prize. It just so happens that wealthier countries with a higher level of education also spend more money on luxury foods.
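You can see omitted variable bias in action with a small simulation. The sketch below is hypothetical and not from the book: a hidden variable (standing in for national wealth) drives both "chocolate consumption" and "Nobel counts", so the two correlate strongly even though neither causes the other. Once you control for the hidden variable, the correlation collapses.

```python
import random

random.seed(0)

# Hypothetical data: "wealth" is the omitted variable that drives both
# chocolate consumption and Nobel laureate counts. Neither causes the other.
n = 1000
wealth = [random.gauss(0, 1) for _ in range(n)]
chocolate = [w + random.gauss(0, 0.5) for w in wealth]
nobels = [w + random.gauss(0, 0.5) for w in wealth]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def residuals(ys, xs):
    """What's left of ys after removing a linear fit on xs."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return [y - (my + slope * (x - mx)) for x, y in zip(xs, ys)]

# Raw correlation looks impressively strong ...
r_raw = pearson(chocolate, nobels)

# ... but once wealth is controlled for (correlate the residuals after
# regressing each variable on wealth), it shrinks toward zero.
r_partial = pearson(residuals(chocolate, wealth), residuals(nobels, wealth))

print(f"raw r = {r_raw:.2f}, partial r (controlling for wealth) = {r_partial:.2f}")
```

The point of the sketch: the strong raw correlation is entirely an artifact of the omitted variable, which is exactly why a naive reading of "chocolate predicts Nobel prizes" falls apart.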
Similarly, it’s sometimes easier to find solutions that prevent a problem from occurring in the first place, rather than to solve it after it has appeared.
For example, Ignaz Semmelweis discovered how to reduce maternal mortality after childbirth by comparing his hospital, which had an autopsy department, with another one, which didn’t. Even before germ theory had been developed, he figured something must happen during autopsies that infects young mothers, and thus advised doctors to wash their hands – which worked like a charm.
The car seatbelt, developed in the 1950s, followed the same logic: why not keep the head from being flung around in the first place, rather than trying to make it land softly upon collision?
So when you face a complex problem, zoom out, take a step back and look outside the realm of standard data.
Lesson 3: You can never have enough data, so always collect as much as you can.
Lastly, since the most valuable pieces of data are the ones you rarely observe, plus those you notice are missing from the normal picture, you need a lot of data to find them.
The more data you have, the more counter-intuitive findings will emerge, so always collect as much data as you can.
For example, with Four Minute Books I’ve built in plenty of places where people can leave feedback, such as the book suggestion form, the prompt to reply to my very first email, plus integrated surveys into email sequences, like Time 2 Read.
That way, I’m always collecting more data on autopilot, which I can then combine with specific questions at certain points in time, for example the survey I sent out before creating Time 2 Read in the first place.
Data never hurts; quite the contrary. So make sure you’re always getting more of it!
My personal take-aways
Statistics is a topic that really speaks to me in 2016. There’s something about reducing issues to numbers that forces you to think rationally and look at what’s actually going on, not guess based on your own, muddled feelings and intuition. In a hyper-sensitive, trash-news driven world, whoever can do that the best wins, and I really want you and I to win – so I’m recommending this book to you.