The Innovators walks you through the history of the digital revolution, showing how it was a combined effort of many creative minds over decades, that enabled us to go from huge, clunky machines to the fast, globally connected devices in your pocket today.
When a world-class biographer goes on a stroll through history, good things are bound to happen. Walter Isaacson has written stellar biographies of Steve Jobs, Benjamin Franklin and Albert Einstein. But in 2014, he decided to tackle a slightly different project. He wanted to give a snapshot of the history of our digital revolution, and so he went all the way back to the beginning, picking up threads as he found them and weaving them together into this beautiful book.
Above all, The Innovators will show you that true innovation is rarely a single individual's effort; it's built on collaboration, integration and incremental improvements. For hundreds of years, people have put their blood, sweat and tears into their work, so you can read this summary today.
Here are 3 lessons to highlight the inflection points of computer technology:
- The first programmer was a woman, and her program was a result of math and poetry.
- Hippies and hackers made the computer personal.
- The internet was a combined effort by universities, the military and private companies.
Let’s learn about some of the greatest minds in technology!
Lesson 1: The first programmer was 100 years ahead of her time, and she was a poet and mathematician.
We all know computers are somehow rooted in mathematics, but few people know just how much. In the early 1800s, Ada Lovelace, daughter of Lord Byron, furiously studied both mathematics and art, as she had a burning passion for one and felt the other disciplined her mind.
However, her creative genius only really came to fruition when she started attending English polymath Charles Babbage's weekly salons about science and technology, where great minds came together. The star of the show was Babbage's Difference Engine, a mechanical machine that could calculate polynomial functions (but would take forever to build).
Lovelace later used her sense for poetry and her mathematical ability to expand upon an improved version of the Difference Engine, the Analytical Engine. This machine would be able to process different problems and even switch between tasks on its own. When translating an article describing the machine, Lovelace added her own notes, which ended up being twice as long, far more valuable, and which contained the first computer program.
Essentially, she described computers as we know them today, versatile general-purpose machines, all the way back in 1843.
Lesson 2: The personal computer was invented by hackers and hippies.
You may be aware that in the early days of software, a fierce competition between Apple and Microsoft took place. Yet both of them borrowed the graphical user interface from Xerox, so really, the first operating systems were a "combined effort." And not just software: hardware was a group effort too.
The humble beginnings of the personal computer can be traced back to the Homebrew Computer Club, where the hippies who took devices apart in the 60s to understand them became the hackers of the 70s, who tried to build them. From 1975 onwards, the club met every two weeks, and it's where tech nerds Steve Jobs and Steve Wozniak first learned about the Altair 8800, the first computer hobbyists could use.
Discussion about it at the club was lively, and once orders for the $397 kit went through the roof after it was featured on the cover of Popular Electronics magazine, Jobs and Woz knew they were on to something with Apple as well.
Lesson 3: Universities, the military and private companies came together to create the web.
Some people are critical connections in any given process, bearing major responsibility for the eventual outcome. Vannevar Bush is one of those people. As dean of MIT's School of Engineering, top military science advisor and co-founder of the technology company Raytheon, he brought together the three crucial parties behind the internet.
The National Science Foundation, which brings together experts from various industries, backgrounds and companies for basic research, is a direct result of his plea to the government. There, J.C.R. Licklider shared his vision of decentralized networks and real-time human-to-computer communication, which Bob Taylor shaped into a collaborative research network for universities, and which Larry Roberts helped build.
In 1969, all of their ideas merged into ARPANET, a network of military and academic institutions, which from 1973 onwards connected more and more facilities, thanks to the development of the Internet Protocol. Thus, the foundation of the internet as we know it rests on many people's shoulders.
As you can see, progress doesn't happen overnight or behind closed doors. It's only when people come together to share, collaborate, create and critique that ideas eventually amount to something that can change the world.
My personal take-aways
I think the big, big idea that hovers above all the lessons from this book is that if you're not willing to share your ideas, your thoughts, your work with other people, none of it will ever be of any significance. Whether you pursue something solo or as a team, there's always a time to come out and say: "Hey world, I made this, what do you think?" Unless you do that, you can't truly call yourself an innovator. And that's as true for innovation in technology as it is for art, education, business, politics, social sciences, or any other field. Walter Isaacson is a great writer, and I highly recommend his work.
Buy this book: https://amzn.to/2Eg2Wnu