Book Review: The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution

Reviewed by Brian Christian

By Walter Isaacson

“It’s in Apple’s DNA that technology alone is not enough – that it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our heart sing.”
–Steve Jobs

If you have not yet read Walter Isaacson’s previous book, “Steve Jobs,” I highly recommend it. It combines a fascinating subject (Jobs), a compelling storyline (Apple’s birth, death and rebirth) and a journalist’s rare insight into the mechanics of how business works in Silicon Valley — and more broadly in America. I cannot make the same confident recommendation for “The Innovators.”

Isaacson’s latest work reads more like a history text, though not without a healthy dose of interesting anecdotes. There are certainly readers who will be thrilled – probably 20th-century American history buffs and digital industry professionals. But for students of innovation like me, it is a bit of a slog for a few important but not new lessons more easily gained elsewhere. Admittedly, it may just be a problem of (my) heightened expectations based on his previous book.

This book is a chronological telling of the digital revolution, from the conception of an automated calculator in early 19th-century England to the emergence of Google in the early 21st century. It tells the story of the hundreds of brilliant scientists and entrepreneurs who created each of the small and large building blocks along the way – including semiconductors, transistors, microprocessors, mainframes, personal computers, software, the Internet and search engines. Isaacson does a wonderful job of telling the inventors’ stories, their amazing triumphs and their foibles and failures. He was on a mission to highlight the real contributors, individually and collectively, not necessarily the few who got (or took) the credit. I trust that he got it right.

Isaacson has woven four principal themes into his telling of the history of the digital revolution, in approximate order of the author’s emphasis:

  1. Technology-driven innovation on the order of the digital revolution results more often from the collaborative and complementary efforts of hundreds and thousands of smart and dedicated individuals making incremental advances than from the lone inventor with the earth-shaking epiphany. Both are important but, too often, history tends to exaggerate the lone inventor’s role, leaving humanity with an under-appreciation of the importance of the collaborative mindset and process.
  2. The most transformative visionaries and innovators in the digital revolution were strong believers in the importance of both the sciences and the arts. Isaacson makes a strong argument that scientists who reject the importance of the arts and humanities do so at their own expense. He also suggests that it is this ability to meld the arts and sciences that will separate humans from artificial intelligence. The Jobs quotation at the top of this review highlights this theme.
  3. Women had a critical and widely under-reported role in the early days of the development of computers and operating systems. In fact, their role may have been greater in the first half of the 20th century than in the second half. His explanation for this is that women were well accepted into the field of mathematics in earlier decades, and math was widely regarded as an essential skill in the early development of computers and operating systems. In later decades, both trends reversed.
  4. From its very earliest days, digital innovation has advanced rapidly due to the predominant ethic among the digital scientific community — that intellectual property should be shared for the common good rather than protected for economic gain. Isaacson consistently calls out those who leaned toward the closed/profit model vs. the open/common good model. He also offers an interesting distinction – that the capital-intensive parts of the digital ecosystem, such as chip-making, justifiably lean toward the closed model to ensure capital investors receive a fair return while the less capital-intensive parts of the ecosystem, such as software development, should lean toward the open model for the benefit of humanity.

In summary, it’s a prodigious work and perhaps a needed chronicle of the most important scientific revolution of the past century. It’s not exactly my cup of tea but, if you are a 20th-century history buff or a digital industry professional, it may very well be yours.