The stories we tell ourselves about genius and geniuses are often stories about history's great men. But Walter Isaacson's latest project is different.

A biographer by trade, Isaacson wrote the hugely successful 2011 book on Steve Jobs, following biographies of Einstein, Benjamin Franklin, and Henry Kissinger. (The fact that Jobs approached Isaacson to pen his authorized biography says something about the Apple founder's self-importance, too.)

The Innovators doesn't focus on any one individual, but offers a sweeping history of the digital revolution and the curious partnerships and pulsing rivalries that animate it.


I spoke with Isaacson about his new book and the social issues that swirl around microchips and networks.


Q: The Innovators is a challenge to the Great Man Inventor narrative, and centers instead on innovative teams. How did your experience writing about Einstein and Franklin and those singular personalities change your ideas on how innovative groups work?

WI: When I looked at Steve Jobs, who was a singular personality, I realized that his vision would have been just a hallucination if he had not been able to create a team around him who knew how to execute on that vision. And it reinforced to me the fact that even though he was a singular and rather rough-edged individual, his great mastery was at putting together a team of collaborators who were fanatically loyal to him and to his vision and worked together creatively. And so, even with a visionary like Steve, it reminded me of the importance of collaboration and teamwork.

Q: Several times in the book you're careful to state that technologies associated with computers and networks don't have inherent values or logics to them. But the myth of an "open internet" (one that's transparent, egalitarian, efficient, and liberating) continues to fuel debates over technology even as our personal data is collected, sorted, ranked, and sold by web companies and agents of the state. Talk to me about the problems with thinking about networked technology as beyond history and outside of politics.

WI: I think that any system that promotes the free flow of information and is open tends to be better in the long run at empowering individual aspirations and creativity. The internet is not perfect. Nor is any other system that's been invented. But it's certainly better than the cable TV model or other ways of distributing information.

Q: In the book we learn about the history of Intel and its collaborative, free-wheeling business culture as a reaction to East Coast hierarchy. How are these new team organizations—and the sprawling campuses of Google and Facebook that house them—better or worse than old-fashioned structured teams?

WI: I think that a hierarchical approach in which ideas are handed down doesn't work very well in the digital age. The internet was designed with peer-to-peer sharing ingrained into its DNA. The people who designed it did so collaboratively, without a hierarchical structure, and its distributed, packet-switched design makes it perfect for collaborative, peer-to-peer shared effort.

I think that when you try a hierarchical structure, you can see throughout the digital age people reacting badly to it, whether it was Bob Noyce reacting against the East Coast executives at Fairchild or the way that Google and others created a much more open environment for ideas.

Q: Speaking of Intel, Re/code and others have reported that the company has (with great cowardice, I think) pulled ads from the website Gamasutra over complaints from customers who took issue with an article there, written by editor-at-large Leigh Alexander.

Alexander's article was part of broader criticism in video game journalism that focuses on the rampant, vicious, misogynistic behavior surrounding video games and public discourse. Why do you think this kind of abusive anti-feminism continues to survive?

WI: I think we can't try to control the free flow of information and whether it's on the telephone or the internet people are going to say dumb things, and perhaps even dangerous things, but I don't think we should blame the technology.

Q: In "Hate Crimes in Cyberspace," University of Maryland law professor Danielle Citron advocates for a robust civil rights and criminal law agenda to better protect women, minorities, and everyone against cyberstalking, sexual harassment, and the nonconsensual publication of sexually explicit media. What are your thoughts on using legal and political solutions to tackle these issues and to curb bad actors and bad behavior?

WI: I'm against bad behavior, but I'm wary of too many political solutions impinging upon the internet.

Q: Silicon Valley seems locked these days into making incremental improvements on consumer products. Where are the next earth-shattering inventions like the integrated circuit, microprocessor, and personal computer? And will you pre-order your Apple Watch?

WI: Yes. I think the Apple Watch is extraordinarily cool. Because it gets to the first part of your question, which is, "Where is this all heading?" I think the narrative of my book is that instead of pursuing the mirage of artificial intelligence, in which machines will think without us, what's been particularly successful and will be in the future is making even more intimate connections between ourselves and our machines—having them much more embedded into our lives.

Certainly when I can just tap on my watch and order up an Uber car, I think that's a leap of innovation that we would have found startling a decade ago. I think that the great innovation has been in making our technology more personal and more social.

Q: Alan Turing is a fascinating figure. But in your discussion of AI you point to the limits of viewing computing as akin to human thinking. You're more sympathetic to augmented intelligence, of using powerful machines as collaborators to human creativity. Why is that?

WI: When you look at the history of the past 50 years, the leaps have come from forging more intimate connections between humans and machines, rather than creating machines that don't benefit from the connection of human creativity. So the concept that we're about to reach a singularity where machines will be able to do things without us doesn't seem to follow the data points we have of the past 50 years.

Q: Critics say the sharing economy is a euphemism for low-grade contract work. That it's a mechanism to evade regulation and gloss over the real need for affordable housing, public transportation and basic incomes. You're from New Orleans, a place that's familiar with inequality and neglected public infrastructure. Do you see in the app and sharing economies innovation or a retreat from public projects?

WI: Everything from Uber to Airbnb represents great opportunities, as does everything that has sprung up with the app economy. I'm a little bit dubious of trying to regulate these things away in pursuit of some economic theory. I do believe that economic inequality and a lack of economic opportunity for much of our society is the political, economic, and moral issue of our time.

I wouldn't blame the app economy or the sharing economy for either causing or solving this problem. But I think if we dedicate ourselves as a society to making sure that digital tools are used to produce economic opportunity and shared prosperity for everybody, that would be great for the whole economy, and more importantly it would be the moral thing to do.

Q: You mention as a telling milestone that in 2011 Apple and Google spent more on lawsuits and patents than they did on R&D for new products. And in your discussion of Bill Gates, you flesh out the capitalistic impulse of sheltering intellectual property, attracting investments, and of protecting what's yours. Has this proprietary approach to computers and software crowded out a more cooperative, academic, public-domain model?

WI: I think that the open-source approach coexists very nicely with the more proprietary approach, and it's good to have them both in tension with one another. We are seeing that with Linux versus Microsoft's operating system, for example. We see that in iOS versus Android. It's good to have different models of creating things.

Back in the 1970s, when Steve Wozniak saw the specs for a microprocessor, he made a wonderful circuit board with a monitor and keyboard, and he gave it away to all the members of the Homebrew Computer Club. But his friend from down the street, Steve Jobs, said: No, let's go to my family's garage and we can make these things and sell them. And that's how Apple was born. So I think both instincts can be a spur to creativity, and the fact that they both exist is better than relying on either approach on its own.

Q: What should we take away from the history of the digital age?

WI: America is still the most fertile ground for innovation because we have rebellious and curious people. We have an entrepreneurial spirit, a tolerance for risk and failure. However, there are some things we should pay attention to. One is making sure that everybody gets included in this revolution, including people born in less privileged zip codes and including women.

It's very important that we use our technology to improve the educational opportunities for all, rather than focusing only on apps that, you know, crowdsource the ratings of restaurants.

That's not something government can force. I think it's what we've done as a society ever since the days of Benjamin Franklin. To use his words: "How can we do well by doing good?" I think the next phase of the digital revolution can be more inclusive. I hope.
