Futurists R.U. Sirius and Jay Cornell have published a witty, snide, and incredibly informative new book titled Transcendence: The Disinformation Encyclopedia of Transhumanism and the Singularity. We spoke to the authors to learn more about the project and their own personal visions of the future.
Many of you will remember R.U. Sirius — aka Ken Goffman — from his co-publishing work at MONDO 2000, the first popular digital culture magazine (and a kind of precursor to io9). He's a "digital iconoclast" who has written about technology and culture for Wired, BoingBoing, Rolling Stone, and many other publications.
Jay Cornell is a writer, editor, and web developer. He's also the former managing editor of h+ magazine, and the former associate publisher of Gnosis magazine. He's quite active in the futurist community, currently serving as a member of the Board of Advisors of the Lifeboat Foundation, a nonprofit organization dedicated to defending humanity from existential risks.
Together, they've produced a book that offers a multilayered look at the accelerating advances in a number of fields tied to both transhumanism (i.e. human enhancement) and the Singularity, including AI, cognitive science, genomics, nanotechnology, cryonics, space exploration, synthetic biology, and robotics. Importantly, the book also considers the influence of historical predecessors and personalities behind these ideas, from Timothy Leary to Ray Kurzweil.
Here's my recent conversation with them.
io9: Tell us about the book. Why did you choose the encyclopedia-like format?
R.U.: The aim was to create an irreverent but informative book about these ideas and try to get them to a diverse audience. It serves as an introduction to a number of the sciences, technologies, memes, organizations, and a few of the people who make up the cultures of transhumanism and the Singularity.
I trust that this will also amuse, and maybe upset, a lot of already extant transhumanist sorts and that they'll also find stuff in there they didn't know or hadn't thought about in the peculiar ways that I or Jay think about them.
I did an A-Z "user's guide" to cyberpunk/cyberculture in 1993 under the auspices of Mondo 2000 magazine (with help from Rudy Rucker and Queen Mu). That was pretty successful at bringing this then-mysterious new digital terrain into popular culture. I think the timing is right to do something similar for (or to) transhumanism. It's a good way to organize information and thoughts. It worked pretty well for Voltaire.
Jay: In our short-attention-span digital culture of Tweets and Vines and listicles, I think a book structured this way is going to be more accessible for many people. It also helped make collaboration easier.
io9: Transhumanism as a concept has been around for many years, yet for many it remains a very controversial and niche idea. Why do you think that is?
Jay: For many people, it's just too far out. They think about their jobs and their health and their family, not about whether they'll be able to upload their brain to a computer in 30 years. If they do think beyond the near and personal future, they're probably worrying about climate change or resource depletion or government debt, and not thinking about promising technological developments.
Another factor is that changing human beings in "unnatural" ways is, on some level, inherently disturbing. Even people who are OK with the transgendered might feel uncomfortable around someone with facial tattoos and horns, or someone whose brain is hooked directly to the internet, so that they're recording you and Googling you while they're talking with you. This orientation toward the "natural" is also characteristic of many religions. For many Christians and members of other Abrahamic faiths, transhumanism is a form of idolatry or "scientism," and conquering death and transforming humans amounts to "playing God."
R.U.: Honestly, as with any -ism, I think the believers tend to chase people away from the belief.
Of all the people I know who claim to hate transhumanism, I think they're all transhumanists in the sense that they tend to support technological enhancements and radical tech developments and are generally fairly tech progressive. If I may quote myself, I threw this to Charlie Stross, a frequent critic of "transhumanism," in a conversation in h+ magazine in 2012: "You said 'I'd be very happy with cures for senescence, cardiovascular disease, cancer, and the other nasty failure modes to which we are prone, with limb regeneration, and tissue engineering and unlimited life prolongation.' It seems to me that this still puts you in the transhumanist camp."
Transhumanism is a good way to take isolated radical tech developments and bundle them together so that we can see where we might be headed; what the possibilities are, both wonderful and terrible. Some of the people who take it too seriously can be kind of scary, but in a marvelously science factional sort of way. Science fiction is pretty much dining out on nothing but transhuman concepts these days.
io9: What's your take on contemporary futurist culture?
Jay: I find it fascinating how concepts of the future shift over time. There are always positive and negative views, but usually one is dominant. I think the 19th and 20th centuries were unusual in having many periods that were mostly techno-optimist. Today it feels like the largely optimistic feelings of the '90s have been superseded by pessimism caused by 9/11 and the various unpleasant political and social developments that followed. We hope this book helps people see the optimistic side of the future again.
R.U.: In the early '90s, we had a cyberculture. We were still informing and convincing people that our world was a techno-world. That's no longer necessary. So in a sense, futurist culture is presentist culture and it's everybody.
Regarding organized transhumanist futurist culture, there are the longtime organizations like Humanity+ (formerly the World Transhumanist Association), and there's IEET. James Hughes of IEET and some folks recently released a "technoprogressive declaration," which I signed.
There's Zero State, which sort of leans towards left anarchist ideas, but emphatically doesn't advocate the complete abolition of the state.
There are mainstream institutions and companies like Google, DARPA, and Singularity University (which shares physical space with NASA), and people like Peter Diamandis and Ray Kurzweil who mix with the powerful and influential.
There are outsiders and provocateurs like Grinders and Rachel Haywire and her various projects: her Extreme Futurist Festival kept the cyberpunk vibe going and I hope it continues.
There are Aubrey de Grey's various longevity research and advocacy groups like SENS.
There's a lot of interest and support among libertarian sorts — Reason magazine, Peter Thiel, Glenn Reynolds.
Max More, who is sort of the originator of transhumanism as a movement, is still an important voice. I'd say the culture is pretty diverse.
There's Zoltan Istvan's Transhumanist Party and his 2016 presidential bid. His philosophy of Teleological Egocentric Functionalism and his Three Laws of Transhumanism are at the absolute edge of weird and scary, and a fair number of people appear to be eating it up unquestioningly. His party platform is OK, though, and he's a hella nice guy. He's definitely worked his way up to a high profile. I may just endorse him, out of perversity.
Jason Silva is another mediagenic personality who is into a very upbeat, hedonic and friendly sort of Learyesque transhumanism. And David Pearce is also on the psychedelic transhumanism tip with his ideas for a "hedonistic imperative" and his abolitionist project to end all sentient suffering.
Oh my, the list could go on. I feel like I sort of sit at the center of it in this book and try to be both court jester and diplomat at the same time. It seems to be working, for the most part.
io9: Okay, let's talk about the benefits and risks of human enhancement. Just what, exactly, are the problems that transhumanists are trying to solve?
Jay: You could say that transhumanists would like to solve everything. Humans have always tried to solve problems like hunger, illness, ignorance, poverty, and so on. Transhumanists would go further: curing the sick is fine, but how about making the healthy even healthier? How about conquering death itself? So instead of merely trying to bring everyone up to "normal," transhumanism hopes to improve people in ways previously considered impossible.
There are risks, of course. Some things won't work, or will have negative side effects. One risk is increased social and economic inequality. As with most advancements, the first beneficiaries are the rich and the connected. Nobody wants a mentally enhanced, immortal overclass of rich people and politicians whose enhanced children get all the best jobs and win all the awards. But it's worth remembering that when rich people spend extra in order to be first, it helps make it cheaper (and often safer) for the rest of us later on. The rich spent hundreds or thousands of dollars on the first digital watches 40 years ago, and society survived.
The risks I'm most concerned about aren't about enhancement. As genetic engineering becomes cheaper and easier to do, it becomes more likely that some terrorist or apocalyptic nut will try to create super-Ebola in their kitchen.
But besides benefits and risks, there's the category of disappointments. Many things never live up to the hopes of their proponents. Some people thought that the telegraph, dynamite, airplanes, and even motion pictures would end war. Even an easy, inexpensive method of intelligence increase would not be a panacea: high intelligence often just enables people to make more complicated mistakes.
R.U.: I think the greatest benefit could be that we get more time away from hustling around for competitive advantage and livelihood, and the greatest risk could be that we'll get no time away from it at all. Right now, the culture is about adapting to the speed of the tech. The culture could become about letting the tech be its thing, having it around and in us, but giving more of our attention to ourselves and each other and other modes of creative living and expression. I'm in the peculiar position of not being so in love with a lot of aspects of the culture of technology as it is right now. I see it as a necessary transition to a place where the tech is mainly invisible.
io9: How do you respond to people who describe the Singularity as the "Rapture of the Nerds?"
R.U.: I think it's funny! But I'm also agnostic when it comes to the Singularity. I think that people who are sure that it will happen and those who are sure that it won't probably have similar character structures but different temperaments.
Jay: It's a great quip, because it's got more than a little truth in it. People enthused about any secular ideology will often think and behave in ways similar to religious zealots. I think one of the strengths of our book is that we're not fanatical believers or cynical naysayers, so it's a sympathetic look at these topics, but we're not afraid to raise an eyebrow when we think it's warranted.
io9: A number of prominent thinkers have recently spoken about artificial superintelligence, calling it an existential threat. What's your take on the sudden interest?
Jay: It is interesting how a number of people all spoke up about this at the same time, isn't it? Did word go out on some secret email list? Because as far as I know, while artificial general intelligence has been making progress for many years, it hasn't made any huge leaps recently. Like controlled fusion, it always seems to be 20 years in the future. Maybe it's because so many people have iPhones and using Siri gets them thinking about this.
R.U.: Maybe it's because Ray was hired by Google. I think it's mainly people who are pretty well off that are sounding off about this. Most people have more immediate existential threats, or at least existential burdens, to worry about.
io9: Of the many technologies described in your book, which are you personally most excited about? Which ones are you most skeptical about in terms of feasibility? And which areas should scientists and technologists focus their efforts on?
Jay: There's a graphene membrane called Perforene which Lockheed Martin is developing. It could make water desalinization and purification much cheaper, which would be an incredible boon.
I'm skeptical about mind uploading. The idea is that we'll be able to transfer our minds to a computer, which would vastly speed up our minds and possibly achieve immortality. But the technological, psychological, and even philosophical problems are immense. Even if it can be done, would the mind in the computer stay sane, disconnected from all normal bodily senses, and thinking millions of times faster? And would that Digital You really be you, or just a copy of you? To an outside observer, Digital You might talk like you, but that won't be much comfort to your inside observer when old-fashioned Meat You dies.
R.U.: As a psychedelic veteran, I like neurotech. It's a complex young science, but the awareness that one can feel really good while being as smart and functional as one can be, depending on one's neurochemistry, has been a lifelong enticement. (I suppose I'm talking about stimulants as much as psychedelics there.) That this could be done without the sloppiness and the aftermath of chemicals or even plants seems like a great potential boon.
I think nanotechnology is a comer that has been dismissed by many in the transhumanist world recently. It became hip to know better than to hope for big nanotech breakthroughs. Ending material scarcity while improving the environment should be the first principle of all radical technologists. Death is a scarcity of years, but beating it runs an important second place to a well-distributed end to the extremes of suffering under our current mortal condition. Of course, the nature of human society is that we have to do diverse things. I'm all for the ending death projects too.
I'm a bit skeptical that AGI will exceed human intelligence any time soon (this century) in a way that we can talk about meaningfully. My sense is that it will wind up being so different from human intelligence that it requires a different discourse.
I'm more skeptical about human politics and social adaptation and our skill at snatching defeat out of the jaws of victory. And I think that nasty weather (literally), conflict and bad memes are likely to seize the day during this decade and the next one and never let go. How's that for a deflating ending?
Get your copy of Transcendence: The Disinformation Encyclopedia of Transhumanism and the Singularity here.