Remote Controlled Mice Today, Remote Controlled Humans Tomorrow

You may not have to wait until the year 2154 for your own remote-controlled body. Mark Stephen Meadows discusses wetware technology and how the science fiction of Avatar is quickly becoming science fact.

During a radio interview in December of 2009, I was asked, "Do you think the vision of Avatar is something we'll see in the future?" I paused for a second and made my Jake Sully wish list. What do we need to make Avatar happen, roughly?
First is data transfer; you have to be able to drive the system at a distance. Myoelectric systems and BMIs can work locally, and we've also seen that they can work at a distance. So, remote control: we've seen the U.S. Army driving UAVs this way. Check.
Second is output. You have to think to affect the interface. You'll be lying down in a tank, rigged up to some kind of myoelectric or brain-machine interface (or a combination of the two). We've seen Cyberdyne and Honda both driving robots this way. Check.
Third is input. Pumping the arms and legs is one thing, but the bigger trick is moving sensory data into your head. Moving data into your little vampire-coffin isn't the problem, but getting visual data into your eye could be. We've learned a bit from the retinal implants and cochlear implants functioning today, so it seems that visual or auditory information could be converted from analog to digital, or vice versa, and sent into and out of the brain. Whether we end up having to break the skin to get it there is another question, but with that magic 144 years of future stirred in, let's call it a check. So those are the outlines of a remote neuroprosthetic.
Fourth is the system: the avatar itself.

***

I have to pause for a moment and tell you about one of the weirdest things I've come across in my travels, which happens to be exactly what is needed for item number four. It is called a hybrot.
In the early 1990s, a number of scientists managed to establish a dialogue between a computer simulation and a wad of neurons in a Petri dish. The technique is called "dynamic clamping," and it works by taking a cluster of brain cells and soaking them in chemicals to tease them apart. Then, by chemically welding them to an electrical circuit board, you can measure the membrane potential of one neuron and inject the output (the current computed from that potential) into another. Hijacking the current, you can then interface it with a standard computer. It's a simple idea that presents a pretty reductionist view of the brain as a linking of inputs and outputs. The dynamic-clamp method can be extended from the cellular level to the systems level, artificially monitoring and constraining the relationship between the neural system, the computer, and the behavior. It's wetware hacking.
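To make that input-output framing concrete, here is a minimal sketch, in Python, of the feedback loop at the heart of a dynamic clamp: measure the cell's voltage, compute the current a simulated conductance would pass at that voltage, and inject it back. Everything here is illustrative; read_membrane_potential and inject_current are hypothetical stand-ins for a real amplifier's API, not any particular lab's software.

```python
import time

# Minimal dynamic-clamp loop sketch (illustrative only). The two I/O
# functions below are hypothetical placeholders for whatever amplifier
# and acquisition API a real rig exposes.

E_REV = 0.0    # reversal potential of the simulated conductance (mV)
G_SYN = 10.0   # simulated synaptic conductance (nS)
DT = 0.0001    # loop period (s); real rigs cycle at 10-50 kHz

def read_membrane_potential():
    """Hypothetical: sample the neuron's membrane potential (mV)."""
    return -65.0  # placeholder value for illustration

def inject_current(i_pa):
    """Hypothetical: command the amplifier to inject i_pa picoamps."""
    print(f"inject {i_pa:.1f} pA")

def dynamic_clamp_step():
    # 1. Measure the cell's momentary membrane potential.
    v = read_membrane_potential()
    # 2. Compute the current a simulated conductance would pass at that
    #    voltage: I = g * (V - E_rev). With g in nS and V in mV, I is in pA.
    i = G_SYN * (v - E_REV)
    # 3. Inject that current back into the cell, closing the loop. The
    #    neuron behaves as if the artificial conductance were really there.
    inject_current(i)

for _ in range(5):  # a real loop runs continuously at DT intervals
    dynamic_clamp_step()
    time.sleep(DT)
```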
Dr. Ben Whalley from the University of Reading in the UK has created a hybrot that splices rat-brain neurons to a small robot, which navigates via sonar. Dr. Whalley is teaching the system to steer itself so that it avoids obstacles and walls in its little home. Or box. Or maze. Or wherever a rat-brained hybrot lives. The blob of about 300,000 neurons was plucked from the neural cortex of a rat fetus and chemically treated to dissolve the connections between the individual neurons. These were then re-spliced so that sensory input from the sonar would allow the system to learn, adapt, and eventually recognize its surroundings.
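That sense-stimulate-record-act cycle can be sketched in code. The Python below shows only the shape of the loop, under loudly labeled assumptions: read_sonar, stimulate_culture, record_response, and drive_wheels are all hypothetical placeholders, and nothing here reflects the Reading group's actual hardware or software.

```python
# Illustrative closed-loop hybrot controller. Every interface here is a
# hypothetical stand-in for a sonar sensor, a multi-electrode array (MEA)
# holding the neural culture, and two wheel motors.

def read_sonar():
    """Hypothetical: distance to the nearest obstacle, in centimetres."""
    return 12.0

def stimulate_culture(distance_cm):
    """Hypothetical: encode distance as an MEA stimulation rate.
    Closer obstacles -> faster stimulation pulses."""
    return max(1.0, 100.0 / distance_cm)

def record_response(rate_hz):
    """Hypothetical: record the culture's spiking response on two
    channels. Here we fake a response that grows with stimulation."""
    left_spikes = 0.6 * rate_hz
    right_spikes = 0.4 * rate_hz
    return left_spikes, right_spikes

def drive_wheels(left_spikes, right_spikes):
    """Map relative spiking on the two channels to wheel speeds,
    steering away from whichever side responds more strongly."""
    turn = (left_spikes - right_spikes) / (left_spikes + right_spikes)
    print(f"left wheel {1.0 + turn:.2f}, right wheel {1.0 - turn:.2f}")

# One pass around the loop: sense -> stimulate -> record -> act.
distance = read_sonar()
rate = stimulate_culture(distance)
left, right = record_response(rate)
drive_wheels(left, right)
```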
Researchers at the University of California, Berkeley, have controlled a rhinoceros beetle with radio signals and demonstrated it in a flight test at the Institute of Electrical and Electronics Engineers (IEEE) Micro-Electro-Mechanical Systems (MEMS) 2009 conference.
And in 2007, at Chicago's Northwestern University, Sandro Mussa-Ivaldi and other researchers chemically welded the brain of a lamprey eel to a robotic hockey puck. The hybrot can track a beam of light in a laboratory dome. The eel's brainstem is soaked in a saline solution, receives input from light sensors, and directs the wheels where and when to move. I can't even guess at that thought process; perhaps it's like a tiny bull chasing a matador's cape.
Note that these are eels, rats, and beetles that are being used. None of them are creatures that we eat. While obviously a brutal crew, these researchers have the délicatesse to avoid making bunny-hybrots or kitty-hybrots. None of the hybrots made today are critters we think of as friends or household pets. No, there is a marketing line that these researchers must not cross, and it is defined by publicly held ethics. As the years go by, the researchers will be allowed to move further up the food chain, but not for some years will human brain material get into the stew. And when it does, ethical questions of free will and volition will surely have fallen by the wayside in favor of mechanistic arguments of defense and safety. So how deep can this go? Biotechnology can reach pretty far down.
As if integrating hardware and wetware weren't enough, in May of 2010 the J. Craig Venter Institute announced that it had used a synthetic genome to control a bacterium, which amounts to building software for a living organism. If that can be done, then other genomes could be created, including a human genome that could be combined with the genome of other systems, such as, well, anything that runs on genes and chromosomes, which is most anything that's living.
We are now arriving at a point at which hardware, wetware, and software are no longer being cut up, nor even hacked, but actually blended. Is this the future of what's depicted in the movie Avatar? Having a little lamprey-eel toro toro in its cage is a bit different from jumping onto the back of a giant red dragon from your medium-size green dragon, or making love in a glowing garden, but with these thoughts in mind, did I think "the vision of Avatar is something we'll see in the future"?
"Of course," I replied. "I see no reason why not." Check.


Mark Stephen Meadows is the author of I, Avatar: The Culture and Consequences of Having a Second Life and Pause & Effect: The Art of Interactive Narrative. The award-winning co-inventor of four patents relating to artificial intelligence and virtual worlds, he is a respected international lecturer and the founder of Echo & Shadow and HeadCase Humanufacturing, companies involved with artificial intelligence. He divides his time between North America and Europe.

© 2011 Mark Stephen Meadows – excerpted with permission of Globe Pequot Press, Guilford, CT