
Biotech Breakthrough: Monkeys can feel virtual objects using a brain implant


It could be the first step towards truly immersive virtual reality, where you can feel the computer-generated world around you. An international team of neuroengineers has developed a brain-machine interface that's bidirectional. That means a monkey can use this brain implant not only to control a virtual hand, but also to receive feedback that tricks its brain into "feeling" the texture of virtual objects.

Already demonstrated successfully in primates, the interface could soon allow humans to use prosthetic limbs (or even robotic exoskeletons) to actually feel objects in the real world.


Before we get ahead of ourselves, let's explore how all this works. When you're wearing a pair of big, bulky gloves and digging for your keys, the sensory information usually provided to your brain by your fingers is deadened by the barrier between your hand and the keys. The result is a one-way interface: your brain can tell your fingers what to do with the keys, but communication from your fingers back to your brain is effectively cut off. As a result, you have to rely on another sense (usually vision) to tell if you're currently pinching one key, three keys, or no keys at all.


To really make the most of your fingertips, there needs to be a two-way interface between your brain and your hands. When your brain can receive tactile information from your hands about, say, the texture of the key you're handling, it can make near-instantaneous adjustments that give you better dexterity, or help you choose the right key.
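
To make that concrete, here's a toy sketch in Python (made-up numbers, nothing from the study) of the difference feedback makes: a controller that can feel slip can tighten its grip on the fly, while one that can't simply keeps executing its original command.

```python
def adjust_grip(current_force, sensed_slip, gain=0.5):
    """Closed loop: tighten the grip in proportion to how much slip is felt."""
    return current_force + gain * sensed_slip

# Hypothetical slip readings from touch sensors over three moments in time.
force = 1.0
for sensed_slip in [0.4, 0.2, 0.0]:
    force = adjust_grip(force, sensed_slip)
    print(f"grip force is now {force:.2f}")

# Without the sensed_slip signal (a one-way interface), force would stay at 1.0
# until some other sense, like vision, noticed the object slipping away.
```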

Brain-machine interfaces have come a long way in recent years, but, with few exceptions, these systems have relied almost exclusively on one-way communication from brain to machine.

To demonstrate the power of a two-way interface, a team of neuroengineers at Duke University designed a brain-machine-brain interface (BMBI) to test on monkeys.


"This is the first demonstration of a brain-machine-brain interface that establishes a direct, bidirectional link between a brain and a virtual body," said Miguel Nicolelis, who led the study. "In this BMBI, the virtual body is controlled directly by the animal's brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal's cortex."

Here's how it all works: the BMBI takes movement commands from 50 to 200 neurons in the monkey's motor cortex and uses them to control the operation of a virtual "avatar" hand, not unlike a classic one-way interface. But the new interface also implements a feedback mechanism, wherein information about a virtual object's texture is delivered directly to the brain via something known as intracortical microstimulation, or "ICMS" for short. When a monkey receives feedback in the form of ICMS, thousands of neurons in its brain (neurons that correspond to tactile sensation in the hands) receive electrical stimulation via carefully placed electrodes.
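
For the curious, here's a rough, hypothetical sketch of what those two halves might look like in code. It's based only on the description above, not on the Nicolelis lab's actual software; the linear decoder, the pulse-frequency coding of texture, and every name and number in it are assumptions for illustration.

```python
import numpy as np

# Hypothetical brain-machine-brain loop: decode motor cortex activity into
# avatar-hand movement, and encode virtual texture as an ICMS pulse train.

rng = np.random.default_rng(0)
n_neurons = 100                                     # the study recorded 50 to 200 motor cortex neurons
decoder_weights = rng.normal(size=(2, n_neurons))   # assumed linear mapping to (x, y) velocity

def decode_hand_velocity(spike_counts):
    """Motor side: turn one time bin of spike counts into avatar-hand velocity."""
    return decoder_weights @ spike_counts

def texture_to_icms(texture_id, duration_ms=100):
    """Feedback side: encode a texture as an ICMS pulse train. Here, each
    texture simply gets its own pulse frequency (purely an assumption)."""
    freq_hz = {"none": 0, "coarse": 30, "fine": 60}[texture_id]
    n_pulses = int(freq_hz * duration_ms / 1000)
    return [i * 1000 / freq_hz for i in range(n_pulses)]

spike_counts = rng.poisson(5, size=n_neurons)       # fake spikes for one 100 ms bin
print("avatar hand velocity:", decode_hand_velocity(spike_counts))
print("pulse times (ms) for a fine texture:", texture_to_icms("fine"))
```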

This two-way interface allows the monkeys to engage in what the researchers call "active tactile exploration" of a set of virtual objects. Using only their brains, the monkeys were able to direct their avatar hand over the surfaces of several virtual objects and differentiate between their textures.


To prove that the monkeys could pick out specific objects based on tactile feedback alone, the researchers rewarded them for selecting objects with a particular texture: when a monkey held its virtual hand over the correct object, it received a reward. The study looked at two monkeys' performance on this task. It took one monkey just four attempts to learn how to select the correct object during each trial; the second needed nine.
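
Here's a hypothetical sketch of that trial structure (again, just an illustration, not the actual experimental code): two objects that look identical are presented, only their ICMS "feel" differs, and only one texture pays out.

```python
import random

REWARDED_TEXTURE = "fine"          # assumed label for the rewarded texture

def run_trial(choose_object):
    """One trial: two identical-looking objects, distinguishable only by feel."""
    objects = ["fine", "coarse"]
    random.shuffle(objects)                    # position carries no information
    choice = choose_object(objects)            # the monkey's hold decision
    return choice == REWARDED_TEXTURE          # True means a reward is delivered

# A monkey that ignores the ICMS feedback and guesses lands at chance level.
hits = sum(run_trial(lambda objs: random.choice(objs)) for _ in range(1000))
print(f"guessing strategy: rewarded on about {hits / 1000:.0%} of trials")
```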

"The remarkable success with non-human primates is what makes us believe that humans could accomplish the same task much more easily in the near future," explains Nicolelis. He continues:

Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton.


The future seriously can't get here soon enough.

This research was largely funded by the National Institutes of Health and is published in the latest issue of Nature.


Top image via 3DDock/Shutterstock; Gloves & Keys Via; Virtual Monkey via Nature
Video by the Nicolelis Lab, Duke Center for Neuroengineering