The team that brought us the first neurally remote-controlled beetle has a new paper out today proposing a huge step forward in brain-computer interfaces. In an article published in the arXiv Quantitative Biology archive, Dongjin Seo, Michel Maharbiz, and colleagues from the University of California at Berkeley propose a system they call 'Neural Dust'.
Brain-computer interfaces today generally work by sticking electrodes into the brain. That runs into severe problems - all those electrodes cause trauma as they penetrate the brain, and leave wires that wind through and between the brain's neurons, potentially attracting scar tissue or driving the brain to reject the implant.
The proposed 'Neural Dust' system (and it's just a proposal for now) would work differently. The authors propose sprinkling tiny, dust-sized passive silicon sensors, about 100 microns across (roughly the thickness of a human hair), throughout the human cortex. The sensors would be fabricated via a CMOS process, the same process used for many conventional computer chips.
Just outside the cortex, but beneath the skull, would sit a larger chip, still only millimeters across, that would communicate with the dust particles via ultrasound. Ultrasound would let the neural dust particles send pulses of information without disrupting the activity of the neurons around them, and without any wires or electrodes getting in the way.
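The physics behind the ultrasound choice is easy to sketch. As a rough, back-of-the-envelope illustration (my numbers, not figures from the paper): at the megahertz frequencies such a system might plausibly use, an acoustic wavelength in tissue is comparable to the size of a dust mote, while a radio wavelength at the same frequency is tens of meters long.

```python
# Rough, illustrative calculation: why ultrasound suits sub-millimeter
# implants better than radio. Assumed values (not from the paper): speed
# of sound in soft tissue ~1540 m/s, and a 10 MHz carrier frequency.

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, typical for soft tissue
SPEED_OF_LIGHT = 3.0e8          # m/s, in vacuum

def wavelength_microns(speed_m_s, freq_hz):
    """Wavelength in microns for a wave of the given speed and frequency."""
    return speed_m_s / freq_hz * 1e6

ultrasound = wavelength_microns(SPEED_OF_SOUND_TISSUE, 10e6)
radio = wavelength_microns(SPEED_OF_LIGHT, 10e6)

print(f"10 MHz ultrasound wavelength in tissue: {ultrasound:.0f} microns")
print(f"10 MHz radio wavelength (vacuum): {radio:.0f} microns")
# An acoustic wave at 10 MHz is ~154 microns long -- the same scale as a
# 100-micron dust mote, so the mote can couple to it efficiently. A radio
# wave at the same frequency is ~30 meters long, hopeless for an antenna
# the size of a grain of dust.
```

That scale match is what makes a passive, wireless, 100-micron sensor plausible at all.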
Outside the skull entirely would be the external transceiver. This chip would provide wireless power for the chips inside the skull, and would read the data out from the brain, allowing scientists and engineers to see what's going on inside.
The brilliance of this system is that it could potentially let scientists watch what's going on with thousands, tens of thousands, or even hundreds of thousands of neurons inside the brain at once, simply by deploying enough neural dust particles and sub-dural transceivers.
Of course, it's still only theoretical, though the researchers have published the paper in part to work seriously through the knot of design questions involved, and to stimulate others to take steps toward fabrication. The neural dust proposal is also one-way: it would allow data to be pulled out of the brain - to let you control a robot arm, say, or to let someone see what you're thinking or feeling - but it would not allow data to be transmitted into the brain - to give you touch feedback from that robot arm, or to show you what someone else is thinking or feeling.
Even so, I'm delighted to see this. In fact, I learned of it when someone tweeted this paper at me, suggesting it might be a step towards the brain-linking nano-drug in my novel Nexus. Heh. I think they just might be right. And I sure hope so.