The faceputer ads say virtual reality is coming and it's gonna work this time. But here's some real talk: There are still many ways virtual reality cannot fool the human brain. And it has little to do with the tech itself. Instead, it's about neuroscience and our brain's perceptual limits.
True, the past year has brought a great flourishing of virtual reality systems that are miles better than the clunky, nauseating devices of the 90s. The HTC Vive and Sony's Project Morpheus were just unveiled. Oculus is chugging along since its $2 billion acquisition by Facebook last year. Magic Leap is doing whatever the hell it's doing.
This new set of devices is good enough to feel stomach-droppingly real—even though the images are still pixelated and lag a tiny bit. People in VR call this overwhelming feeling "presence." But it's possible to fool one part of the brain without fooling another.
When journalists write about being wowed by the latest fancy VR device, they mean the emotional gut punch of say, looking down a castle wall at an invading army. They don't mean that VR is indistinguishable from reality. As Jason Jerald, a technology consultant for VR companies, puts it, "We can get very engaged in cartoon-like worlds." Images don't have to look perfect for presence.
But these imperfections become obvious if you spend more than the typical few minutes of a press junket inside VR. Or try to walk and turn. There are many reasons, both conscious and unconscious, that your brain rejects the reality of a screen mounted a few inches in front of your eyeballs.
Latency and the Age-Old Problem of Motion Sickness
Call it motion sickness or "simulator sickness" or "cybersickness," but the nausea is real and has long bedeviled virtual reality. The main reason is latency, or the tiny but perceptible delay between when you move your head in VR and when the image in front of your eyes changes—creating a mismatch between the motion we feel (with our inner ears) and the image we see (with our eyes).
In real life, the delay is essentially zero. "Our sensory and motor systems are very tightly coupled," says Beau Cronin, who earned his PhD in computational neuroscience at MIT and is writing a book on the neuroscience of VR.
In virtual reality, however, the latency can get as low as 20 milliseconds, though it can go up quite a bit depending on the exact application. It will never be zero because it will always take time for a computer to register your movements and draw the new image.
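That "motion-to-photon" delay is the sum of every stage between moving your head and seeing the updated image. Here's a back-of-the-envelope sketch of such a budget; the stage names and millisecond figures are illustrative assumptions, not measurements of any particular headset.

```python
# Illustrative motion-to-photon latency budget for a head-mounted display.
# These numbers are assumptions chosen to sum to the ~20 ms figure above,
# not specs of any real device.
PIPELINE_MS = {
    "tracking": 2.0,    # sensor read + head-pose estimation
    "rendering": 11.0,  # drawing one frame at roughly 90 Hz
    "display": 7.0,     # scan-out / pixel switching on the panel
}

def motion_to_photon_ms(stages):
    """Total delay between a head movement and the updated image."""
    return sum(stages.values())

total = motion_to_photon_ms(PIPELINE_MS)
print(f"motion-to-photon latency: {total:.1f} ms")
```

Shrinking any one stage lowers the total, but no stage can reach zero, which is why the overall latency can't either.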
So how low does latency need to be before you don't notice it? Jerald, who did his doctoral research on the perceptual limits of latency, found that it varies wildly: His most sensitive subjects could notice lags of 3.2 milliseconds, while the least sensitive missed lags of hundreds of milliseconds. Sensitivity to simulator sickness varies just as wildly. It may never be possible to design devices that make no one motion sick, but it is likely possible to design certain applications that don't make most people sick.
My Eyes! The Vergence-Accommodation Conflict
A weird thing happens in VR: You can look at the far-off horizon of a virtual beach but still feel like you're in a room. This could be partly the result of subtle feedback from the muscles surrounding your eyes. At its worst, it can cause painful eyestrain and headaches.
Here's what happens. Put a finger in front of your face and gradually move it to your nose; your eyes will naturally move closer together to track your finger. This is vergence, where our eyes converge and diverge to look at close and distant objects, respectively. At the same time, the lenses in your eyes focus so the image of your finger remains clear while the background is fuzzy. This is called accommodation.
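You can put a rough number on how hard the eyes work at different distances. The sketch below computes the vergence angle, the angle between the two lines of sight when both eyes fixate a point, using a simplified symmetric-fixation model and an assumed 63 mm interpupillary distance (a typical adult average).

```python
import math

# Assumed average interpupillary distance (IPD) in meters; individual
# values vary roughly from 54 to 72 mm.
IPD_M = 0.063

def vergence_angle_deg(distance_m, ipd=IPD_M):
    """Angle between the two lines of sight when fixating a point
    straight ahead at distance_m (simplified symmetric model)."""
    return math.degrees(2 * math.atan((ipd / 2) / distance_m))

# A finger near your nose demands a big vergence angle; the horizon
# demands almost none.
for d in (0.25, 1.0, 2.0, 100.0):
    print(f"{d:6.2f} m -> {vergence_angle_deg(d):5.2f} deg")
```

The angle shrinks rapidly with distance, so your eyes converge hard on a nearby virtual object even though, in a headset, the lenses keep focusing at the screen's fixed apparent distance.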
In VR, however, vergence and accommodation no longer work together seamlessly. The screen of a typical head-mounted display sits three inches or so in front of your eyes. A set of lenses bends the light so the image on the screen appears to be about one to three meters away. But objects rendered farther or closer than that can look blurry, and the entire screen is always in focus, no matter where your eyes are looking. Your eyes converge on a virtual object as if it were near or far, while your lenses stay focused at the screen's fixed apparent distance. That mismatch can make spending an extended period of time in VR pretty uncomfortable.
There are high-flying ideas about how to get around the problem, and the name on everyone's lips is Magic Leap. The company hasn't publicly revealed much, though its patents show an interest in light field technology, where a screen of pixels is replaced by an array of tiny mirrors that reflect light directly into the eyes. Objects rendered this way are supposed to have true depth, coming in and out of focus as a real object would.