It’s well-known that Facebook’s partnering with Ray-Ban to develop a pair of augmented reality glasses. What’s less clear is how Facebook envisions these glasses will function, and how the company imagines people will interact with the device. A new Facebook Reality Labs blog sheds a little light on that front—and it possibly involves haptic gloves and “soft” wristbands.
Facebook Reality Labs is essentially a group of researchers, developers, and engineers working on virtual and augmented reality. Every so often, they publish deep dives into the challenges and potential of AR. This time around, FRL is tackling the interface problem with smart glasses: even if a bunch of notifications pop up in your field of view, you still need some way to interact with what you’re seeing. The now-defunct Focals by North, as well as the Google Glass Enterprise Edition 2, both had discreet finger loops that let you navigate menus. Others, like Epson’s Moverio glasses, rely on your smartphone. Neither method is particularly intuitive, and that’s one reason smart glasses haven’t really taken off.
The FRL blog lays out a theoretical day of wearing Facebook AR glasses, along with what it calls a “soft wristband.” Basically, you go to a cafe and your smart glasses ask if you want to play a podcast. Instead of answering via your phone or a finger loop, you could flick a finger, and the wristband would interpret that as clicking an invisible play button. The blog then outlines a scenario where you’d pull out a pair of “soft, lightweight haptic gloves” that signal the glasses to project a virtual screen and keyboard.
What FRL is describing isn’t as futuristic as you might think. It’s essentially tapping into something called electromyography (EMG), which reads the electrical signals traveling from your spine to the muscles in your hand. This tech already exists—the Mudra Band is an Apple Watch band prototype that lets you control certain functions by flicking your fingers, also by reading the electrochemical signals produced by your nervous system. When I spoke to the Mudra Band’s creators at CES, they likewise envisioned the band being used for AR and VR controls. Facebook isn’t the only company with this idea.
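To make that concrete, here’s a toy sketch of how a wristband might decide a finger flick happened: watch the amplitude of the EMG signal and treat a sudden burst of muscle activity as a “click.” The window size, the threshold, and the idea of using RMS amplitude alone are all simplifying assumptions for illustration, not how Facebook’s or Mudra’s actual pipelines work.

```python
# Illustrative sketch: detecting a finger "flick" from a stream of EMG samples.
# Real EMG pipelines are far more sophisticated (multiple electrodes, trained
# classifiers); this just shows the basic idea of spotting a burst of activity.
import math

def rms(window):
    """Root-mean-square amplitude of one window of EMG samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def detect_flick(samples, window_size=8, threshold=0.5):
    """Return True if any window's RMS amplitude exceeds the threshold,
    i.e. a burst of muscle activity that we treat as a 'flick'."""
    for i in range(0, len(samples) - window_size + 1, window_size):
        if rms(samples[i:i + window_size]) > threshold:
            return True
    return False

# A resting signal: low-amplitude noise, so no gesture is detected.
resting = [0.02, -0.01, 0.03, -0.02, 0.01, 0.0, -0.03, 0.02] * 4
# A "flick": one window of high-amplitude activity in the middle.
flick = resting[:16] + [0.9, -0.8, 1.1, -0.95, 0.85, -1.0, 0.9, -0.7] + resting[16:]

print(detect_flick(resting))  # stays below the threshold
print(detect_flick(flick))    # the burst crosses it
```

In a real device, a detected flick would then be handed off to whatever the glasses are currently asking about, like that invisible play button.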
Then there are the haptic gloves, which Facebook bills as an “ultra-low-friction input.” Put more simply, gloves are much more natural to use than tech like hand-tracking cameras, microphone arrays, and eye-tracking. Haptic feedback is also supposedly an easy way to tell you something about the virtual objects you’re interacting with—kind of like a phone vibrating. Ultimately, it seems Facebook’s betting on “soft, all-day wearable systems,” or “devices worn close to or on the skin’s surface where they detect and transmit data.”
It’s admittedly a clever approach, and as the blog details, it would enable a more intuitive way of interacting with smart glasses and virtual environments. If you could “click” buttons with discreet finger movements, you wouldn’t necessarily have to scroll through menus. The interface could instead be designed around “yes” or “no” questions, so long as the AI was powerful enough to interpret what you want in a given situation. (You can peep a concept video of what that interface might look like.)
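That yes/no model can be sketched in a few lines: the glasses pose a question, and a discreet wristband gesture answers it. The gesture names and the prompt here are invented for illustration; the blog doesn’t specify any actual API.

```python
# Toy sketch of the "yes"/"no" interaction model described above.
# Gesture names and the prompt text are hypothetical.

GESTURE_ANSWERS = {
    "index_flick": True,   # the invisible "click": accept the suggestion
    "thumb_flick": False,  # dismiss it
}

def answer_prompt(prompt, gesture):
    """Resolve a yes/no prompt from a wristband gesture event."""
    if gesture not in GESTURE_ANSWERS:
        raise ValueError(f"unrecognized gesture: {gesture!r}")
    return GESTURE_ANSWERS[gesture]

if answer_prompt("Play your podcast?", "index_flick"):
    print("Playing podcast")
```

The point of the design is that two recognizable gestures are enough, as long as the assistant frames every choice as a binary question.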
That’s a big “if,” though, and probably not something we’re going to see in the first iteration of Facebook’s smart glasses. Facebook Reality Labs itself says in the blog that the sensing technology and the highly personalized data needed to train an AI inference model simply don’t exist yet. Still, the concept is surprisingly thoughtful, considering that just a few weeks ago Facebook stupidly said it was mulling facial recognition for future smart glasses. Honestly, it would be great if Facebook kept investing in ideas like these for its smart glasses, instead of creating more privacy headaches.