Will Lasers Save Intel's Google Glass Clones From Sucking?

The sweet laser projector on Intel’s new Vaunt glasses. (Screenshot: YouTube/The Verge)

Intel thinks it’s figured it out. The company that makes the CPUs in most of our computers has built a set of glasses that could do what Google Glass could not—and what Magic Leap desperately wants to. It’s produced a pair of “smart glasses” that look like something you might actually wear.


Well, not necessarily in public: Intel’s Vaunt glasses look more like the safety glasses people wear around heavy machinery than the corrective ones currently resting on my nose. But these new glasses shoot lasers into your eyes to deliver the information that Google needed a big ugly arm (and Magic Leap needs a set of goggles) to transmit. While that’s tremendously exciting, after reading The Verge’s big story on the Intel Vaunt glasses, I don’t know if it’s enough.

Let’s be clear: I want a personal computer that feeds information directly into my eyeball. I want to look at a restaurant and know if the food will be good, or shake someone’s hand and instantly know their name. I want to be able to know who’s calling with a single glance to the side, and to check my Twitter feed with a flicker of my eye. I am the asshole gleefully excited for my cyberpunk future, but I also don’t want to see people fuck it up again, because Google Glass bungled the job so badly it set personal HUDs back at least half a decade.


Intel’s take appears to be a supremely conservative one. Instead of a multicolor display that augments reality, the Vaunt glasses reportedly shine a small red display into your eyeball using a vertical-cavity surface-emitting laser (VCSEL)—that’s the kind of laser found in your printer, your mouse, and the dot projector on the iPhone X. According to Intel, the version it uses is low-powered enough to cause zero damage to your retinas.

Which is great, because you don’t want to risk blindness for the relatively small amount of data the Vaunt glasses would provide. They only give you the notifications you’re accustomed to getting from your smartwatch or the lockscreen of your phone. Here’s a screengrab from The Verge’s video that gives you an idea of what it would look like.

This is supposed to show the laser’s image “painted” onto your retina, alongside how the text would actually appear to your eye. (Screenshot: YouTube/The Verge)
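Since the glasses reportedly just mirror what your phone already knows, the interesting software work happens on the phone side: deciding which notifications are worth a glance and squeezing them down to a line of text. Intel hasn’t published anything about how Vaunt actually does this, so take the little Python sketch below as pure illustration; every field and limit in it is an assumption.

```python
# Hypothetical sketch only: Intel hasn't documented Vaunt's notification format,
# so the fields and the 40-character budget below are invented for illustration.
from dataclasses import dataclass

MAX_CHARS = 40  # assumption: only one short line fits comfortably on the tiny display

@dataclass
class Notification:
    app: str      # e.g. "Messages"
    sender: str   # e.g. "Alex"
    body: str     # full notification text from the phone

def to_glance(n: Notification) -> str:
    """Collapse a phone notification into one short, glanceable line for the HUD."""
    line = f"{n.sender}: {n.body}"
    if len(line) > MAX_CHARS:
        line = line[:MAX_CHARS - 1] + "…"
    return line

print(to_glance(Notification("Messages", "Alex", "Running 10 minutes late, order for me?")))
# -> "Alex: Running 10 minutes late, order fo…"
```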

That’s definitely not Magic Leap levels of AR—that’s not even the same level of AR as found on your smartphone. It’s more like the windshield head-up displays Pontiac rolled out in some of its cars more than a decade ago. Because it’s so simple, Intel can get away with packing it into a much smaller package. So if you wear the Vaunt glasses, most people won’t immediately know you have a HUD strapped to your face.


That’s a critical first step in HUD adoption. Despite what stuff like Netflix’s Altered Carbon promises, no one actually wants to look like a big nerd as they interact with invisible computers. But Intel’s conservatism could harm it if something like Magic Leap takes off. The Vaunt glasses’ low-resolution (400x150 pixels), monochrome display may disappear from view when not in use and require no adjustment for people with bad eyesight, but the glasses have no way of gathering data beyond what they can pull from your phone. So they can’t perceive the world around them and adjust accordingly. You won’t be able to look at a plate of food and have the glasses ID the pasta type.
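For a sense of how little screen real estate that is, here’s a quick sketch (using Python’s Pillow imaging library) that rasterizes a couple of lines of text into a 400x150, one-bit frame, roughly the pixel budget the Vaunt reportedly has to work with. This is not Intel’s rendering pipeline, just a way to visualize the constraint.

```python
# Illustration only: render notification text into a 400x150, single-color bitmap,
# roughly the resolution The Verge reports for the Vaunt display.
from PIL import Image, ImageDraw

WIDTH, HEIGHT = 400, 150  # resolution reported by The Verge

def render_frame(lines):
    """Draw a few short lines of text into a 1-bit framebuffer (pixel on/off)."""
    frame = Image.new("1", (WIDTH, HEIGHT), color=0)  # start with every pixel off
    draw = ImageDraw.Draw(frame)
    y = 10
    for line in lines:
        draw.text((10, y), line, fill=1)  # Pillow's default bitmap font
        y += 20
    return frame

if __name__ == "__main__":
    render_frame(["Incoming call", "Unknown number"]).save("vaunt_frame.png")
    # On the glasses, a frame like this would presumably drive the laser scanner
    # instead of being saved to disk.
```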

The Verge gives a nice rundown of the glasses in this seven-minute video.

You won’t necessarily even be able to interact with the glasses except through your phone. According to The Verge, how one interacts with the glasses is still up in the air. The current test models made available to The Verge had a compass and accelerometer built in, but nothing else.
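Intel hasn’t said what those two sensors are actually for. One plausible (and entirely speculative) use: waking the display when you tip your head, which is the kind of thing an accelerometer alone can handle. Here’s a rough sketch of that idea; the axis convention and the 20-degree threshold are my assumptions, not anything Intel has described.

```python
# Speculative sketch: use a gravity-dominated accelerometer reading (in g) to
# estimate head pitch and decide whether the HUD should light up. The axis
# convention and threshold are assumptions, not anything Intel has confirmed.
import math

def pitch_degrees(ax, ay, az):
    """Estimate head pitch from a single accelerometer sample."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def display_should_wake(sample, threshold_deg=20.0):
    """Wake the display when the wearer tips their head down past the threshold."""
    ax, ay, az = sample
    return pitch_degrees(ax, ay, az) > threshold_deg

# Roughly level head vs. about a 30-degree downward tilt:
print(display_should_wake((0.0, 0.0, 1.0)))    # False
print(display_should_wake((-0.5, 0.0, 0.87)))  # True
```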


However, Itai Vonshak, head of products for Intel’s New Devices Group, did give an example of using the glasses with Alexa—sort of like the glasses Vuzix was showing off at CES this year. That strongly suggests a microphone and speaker could be added at a later date.

But is shouting at one’s glasses really the future of interaction with wearable tech? Will we really talk to Alexa as we ride the train or saunter through the mall? Or will we use a controller, as Magic Leap suggests? Or gestures, as with Microsoft’s HoloLens?


Intel’s Vaunt seems to take us a step closer to personal HUDs, but the biggest question still remains: How the heck are we supposed to interact with these computers of the future? “We really believe that it can’t have any social cost,” Vonshak told The Verge. “So if it’s weird, if you look geeky, if you’re tapping and fiddling—then we’ve lost.”

Too bad Vonshak hasn’t explained how Intel will “win.” The personal computer didn’t become common until the mouse came along. The smartphone didn’t take off until Apple nailed pinch-to-zoom. It’s not enough to build new tech—you have to figure out how we interact with that tech in the most natural way possible. According to The Verge, Intel still doesn’t know what that interaction will look like.


One can only hope Intel figures it out soon. The Vaunt glasses will be made available to developers later this year and will work with Android and iOS devices. The only thing potential users will need is their pupillary distance (the distance between the centers of their pupils), which anyone with eyeglasses will already have on hand from their optometrist. People with perfect vision will probably need to make an appointment.

[The Verge]


Senior Consumer Tech Editor. Trained her dog to do fist bumps. Once wrote for Lifetime. Tips encouraged via Secure Drop, Proton Mail, or DM for Signal.


DISCUSSION

carl-damas
alphashadow

I’m starting to see why some people get left behind by technology as they age. I’m very young, and all this makes me think of is ads being beamed directly into my eyeballs. I already spend more time than I want to (not zero) swiping away notifications on my phone that turn out to just be ads or reminders I didn’t need. Firm pass on this.

To add a relevant comment or question and not just be a humbug—if this takes off, it’s only a matter of time before smart glasses make their way to corrective lenses. So now we’re going to add another device to the list of things to check for when people can’t or shouldn’t have access to information. It’s great for turn-by-turn directions, but doesn’t that mean people can check texts while driving? Will those apps just be disabled when the glasses detect that you’re moving at a certain speed? Teachers will have to check that students aren’t wearing smart glasses the same way they have to check for phones during exams.

And you want to be able to know people’s names when you look at them, but what about what they want? We’re going to need massive face registries for both allowing and disallowing such features.

I realize these are all problems that do and would exist without these, but I hope someone is thinking about them. I guess I’m just not an instinctive cheerleader for tech anymore.