I just put Microsoft’s new holographic glasses on my face. It’s one of the most amazing and tantalizing experiences I’ve ever had with a piece of technology.
This piece was originally published on 1/21/15, but we thought you might like to know what HoloLens is all about—since Microsoft just re-introduced it at the Build dev conference.
Holographic glasses? What? Okay, I’ll explain: Microsoft HoloLens is a headset that lets you see virtual objects and environments as if they existed in the real world. It’s basically exactly what Magic Leap promises to deliver, except this time I can independently confirm it exists and that it legitimately blew my mind.
Like when I broke through a real-life wall with a Minecraft shovel and found veins of precious ore inside. Or when I installed a real-world light switch in less than six minutes, with a guy named Joe on Skype drawing circles around the wire nut and voltage tester I needed to avoid frying myself. Or when I set foot on the surface of Mars without ever leaving my office, helping a ghostly NASA scientist assign tasks to the Mars Rover.
How is any of that possible? I don’t know. Microsoft hasn’t yet said how any of the technology works—though I’ll share my guesses with you in a minute. Microsoft merely led a giant group of journalists down into the depths of its Redmond, Washington headquarters, plopped headsets on our heads, and showed us how amazing it could all be. Oh, and they took all our cameras and phones away so we couldn’t take any pictures.
And the reason it was amazing is not because I could see a virtual world, as with the Oculus Rift, and not because I could still see my surroundings and not trip over things, as with Google Glass. It’s because Microsoft has found a way to merge reality and CG.
Walking down a narrow corridor in the basement of Microsoft’s Building 92, I was ushered into a tiny room. I sat down in a chair so they could sling a giant processor unit around my neck, and gently place the HoloLens prototype on my head. The prototypes are fragile right now, apparently fragile enough that we weren’t allowed to touch the business end—only the straps. One Microsoft employee cinched those up by tightening a dial on the rear end, while the other typed my IPD (interpupillary distance) into a connected PC. Microsoft says the final version will automatically measure the distance between your eyes, but the prototypes don’t have that feature yet.
And then I was looking at the surface of Mars. Or a narrow sliver of it, anyway. It’s not like the Oculus Rift, where you’re totally immersed in a virtual world practically anywhere you look. The current HoloLens field of view is TINY! I wasn’t even impressed at first. All that weight for this? But that’s when I noticed that I wasn’t just looking at some ghostly transparent representation of Mars superimposed on my vision. I was standing in a room filled with objects, posters covering the walls. And yet somehow—without blocking my vision—the HoloLens was making those objects almost totally invisible.
Some of the very shiniest things in the room—the silver handle of a pitcher, if I recall correctly—managed to reflect enough light into my eyes to penetrate the illusion. But otherwise, Mars was all around. Everywhere I turned my head, I saw (a narrow sliver of) the Martian surface.
Except—incredibly enough—for a desk with a computer, which HoloLens had somehow managed to omit from the invisibility trick. The real-life computer was there with me on the surface of Mars, and a Microsoft rep had me come take a peek at its screen. “Nothing you’re seeing here has been touched by an artist,” he explained, showing me how the scenery all around me was generated directly from the real pictures and telemetry data the Mars Rover has been beaming back to Earth.
But HoloLens isn’t just a passive experience: it also gives you a little bit of control. I could move my head around the Martian landscape to highlight objects, then set waypoints for the rover (or spots to fire the ChemCam laser) with a flick of my index finger. Those waypoints would also show up on the 2D map on the computer screen. And I could do it collaboratively with someone else: halfway through my demo, a ghostly avatar of an actual NASA scientist appeared to show me how to zoom in on distant terrain and highlight points for future exploration.
And just like that, I was whisked off to another room to talk to Joe over Skype. Apparently, Joe is a friend of mine who volunteered to help me play amateur electrician, installing a real light switch into a real wall. Compared to Mars, it was a pretty simple augmented reality demo—Joe just pointed me to objects and drew some diagrams so I wouldn’t get lost—but it still felt super practical.
You know how if you’ve got a Kindle Fire tablet, you can summon a real, live customer support agent to help you find something good to buy or maybe read you a bedtime story? Now that I’ve had the experience of installing a light switch, in six minutes flat, with a personal assistant guiding and reassuring me every step of the way, I can definitely see how people might pay for help with their DIY projects. (I can install my own light switches, thanks, but I’ve always been afraid to open up my car’s engine.)
I didn’t get to build my own 3D-printed quadcopter in HoloStudio, the third demo of the day—Microsoft just sat journalists down in comfy chairs to watch someone else do the work. And I suspect it’s because of one of HoloLens’s current weaknesses: there’s just no good way to reach out and touch objects right now. It’s actually pretty impressive how you can build 3D objects just by pointing your head at what you want to select and wagging a finger, but I got tired just watching our demonstrator constantly swivel his head to select parts and move them around. That’s gotta get old fast.
But it was pretty cool to see him select a wheel for a shiny red CG truck, say “Copy,” and then use a “magnet” tool to instantly generate copies of that wheel that automatically attached themselves to the truck’s axles.
And it was really freaking cool to play Minecraft—or rather the Minecraft clone that Microsoft’s calling “Holobuilder”—with real-world objects. Imagine if your walls and surfaces were made of Minecraft bricks. What would happen if you punched through? I knocked a hole in a table that let me see THROUGH the table, down into a cavernous underground with a giant lava pit at the bottom. I shot through a real-life wall and found a cave on the other side. Because Microsoft’s glasses made that section of the wall invisible, duping my vision to let me see the cavern instead, it felt surprisingly real.
As real as Microsoft’s hefty prototype would allow, anyhow. Saying it’s a work in progress would be an understatement. The final version is supposed to be a small, light, tether-free set of glasses you wear on your head, but what I tried involved draping a processor unit around my neck, which was tethered to the ceiling and then to a nearby PC. The prototype headset is a little front-heavy, with a tunnel-vision-narrow field of view and exposed circuitry everywhere. (Which I was happy about: I spotted at least four cameras, a laser, and what looked like ultrasonic range finders, in case that helps you speculate how it works.)
But it still felt like magic. The blend of real and virtual is as compelling as I’d hoped. And it makes me incredibly curious how Magic Leap’s version of the tech stacks up.
Follow the author at @starfire2258.