
With a New Developer Framework, Building in XR is Easier than Ever for the Spatial Computing Revolution

With insights from Qualcomm’s XR executive Hugo Swart, discover how developers can move beyond the traditional 2D screen for more immersive experiences.
By G/O Media Studios for Qualcomm Inc.


When it comes to the most valuable tech in our pockets, we’re on the threshold of a new era. According to Hugo Swart, the VP and GM of XR at Qualcomm Technologies, Inc., this powerful combination of mobile phone, headworn device, and spatial computing is transforming the digital landscape worldwide.

From Swart’s vantage point as the company’s global head of virtual, augmented, and mixed reality segments, there are overlapping forces at play making this the moment for the metaverse to make good on its promises for consumers and coders alike.

“Even though it’s still early compared to where we will be in about 10 years with XR adoption, if you want to lead, to disrupt and create the next big thing, the time is now,” Swart says. “Devices are getting much better every year, more users are adopting the tech, and the earlier developers start addressing this, the likelier they will be to find the sweet spot to differentiate.”

Hugo Swart, VP and GM of XR at Qualcomm Technologies, Inc. Photo: Qualcomm Inc.

And Qualcomm, the semiconductor giant that pioneered smartphone-enabled extended reality 13 years ago, continues to accelerate the evolution of computing today. 2023 marks the second year of availability for its Snapdragon Spaces™ XR Developer Platform, along with the launch of a new feature called “Dual Render Fusion,” which enables developers to make 2D smartphone apps capable of driving AR glasses.

In addition to providing users with a wider variety of immersive experiences, this functionality will make headworn AR more accessible to developers of all experience levels, encouraging innovation and bringing us closer to a future beyond traditional 2D displays.

We got a chance to sit down with Swart to discuss the future of computing, the social and industry impacts of an AR-capable world, and the potential of Snapdragon Spaces Dual Render Fusion to convert more developers to XR than ever before.

First off, can you clarify the differences between XR, VR, AR, and MR?

Hugo Swart: When it comes to the nomenclature we use at Qualcomm, XR is the umbrella term for any spatial computing device that allows you to interact with a digital space through an immersive headset rather than a 2D screen like your phone or your laptop.

Virtual reality involves complete immersion in the digital space. Augmented reality mixes elements of the physical and digital worlds with optical see-through glasses. Mixed reality also combines physical and digital experiences, but does so via video pass-through on a fully occluded VR headset.

How do you and Qualcomm characterize XR adoption and usage today, and where do you see it going in the coming years?

Hugo Swart: VR has gained plenty of traction with gamers, and we’re starting to see more enterprise and industrial use cases. However, there are still far fewer people using VR headsets than those who use PCs, laptops, or smartphones.

But looking at the next five to 10 years, there’s a transformation on the horizon regarding how we compute—from 2D screens (phones, laptops, TVs) to computing in space. Rather than just 3D, this means that your environment becomes part of the digital experience, imbuing things with a whole new level of realism.

A similar reference point is the early 2000s, when we moved from the static internet on PCs to the mobile internet with the adoption of smartphones. The next step is the spatial internet, which users will be able to interact with through progressively sleeker, smaller glasses and headsets.

These AR glass-capable devices will become far more immersive in the coming years, allowing users to merge their digital experiences with the physical world. Eventually, we expect XR devices to replace smartphones and PCs altogether.

The comparison between smartphone adoption and XR headset adoption is an interesting argument. Can you unpack that a little more?

Hugo Swart: For some context, I started my career in the early 2000s in the mobile internet, when almost nobody used a smartphone. Now, almost every individual on Earth has one.

Back then, people asked the same types of questions they ask today about XR: “Why would I take pictures on my phone? Why would I watch TV on my phone? Why would I do a video call with my phone when I could just call someone the traditional way?”

But as the devices improve, with better processors, connectivity, and displays, and the content adapts to fit the device, it becomes almost a no-brainer. The same thing will happen with XR.

Which specific industries do you see XR significantly changing in the coming years?

Hugo Swart: Virtually all of them, to be honest. Many enterprises are already using VR as a training tool, which really supports how people learn and build muscle memory in a safe and repeatable environment. It’s great for education, in both the classroom and the workplace. It can teach you how to operate a dangerous and costly machine as a new employee, eliminating so many potential hazards.

Other businesses use XR to train employees on soft skills, such as conflict mediation and bias training. It’s much more effective to teach these interaction skills spatially rather than on a 2D screen.

Then there’s health, such as training for surgical procedures and physical therapy.

Then you think about entertainment, especially sports. With XR, you can bring the stadium experience into your own home or use the glasses at the physical arena for a richer experience.

The biggest way it will change all of our lives is by transforming social interaction.

Instead of a 2D video call, you’ll be able to sit in a digital room or in your real room with avatars of your friends and colleagues, allowing you to focus on who’s talking and read body language, as if you were having a meeting in the real world. The avatars are a little cartoonish at the moment, but the tech is getting so much stronger, and widespread adoption of this use case isn’t too far off.

What does Snapdragon Spaces add to the developer tools already being used for XR? Or what problem is Snapdragon Spaces solving for XR developers?

Hugo Swart: First off, let me define Snapdragon Spaces: it’s a platform that brings machine perception technologies optimized for Snapdragon processors, like Hand Tracking, Plane Detection, and Spatial Anchors, directly to developers. By using the Snapdragon Spaces Software Development Kits for Unity or Unreal Engine, developers can create immersive experiences for AR, VR, or MR that blur the lines between our physical and digital realities, transforming the spaces around us in ways limited only by our imaginations.
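Those machine perception primitives map to fairly simple data concepts. Below is a minimal Python sketch of how a detected plane might become a spatial anchor for world-locked content. Every class and function name here is illustrative only, not the actual Snapdragon Spaces SDK API (which targets Unity/C# and Unreal/C++):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple          # (x, y, z) in metres, world space
    rotation: tuple          # quaternion (x, y, z, w)

@dataclass
class DetectedPlane:
    center: Pose             # pose of the plane's centre
    extent: tuple            # (width, height) in metres

@dataclass
class SpatialAnchor:
    """A world-locked pose that stays fixed as head tracking updates."""
    pose: Pose

def anchor_on_plane(plane: DetectedPlane) -> SpatialAnchor:
    # Pin virtual content to the centre of a detected real-world surface,
    # so it holds its place even as the headset's viewpoint moves.
    return SpatialAnchor(pose=plane.center)

# A tabletop detected ~0.75 m up and 1.2 m in front of the user:
table = DetectedPlane(center=Pose((0.0, 0.75, -1.2), (0.0, 0.0, 0.0, 1.0)),
                      extent=(1.0, 0.6))
anchor = anchor_on_plane(table)
```

The point of the anchor abstraction is that the platform, not the app, is responsible for keeping the pose consistent as tracking drifts and corrects.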

As far as how Snapdragon Spaces relates to other platforms out there, it’s cross-platform and cross-device. Many of the tools available today are mainly made for one device, but with Snapdragon Spaces, the developer creates the app once, and it adapts across the hardware.

Additionally, Snapdragon Spaces extends across multiple realities: AR, VR, and MR. We follow an open-platform standard called OpenXR, which can go across devices, follows standards, and adapts to AR and MR. This enables us to do novel things like smartphone AR with the use of glasses.

Can you give me a few examples of products created on Snapdragon Spaces?

Hugo Swart: One is an enterprise AR product from a company called Sphere, which enables immersive collaboration with workers in the field. Users can perform maintenance on a machine, and if the field worker cannot resolve the problem, they can create a session with a more experienced person in the office, who can see the field worker’s view remotely through the AR glasses. The remote worker can annotate the field worker’s interface or even translate in real time if the two speak different languages.

There’s another company called Talespin that uses VR to conduct soft skills training, like conflict resolution, using role-playing with virtual humans. It’s such a rich application.

A great consumer application of XR is called Tripp, which is a meditation and well-being app. It immerses you in a calming visual environment along with guided audio that helps you to relax and focus on wellness.

One of the biggest value adds to the latest version of Snapdragon Spaces involves a more seamless connection between headworn devices and 2D smartphones. Can you elaborate on this a little bit?

Hugo Swart: Building apps for AR from scratch is a time-consuming process for developers, much more so than building ones for 2D screens. But with Snapdragon Spaces Dual Render Fusion, developers can extend their 2D mobile applications into world-scale 3D spatial experiences without any prior XR experience required.

The beauty of this functionality is that developers can distribute the UI/UX between the 2D smartphone touchscreen and the AR glasses as they desire, so mobile app developers can now explore adding AR to existing mobile applications. This lowers the barrier to adoption, making AR development accessible to a wider range of developers than ever before.
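Conceptually, Dual Render Fusion means one application driving two render targets at once: the phone keeps its familiar 2D UI, while the glasses receive the 3D layer. Here is a hedged toy sketch of that split in Python; all class and function names are hypothetical (the real feature is delivered through the Unity and Unreal SDKs):

```python
# One app, two render targets: touch UI stays on the handset, spatial
# content goes to the headworn device. Names are illustrative only.

class PhoneDisplay:
    def __init__(self):
        self.frames = []
    def draw_ui(self, widgets):
        self.frames.append(("2D", widgets))

class GlassesDisplay:
    def __init__(self):
        self.frames = []
    def draw_scene(self, objects):
        self.frames.append(("3D", objects))

def render_frame(phone, glasses, app_state):
    # The developer decides how the UI/UX is distributed between displays.
    phone.draw_ui(app_state["ui"])
    if glasses is not None:          # no glasses: app still runs as plain 2D
        glasses.draw_scene(app_state["scene"])

phone, glasses = PhoneDisplay(), GlassesDisplay()
render_frame(phone, glasses, {"ui": ["play", "settings"],
                              "scene": ["3d_model"]})
```

The `if glasses is not None` branch captures why this lowers the barrier: the existing 2D app keeps working unchanged, and the spatial layer is purely additive.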

One might argue that spatial computing and the metaverse break down human interaction. From your POV, what potential do you see XR having in connecting people?

Hugo Swart: Like I said before, I think it has the capacity to be totally transformational, bringing people around the world closer than ever.

I see a world where headsets will allow my three kids to interact with their cousins who live in Brazil and the Netherlands, who they rarely get to see. Then suddenly, it’s like they’re all in the same room.

Then on an enterprise level, we at Qualcomm work internationally. We always find so much value in being able to feel like we’re sitting in the same place, with the whiteboard, having a chat.

Can you describe the intent and the elements of the Snapdragon Spaces Pathfinder Program?

Hugo Swart: The Pathfinder Program is a sort of fellowship that allows qualifying developers to foster a closer relationship with Qualcomm. Once developers are accepted to the program, we facilitate community, provide funding and dev kits, and run co-marketing and PR with them.

People can apply through our online dev portal. Developers bring so much value, and this program is a great way to incentivize them to come to us and to build a closer relationship with the ones who use what we have.

How does Qualcomm also use YouTube and Discord to build its engineering community?

Hugo Swart: Discord is a super popular tool for getting product feedback, building the developer community, and interacting across various topics. Qualcomm is using it quite heavily to facilitate cross-developer communication with the company included, and it’s proven to be very successful.

As far as YouTube goes, we use it to showcase what developers are doing: essentially, to promote their apps and to motivate and inspire others. Combined with Discord, we can foster a community of developers around Snapdragon Spaces.

Join the spatial computing revolution with Qualcomm’s Snapdragon Spaces.

This post is a sponsored collaboration between Qualcomm, Inc. and G/O Media Studios.
