Oct 1, 2018
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
Grab a headset and you can be immersed in experiences you’d never imagine and transported to places you never thought you'd see, from the front row at your favorite sporting event to the summit of Mount Everest. These experiences are enabled by the next revolutionary mobile computing platform: XR. Though still in its infancy, XR, an umbrella term for Virtual Reality (VR), Augmented Reality (AR) and all realities in between, is expected to overhaul industries and enable new experiences for our day-to-day lives.
As a leader in mobile technology, Qualcomm Technologies is committed to a future in which the best of intelligent XR features come together to support high-quality visuals, audio, and interactivity. In May, we announced the Qualcomm Snapdragon XR1 Platform — the world’s first dedicated XR platform — designed so OEMs can develop mainstream devices that provide high-quality XR experiences for everyone.
To learn more about the Snapdragon XR1 Platform and its role in the wider development and acceptance of XR, we spoke with the head of XR at Qualcomm Technologies, Senior Director of Product Management, Hugo Swart. He described what technological advancements are needed for XR to hit mainstream, how XR can affect industries, and what experiences it can enable for everyday people. Plus, he filled us in on Snapdragon XR1’s role in it all and what’s already being done with the platform today.
This interview has been edited for clarity and length.
How would you describe the term XR?
XR is an umbrella term that represents both VR and AR. Today, these are separate experiences, but one day they’ll converge. A single head-mounted display (HMD) will allow you to have both AR and VR experiences. If you think about the terms and the technology behind VR and AR, there are a lot of similarities: the way you track your body, your hands, your eyes, and some of the power constraints. The technology challenges for each are similar and often overlap, and the industry players and ecosystems are largely the same. That’s why we needed a term that covers both forms of immersive computing.
Where do you see XR devices going? What’s their future?
We see a journey where AR glasses and VR HMDs are going to evolve year over year. In five to ten years, we’ll start seeing converged products that can do both. In fact, we already see examples of AR glasses that enable some VR 360 experiences. Similarly, with VR HMDs, we’re starting to see some integrations of AR experiences with pass-through camera feeds. And further in the future, they’ll merge into one device. For that vision to materialize, we’ll need improvements in displays and optics that can offer a wide field of view approaching that of the human eye and can alternate between transparent and opaque modes. We’ll need improvements in processing, with a strong emphasis on computer vision and AI. And we’ll also need improvements in connectivity, with 5G integrated, enabling processing to be split between the device and the edge of the network.
Why did Qualcomm Technologies create the Snapdragon XR1 platform?
We created the Snapdragon XR1 as a single platform that addresses the minimum requirements for both VR and AR. As I mentioned, VR and AR leverage similar technical requirements, and both demand high-quality experiences. XR1 is a cost-effective platform that addresses these technical needs tailored for XR.
We’re currently powering the premium mobile VR and AR tier with our Snapdragon 800 platform series and saw a need for a tailored solution for XR that would be cost effective for what we call the “high-quality tier.”
What technological hurdles must be overcome before XR becomes more widespread?
There are three factors we need to look at: hardware, software, and content. For hardware, it comes down to display and optics, the progress in the silicon space, and a wearable form factor. Displays are one of the biggest challenges because they need extremely high resolution, a wide field of view, low latency, and vergence accommodation, among other things. At the silicon level, Qualcomm Technologies plays a big role in making more capable silicon platforms — meaning higher performance at a smaller size and lower power. All of this influences affordability and ease of use, making hardware the key for more people to have access to the technology.
With software, we need to improve algorithms like hand tracking, eye tracking, and common illumination (making real-world and virtual objects indistinguishable). If you have a pure digital image in a field of view where everything else is real, it’s going to look fake, so illumination is key for that perception of reality. Then there’s occlusion. Imagine you have a virtual object on a table in an AR experience. If someone walks in between that table and where you’re standing, the digital image needs to disappear. You shouldn’t see that visual image projected on the person in front of you. That’s what we call occlusion, and that needs to be resolved in real-time with low battery consumption.
And then, we need content that leverages all the advancements in hardware and software. Specifically, we need content that’ll be of value to consumers and enterprises that’ll drive everyday usage.
What role does artificial intelligence play in XR? Is progress in XR directly tied to advancements in AI?
Yes, and 5G too. XR, AI, and 5G go really well together. They need each other. Imagine you have this 360-degree display in front of your eyes and things are popping up while you work, walk, or drive. From a user perspective, the device needs to understand what’s relevant in a particular situation. AI provides contextual awareness. You also need AI for object recognition and for the digital overlays. If you’re walking down the street and want more information about a shop you see, you need AI to quickly recognize the shop and enable relevant pop-ups in your XR glasses. Plus, AI is going to be critical for things like hand tracking and voice input, which will rely on AI for those algorithms like natural language understanding. Using contextual awareness and AI, our glasses might know what we want or need to do next before our brains do.
5G is important because XR devices will need to have constant connection and will require extremely low latency and high bandwidth. As you move through your environment, very fast response times will be needed in order for these devices to keep up with the changes. Also with 5G, we’ll see more and more distributed computing between devices and edge of the network. AI and rendering are examples of tasks that are compute intensive and can be distributed between the device and 5G network.
What industries will be impacted by XR?
We anticipate that many, many industries will benefit from XR. Manufacturing is a great example, because there’s a lot of automation and workers can visualize all data from machines and processes as they walk the manufacturing floor. You can also think about field service technicians and how they’re dispatched to fix an appliance in someone’s home. Their XR glasses could identify the appliance, pull the instructions, and guide the technician through the possible issues, allowing someone more junior to do the repair and, if need be, get additional assistance from someone else in the office who can see what they see. Another industry is the medical field. A doctor could use XR to detect patterns on a patient and see recommendations for what the problem may be. Or, during surgery, it could make a person’s internal organs appear overlaid on the body for easier operating, or keep key patient data in the doctor’s field of view. It’s difficult to imagine an industry that could not be impacted by XR.
What other experiences will XR enable?
Other industries will take XR to the everyday consumer. XR could lead to virtual classrooms, which would allow people to learn from anywhere while still having classmates and a teacher. Another example is tourism. If you want a sneak peek of a place to make sure it’s where you want to go, you could do that with XR. Or if you want to go somewhere but you physically can’t, you could still experience it through XR. Think of a trip to Rome at the Colosseum — with XR, you could see not just the ruins of today but also a digital overlay of what it looked like when it was operational.
Will XR ever be as ubiquitous as mobile phones?
Maybe more. It’s certainly possible that everyone is going to have a pair of XR glasses they wear all day, and it’s going to be the hardware that will satisfy all our compute needs. It can substitute for smartphones, PCs, and TVs — essentially anything with a display. There’ll be a period of several years when XR glasses serve as a companion device to the smartphone, but as technologies evolve and improve, we predict more and more usage shifting to the glasses form factor. That’s the vision.
Who’s using the XR1 platform? What are they doing with it?
At AWE, we announced five customers that are currently developing with the XR1 platform, and since then additional customers have begun using XR1. These customers are working on a variety of form factors, from mixed reality headsets to VR headsets, and developing a wide range of applications, including indoor enterprise scenarios for design, architecture, and remote field assistance.
How do you think the Snapdragon XR1 Platform will impact the development and adoption of XR?
What’s currently available isn’t meeting consumer needs — 46 percent of those surveyed by Technalysis Research prefer a standalone AR/VR device, but price was keeping people from accessing this technology. In total, 45 percent of survey respondents said XR devices are too expensive. We recently conducted some research to support these claims and plan to share it with the industry at an upcoming webinar. We believe that XR1 is the platform that will enable mainstream XR products to become available for everyone.
XR1 has the power to change this and bring high-quality experiences to the masses. From playing games and watching sports to capturing and sharing their own experiences, XR and XR1 will break down the barriers. By 2023, it’s estimated there will be 186 million standalone AR and VR devices in use.*