Jan 18, 2018
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
Imagine what it’ll be like when you can blur the lines of your physical and digital worlds, allowing you to experience unrestrained virtual reality (VR) and enhance what you see around you with augmented reality (AR).
As we discussed at the Augmented World Expo earlier this year, "XR" is an emerging umbrella term that encapsulates AR, VR, and everything in between. XR is a mobile market that’s gaining momentum: the combined VR and AR markets may reach $108 billion by 2021, according to TechCrunch.
Recognizing this, we’ve launched over 20 XR devices with our customers, spanning standalone HMDs and XR-capable smartphones, and many more are currently under development. We’re working closely with some of the world’s most influential XR companies, like Google, HTC Vive, and Oculus, and are supporting Facebook, Samsung, ODG, and many others. In addition, our HMD Acceleration Program (HAP) helps customers commercialize quickly using a reference design, qualified ODMs, and technology collaborators.
To make this possible, our new Qualcomm Snapdragon 845 Mobile Platform integrates the latest Qualcomm Adreno 630 visual processing subsystem architecture which introduces outstanding integrated graphics, video, and display processing technologies with cutting-edge performance. This can change the XR landscape by transforming entertainment, education, and social interaction into highly immersive experiences featuring intuitive interactions, stunning visuals, and realistic sounds.
Adreno 630 is also remarkably efficient, delivering 30 percent faster graphics performance, 30 percent better power efficiency, and more than twice the display throughput of our previous generation.
Room-scale 6DoF with SLAM
Snapdragon 845 is the first mobile platform to enable six degrees of freedom (6DoF) head tracking with simultaneous localization and mapping (SLAM).
When you’re experiencing VR, the platform must create and update a map of your physical environment in real time while simultaneously keeping track of your location within it.
Room-scale 6DoF with SLAM on the Snapdragon 845 uses advanced algorithms to help you see and avoid walls and obstacles in confined areas. You can also understand the room’s size, scan for objects in the room to avoid them, and integrate real-world objects into your virtual world. If you’re a “Trekkie”, you’ll notice some similarities to the Enterprise’s holodeck.
While delivering up to 4 million pixels per eye, the Snapdragon 845 manages SLAM across several heterogeneous engines, including the Qualcomm Spectra 280 ISP, Qualcomm Hexagon 685 DSP, and Qualcomm Kryo 385 CPU.
SLAM is also an essential technology for AR to accurately position virtual objects in tracked scenes, helping to ensure correct visualization and positioning in your real-world environment.
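Conceptually, the SLAM loop alternates between predicting your pose from motion data and correcting it when a mapped feature is re-observed. Here’s a deliberately tiny 2D sketch of that idea; the class, its numbers, and the simple averaging correction are illustrative assumptions, not the Snapdragon implementation:

```python
import math

class ToySlam:
    """Toy 2D illustration of the SLAM idea: track a pose while
    building a landmark map, and correct drift when a known
    landmark is re-observed. Purely illustrative."""

    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # pose estimate
        self.landmarks = {}  # id -> estimated world position

    def predict(self, forward, turn):
        # Dead-reckon the pose from motion input (e.g., IMU odometry).
        self.heading += turn
        self.x += forward * math.cos(self.heading)
        self.y += forward * math.sin(self.heading)

    def observe(self, lm_id, rel_x, rel_y):
        # Transform a body-frame observation into world coordinates.
        c, s = math.cos(self.heading), math.sin(self.heading)
        wx = self.x + c * rel_x - s * rel_y
        wy = self.y + s * rel_x + c * rel_y
        if lm_id in self.landmarks:
            # Re-observation: nudge the pose toward consistency with
            # the stored map (a stand-in for a real filter update).
            mx, my = self.landmarks[lm_id]
            self.x += 0.5 * (mx - wx)
            self.y += 0.5 * (my - wy)
        else:
            self.landmarks[lm_id] = (wx, wy)
```

In a real system, the prediction comes from fused camera and IMU data and the correction is a proper probabilistic update, but the predict/observe/correct structure is the same.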
Accelerating hand-tracking and voice commands with the tech ecosystem
We collaborate with companies like Leap Motion to achieve the best hand-tracking accuracy at the lowest latency possible on Snapdragon 845. We recently demonstrated this with Think F.A.S.T., a cutting-edge VR application that showed how VR technology can be used to educate people about stroke and, ultimately, help save lives. The app features advanced hand tracking that lets you see your hands as you operate an intuitive interface and make selections as you progress through your training module.
Snapdragon 845 also simplifies your XR experience by listening for voice command keywords. During Think F.A.S.T. training, voice input lets you and your patient communicate. Think F.A.S.T. is a great example of how education and enterprise training will be taken to the next level with VR on Snapdragon 845.
Engineering Adreno Foveation
Snapdragon 845 introduces Adreno Foveation, a digital image processing technique that, compared to the previous generation, substantially reduces power consumption, boosts XR application performance, and improves visual quality where it counts: within the small fovea region of your vision.
Adreno Foveation combines a unique kind of graphics rendering with eye tracking to understand where your eyes are looking, then concentrates graphics resources on that area for the sharpest visuals. It renders objects within your fovea’s field of view in full detail, and objects outside that region with progressively less detail.
In the above video, Adreno Foveation reduces the GPU’s workload by prioritizing image resolution based upon the user’s fixation point.
Adreno Foveation consists of multiple technology advancements: multi-view rendering, tile-based foveation, and fine-grain preemption.
Multi-view rendering – Conventionally, today’s VR HMDs render each eye buffer independently. With multi-view rendering, objects are rendered once to the left eye buffer and then automatically duplicated to the right buffer, with appropriate modifications for position and for view-dependent variables like reflections.
Tile-based foveation – We’ve developed this with our software collaborators to conceptually break up your scene into tiles for processing. The level of detail will be greatest in the tiles where you’re directly looking and the surrounding tiles will be less clear, with the tiles at the edge of the scene receiving the least amount of detail.
Fine-grain preemption – This is a means of efficiently interrupting an executing process to start a higher-priority process with minimal delay, then efficiently resuming the previous process once the interrupting one completes. This yields improved results, particularly for real-time, latency-critical XR applications.
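To make the tile idea concrete, here’s a small sketch that assigns each screen tile a level of detail based on its distance from the gaze point. The grid size, distance metric, and LOD cap are made-up parameters for illustration, not the Adreno pipeline:

```python
def tile_lod_map(cols, rows, gaze_col, gaze_row, max_lod=2):
    """Illustrative tile-based foveation: give each screen tile a
    level of detail (LOD 0 = full resolution) that coarsens with
    distance from the tile the user is looking at."""
    lods = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Chebyshev distance in tiles from the fixation tile.
            dist = max(abs(c - gaze_col), abs(r - gaze_row))
            # Full detail at the gaze tile, coarser further out.
            row.append(min(dist, max_lod))
        lods.append(row)
    return lods
```

The renderer would then shade LOD-0 tiles at full resolution and higher-LOD tiles at reduced resolution, which is where the power and performance savings come from.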
New directions in augmented reality
Voice UI will be important for the next generation of XR experiences, especially for AR. Upcoming AR glasses can have an always-listening component with the ability to recognize key phrases like, “Hey Snapdragon” to either wake up the glasses or invoke a specific action. Future AR glasses may also use Snapdragon computer vision and facial recognition technologies along with a trained, on-device neural network, to identify people in the room. With that contextual information, apps for AR glasses could either recommend actions, or automatically take actions, like displaying useful information about the people in the room.
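As a rough sketch of the always-listening idea, the snippet below matches a wake phrase word by word in a stream of already-transcribed words. The class name and matching logic are illustrative assumptions, not an actual Snapdragon voice API:

```python
class WakeWordListener:
    """Minimal sketch of wake-phrase activation, assuming incoming
    speech has already been transcribed to words. The phrase
    mirrors the post's "Hey Snapdragon" example."""

    def __init__(self, phrase="hey snapdragon"):
        self.keywords = phrase.lower().split()
        self.matched = 0  # how many keywords matched so far

    def hear(self, word):
        # Advance on the next expected keyword, otherwise reset
        # (allowing the word itself to restart the match).
        if word.lower() == self.keywords[self.matched]:
            self.matched += 1
        else:
            self.matched = 1 if word.lower() == self.keywords[0] else 0
        if self.matched == len(self.keywords):
            self.matched = 0
            return True  # wake the glasses / invoke the action
        return False
```

Real always-on keyword spotting runs on raw audio with a low-power model rather than on transcripts, but the stream-and-trigger pattern is the same.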
AR could also change the way you shop and make mobile payments. During last month’s Snapdragon Tech Summit, we worked with ODG to demonstrate the use of glasses in a virtual shopping environment. The demo illustrated how you could add items to your cart by simply looking at them and tapping your glasses. When it was time to pay, the glasses performed iris authentication to scan your eyes, confirm your identity, and release your funds for payment.
Exploring 3D audio
Snapdragon 845 also supports 3D audio to further immerse you in VR environments, allowing you to hear sounds like you’d hear them in real life. For example, 3D audio reflects the physical composition of your environment. If you’re standing in a virtual cave, you’ll hear the reverb and reflections of your voice just as you would in a real-world cave.
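A simple way to picture positional 3D audio: each ear receives the sound with its own gain and arrival time, derived from that ear’s distance to the source. The sketch below is a toy model with made-up constants and a plain inverse-distance falloff, not the Snapdragon audio pipeline:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s
EAR_SPACING = 0.18      # m, rough head width (illustrative)

def spatialize(source_x, source_y, gain=1.0):
    """Toy positional audio for a listener at the origin: compute
    per-ear gain (distance attenuation) and arrival delay (the
    interaural time difference that cues direction)."""
    ears = {"left": (-EAR_SPACING / 2, 0.0),
            "right": (EAR_SPACING / 2, 0.0)}
    out = {}
    for name, (ex, ey) in ears.items():
        d = math.hypot(source_x - ex, source_y - ey)
        out[name] = {
            "gain": gain / max(d, 1.0),     # inverse-distance falloff
            "delay_s": d / SPEED_OF_SOUND,  # arrival time at this ear
        }
    return out
```

A source off to one side arrives at the nearer ear slightly earlier and louder, which is the core cue your brain uses to localize sound; reverb like the cave example layers delayed, attenuated copies on top of this.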
Adreno 630 represents a truly dynamic visual processing subsystem and we’re very excited about the new experiences Snapdragon 845 can deliver to you in 2018.