Jan 7, 2021
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
The market for touchless human machine interaction (HMI) and gesture control technology is undergoing huge growth and adoption, with some estimates predicting it could be worth around $30.6 billion by 2025. Numerous factors are driving this growth, including the evolution of the underlying technology, the demand for touchless IoT systems (e.g., for hygienic reasons during the pandemic), and a general push towards untethered experiences, especially in extended reality (XR), virtual reality (VR), and augmented reality (AR).
Within this segment, hand tracking mechanisms in particular are evolving rapidly. Hand tracking captures movements of the user’s hands as digital data that developers can use for HMI (e.g., to build touchless user interfaces). Hand tracking is also a key component of spatial computing, which takes advantage of 3D space (e.g., in XR) to present user interfaces that more closely align with how humans naturally interact with our real, 3D world.
Over the last year, a team from Qualcomm Technologies, Inc. (QTI) has been working closely with Ultraleap (formerly Leap Motion and Ultrahaptics), an ecosystem player and a world leader in tracking and haptics technologies, to integrate Ultraleap’s hand tracking into our XR solutions.
In this blog we’ll provide a brief introduction to the ins and outs of hand tracking, take a look at Ultraleap’s innovative solutions, and explore how the collaboration between Ultraleap and QTI can benefit XR developers.
Reaching into a Whole New World
Spatial computing allows user interactions to go beyond traditional computer peripherals and touchscreens: the environment near and around the user becomes a canvas for interactions and sophisticated user interfaces. On this canvas, interactions consist primarily of direct manipulation of virtual objects (e.g., pushing a virtual button). Manipulating virtual objects with our hands, which draws on a lifetime of physical experience, minimizes the learning curve for new users.
Hand tracking also makes virtual interactions more functional, realistic, and immersive in real time. Objects can be manipulated with a broad range of push, pinch, and grab interactions – whichever feel most natural to the user. Virtual objects resemble physical ones, and convey their use clearly through affordances (i.e., properties of objects that communicate to users how the objects can be used). For example, grooves in the surface of a ball encourage users to place their fingers there to perform a pickup interaction. These interactions can also be supplemented by abstract poses or gestures, such as fingers out, palm up to activate a menu, or holding up a certain number of fingers to trigger scripted actions.
As a result, a number of new and compelling interactions are now possible, many of which are particularly well suited for controlling casual interfaces and menus, physically interacting with virtual objects, and expressing oneself in social VR. Some examples include:
Pushing virtual buttons or other UI elements using fingers
Pinching virtual objects for precise physical control (e.g., pinch and release a menu item, stretching or pulling a small target, object, or interactive anchor on a larger object)
Objects that can be grabbed, pushed, thrown, picked up, moved, dropped, torn in pieces and much more in whatever way feels most natural to the user
Virtual objects can be manipulated in ways that are not possible with physical objects, such as resizing them by hand or storing them in infinite, virtual pockets
Distinctive, dedicated abstract hand poses or gestures to perform specific functions such as opening a menu
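To give a flavor of how such interactions are typically implemented, a pinch can be recognized by thresholding the distance between the thumb and index fingertips. The sketch below is a generic illustration of this idea; the function names, coordinates, and the 25 mm threshold are our own assumptions, not part of any Ultraleap API:

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))

def is_pinching(thumb_tip, index_tip, threshold_mm=25.0):
    """Treat thumb and index fingertips closer than the threshold as a pinch.

    The 25 mm default is an illustrative value, not a tuned constant.
    """
    return distance(thumb_tip, index_tip) < threshold_mm

# Fingertips about 10 mm apart: counts as a pinch.
print(is_pinching((0.0, 0.0, 0.0), (10.0, 0.0, 0.0)))  # True
# Fingertips about 80 mm apart: open hand.
print(is_pinching((0.0, 0.0, 0.0), (80.0, 0.0, 0.0)))  # False
```

In practice a tracking engine would also smooth the signal over several frames and use hysteresis (separate engage/release thresholds) so the pinch state does not flicker near the boundary.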
Hand tracking has the added benefit of making applications more accessible to people who may not be used to working with traditional peripherals (e.g., hand-held VR controllers). Hand tracking can significantly reduce the learning curve and means that anyone can reach in and immediately interact intuitively with virtual content. A great example is the use of Ultraleap’s technology to allow visitors of all ages to interact at a theme park attraction.
Elements and Challenges of Hand Tracking
Hand tracking requires two fundamental elements: sensors to track the joints of the hands, and software to transform and analyze the sensor data. Over the years, data collection has been accomplished through various peripherals including sensored gloves and handheld controllers, but neither offers an experience that is as natural as simply moving one’s hands and fingers. Today’s state-of-the-art camera-based hand-tracking solutions help address this.
A camera-based hand tracking solution captures image data of the user’s hands and fingers (e.g., depth maps, stereo infrared images), then uses algorithms to analyze the stream and convert it into the orientations and positions of the joints and bones across frames.
Hand tracking is inherently complex due to the number of elements that must be identified and tracked in real time. Each hand contains 27 distinct bones and joints, all of which can move at the same time. Real-world objects can occlude the hands from the camera, poor lighting can degrade the image, or a user’s hands can simply move out of the camera’s field of view.
Overcoming these issues while maintaining high performance and low motion-to-photon latency (i.e., the lag between when the user moves their hands and when that movement is rendered on screen) is a key challenge for any hand tracking solution, especially in reducing cybersickness.
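To make the latency constraint concrete, motion-to-photon latency can be thought of as the sum of the pipeline stages between hand movement and photons leaving the display. The stage timings below are illustrative assumptions for the sake of the arithmetic, not measured figures for any particular platform:

```python
# Illustrative motion-to-photon budget; every value is an assumption, in ms.
stages = {
    "camera exposure + readout": 5.0,
    "hand tracking compute": 5.0,
    "application + render": 8.0,
    "display scan-out": 2.0,
}

total_ms = sum(stages.values())
print(f"motion-to-photon: {total_ms:.1f} ms")  # motion-to-photon: 20.0 ms

# At a 90 Hz display, each frame lasts about 11.1 ms, so a 20 ms
# pipeline corresponds to roughly 1.8 frames of perceived lag.
frame_ms = 1000.0 / 90
print(f"frames of lag: {total_ms / frame_ms:.1f}")  # frames of lag: 1.8
```

The point of the exercise: shaving even a few milliseconds off any single stage (e.g., the tracking compute) has a visible effect on how "attached" virtual hands feel to the user's real ones.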
Ultraleap’s initial hand-tracking solution was their Leap Motion Controller, an innovative device equipped with two infrared cameras and multiple LEDs. Users simply move their hands over the device, while the LEDs illuminate the hands with infrared light. The device’s infrared cameras then record that light, turning it into hand tracking data:
The Leap Motion Controller can also be used for XR, by mounting it onto an XR headset so that the device’s LEDs and cameras follow the user’s head movements.
The algorithms for analyzing this data and converting it into hand tracking information are provided by Ultraleap’s hand-tracking platform (currently on its fifth generation, called Gemini). Developers can access the information provided using Ultraleap’s SDKs, which are available for numerous languages and platforms including Java, C++, Unity, and others. The Unity plugin includes Ultraleap’s powerful Interaction Engine, a high-level set of abstractions through which developers can build physics-based user interfaces such as 3D menus that can be used in spatial computing.
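The exact API surface varies by SDK language and version, so the sketch below is a generic illustration of the kind of hand/finger/bone hierarchy such SDKs expose, not Ultraleap's actual classes; every type and name here is a hypothetical stand-in, and the real Ultraleap SDK documentation should be consulted for the true API:

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for an SDK's hand-tracking data model.

@dataclass
class Bone:
    name: str
    next_joint: tuple  # (x, y, z) position of the joint at the bone's far end

@dataclass
class Finger:
    name: str
    bones: list = field(default_factory=list)  # ordered base -> tip

@dataclass
class Hand:
    is_left: bool
    fingers: list = field(default_factory=list)

def fingertip_positions(hand):
    """Return the end joint of each finger's last (distal) bone, i.e. the tip."""
    return {f.name: f.bones[-1].next_joint for f in hand.fingers}

# Build a toy frame: one right hand with a single index finger.
index = Finger("index", bones=[
    Bone("proximal", (0, 40, 0)),
    Bone("intermediate", (0, 65, 0)),
    Bone("distal", (0, 80, 0)),
])
hand = Hand(is_left=False, fingers=[index])
print(fingertip_positions(hand))  # {'index': (0, 80, 0)}
```

An application layer like the Interaction Engine builds on exactly this kind of per-frame skeleton: fingertip and palm positions feed collision and grab heuristics so that UI elements respond to pushes and pinches without the developer handling raw joint data.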
The Stereo IR 170 is Ultraleap’s next-generation optical hand-tracking module, with a wider field of view and longer tracking range. It is designed to be integrated into hardware solutions, displays, installations, and AR/VR/XR headsets.
Given the success of Ultraleap’s past solutions, Ultraleap has now made Gemini available separately from their hardware for the first time, and we’ve been working with Ultraleap to integrate it into the Qualcomm® Snapdragon™ XR2 5G Platform.
This will help bring class-leading hand tracking to our current premium XR platform, allowing XR developers to build their own untethered devices that take advantage of Gemini. Developers have the option to build devices around the Ultraleap Stereo IR 170 or use different camera systems such as the external cameras found on the Snapdragon XR2 HMD Reference Design.
Hand tracking plays a vital role in HMI today by opening up new ways of interacting. The integration of Ultraleap’s Gemini platform into the Snapdragon XR2 5G Platform will now put class-leading, high-performance hand tracking functionality into the hands of many more XR developers.
For additional hand tracking use cases, check out the following from Ultraleap:
Industrial design with R3DT
For additional information, be sure to check out the following resources:
Qualcomm Snapdragon is a product of Qualcomm Technologies, Inc. and/or its subsidiaries.