Qualcomm’s visual inertial sensor fusion effort was initiated with the intent of developing a robust visual inertial tracker using the sensors found on most mobile phones today.

The visual inertial tracker estimates inertial sensor calibration parameters (such as biases), the gravity vector, the locations of feature points, and the pose of the phone in space, using measurements from the inertial sensors and the camera.
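
As a rough illustration of the kind of state such a tracker maintains, the sketch below shows a simplified EKF-style state vector and an IMU propagation step. The field names, the first-order kinematics, and the gravity convention are assumptions made for illustration; this is not Qualcomm's implementation.

```python
# Illustrative only: a simplified visual inertial tracker state and IMU propagation step.
import numpy as np
from dataclasses import dataclass, field

@dataclass
class TrackerState:
    p: np.ndarray = field(default_factory=lambda: np.zeros(3))   # phone position (world frame)
    v: np.ndarray = field(default_factory=lambda: np.zeros(3))   # velocity (world frame)
    R: np.ndarray = field(default_factory=lambda: np.eye(3))     # orientation, body -> world
    bg: np.ndarray = field(default_factory=lambda: np.zeros(3))  # gyroscope bias
    ba: np.ndarray = field(default_factory=lambda: np.zeros(3))  # accelerometer bias
    g: np.ndarray = field(default_factory=lambda: np.array([0.0, 0.0, -9.81]))  # gravity estimate
    features: dict = field(default_factory=dict)                 # feature id -> 3D point (world)

def skew(w):
    """Cross-product matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def propagate(state, gyro, accel, dt):
    """Dead-reckon the pose from one IMU sample, correcting for the estimated biases."""
    w = gyro - state.bg            # bias-corrected angular rate (body frame)
    a = accel - state.ba           # bias-corrected specific force (body frame)
    # First-order orientation update; a real tracker would use a proper quaternion integrator.
    state.R = state.R @ (np.eye(3) + skew(w) * dt)
    a_world = state.R @ a + state.g   # remove gravity using the current gravity estimate
    state.p = state.p + state.v * dt + 0.5 * a_world * dt**2
    state.v = state.v + a_world * dt
    return state
```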

As part of this effort, a new filter has been designed to function in a wide range of feature-sparse environments, enabling a variety of use cases. For example, the filter can be initialized with only a very small number of feature points in the camera view, and it can continue to track over short periods of time even when all feature points are lost. The filter can also handle non-stationary feature points and feature points on T-junctions.
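
The sketch below, which builds on the propagation step above, illustrates one way such behavior could be structured: the IMU always drives the prediction, observations that disagree with the predicted state (for example, moving points or spurious corners at T-junctions) are gated out, and the visual update runs only when a handful of consistent features remain. The gating callback, update callback, and thresholds are placeholders for illustration, not the actual filter logic.

```python
# Illustrative sketch of keeping the filter running when features are sparse or lost.
def process_frame(state, imu_samples, observations, dt_imu, update_fn, gate_fn):
    # Always propagate with the IMU; this carries the tracker through short
    # stretches in which every feature point has been lost.
    for gyro, accel in imu_samples:
        state = propagate(state, gyro, accel, dt_imu)

    # Reject observations that disagree with the predicted state, e.g. points
    # that are moving (non-stationary) or spurious corners at T-junctions.
    consistent = [obs for obs in observations if gate_fn(state, obs)]

    # Update only when at least a very small number of consistent features
    # remain; otherwise the IMU-only prediction stands until features reappear.
    if len(consistent) >= 2:   # illustrative threshold
        state = update_fn(state, consistent)
    return state
```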

In addition, the system has been designed by Qualcomm Research to support real-time applications at 30 Hz. To support this, the tracker's run time is optimized to use less than one core of a modern smartphone CPU.
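
As a rough sketch of what a 30 Hz budget implies, the loop below gives each camera frame roughly 33 ms in which the buffered IMU samples are integrated and the visual update runs, then sleeps off whatever remains of the budget. The camera, imu, and tracker_step interfaces are hypothetical placeholders, not an actual API.

```python
# Illustrative 30 Hz processing loop with a ~33 ms per-frame budget.
import time

FRAME_PERIOD = 1.0 / 30.0   # seconds per camera frame

def run_tracker(state, camera, imu, tracker_step):
    while True:
        t_start = time.monotonic()
        frame = camera.get_frame()                    # latest camera image (placeholder)
        samples = imu.get_samples_since_last_frame()  # buffered IMU data (placeholder)
        state = tracker_step(state, frame, samples)
        # Sleep off the remainder of the frame budget so the tracker stays
        # well under one CPU core at 30 Hz.
        elapsed = time.monotonic() - t_start
        time.sleep(max(0.0, FRAME_PERIOD - elapsed))
```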

Additional applications are envisioned as part of this project. These include inertial sensor calibration, camera applications (e.g., de-blurring, which requires the motion trajectory of the phone), augmented reality games (e.g., augmenting planes onto the phone screen so that they appear to attack while the user tries to shoot them down), and indoor navigation.