Oct 11, 2018
Qualcomm products mentioned within this post are offered by Qualcomm Technologies, Inc. and/or its subsidiaries.
In our previous blog, Tips to Power Through Sensor Development Challenges, we saw how sensors provide approximate measurements of real-world phenomena. Different types of sensors can exhibit inaccuracies stemming from factors such as their design and the environmental conditions in which they operate. We also touched upon solutions for overcoming these challenges, such as sensor fusion: the practice of gathering and processing data from multiple sensors.
In this blog we’ll dive deeper into sensor fusion to look at some use cases and explore implementation options.
As developers, our goal is to improve upon the often noisy and inaccurate data returned from sensors. Sensor fusion is a common solution that feeds multiple, often different, types of sensor data into algorithms to produce better data. It turns out that sensor fusion can also do a lot more.
With the right algorithm, sensor fusion can aid in making predictions, generate inferences from incomplete data, introduce redundancy and fault tolerance, extrapolate human-like and contextual information, and set the stage for more sophisticated data analysis.
Applications for sensor fusion span multiple areas of technology including business, robotics, IoT, gaming, and transportation. For example, a game can fuse data from sensors measuring a player’s heartbeat and movements to infer their current mood. An app can approximate a GPS location when GPS coverage is lost by using other sensors and historical movement data. And spatial coverage of a fighter jet’s radar can be extended by fusing radar data from other nearby jets flying in formation. The characteristics of sensor fusion can be categorized in several ways:
Data sources for fusion can be direct or indirect.
Fusion processes can be categorized as low level, intermediate, and high level.
A sensor configuration can also be categorized as complementary or competitive.
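To make the competitive configuration concrete, here is a minimal Python sketch (the function name and sensor values are our own illustration, not from any SDK) that fuses two redundant readings of the same quantity using inverse-variance weighting, so the noisier sensor gets less influence and the fused estimate is less uncertain than either input:

```python
def fuse_competitive(readings, variances):
    """Fuse redundant (competitive) readings of the same quantity.

    Each reading is weighted by the inverse of its variance, so
    noisier sensors contribute less to the fused estimate.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * r for w, r in zip(weights, readings)) / total
    fused_variance = 1.0 / total  # lower than any individual variance
    return fused, fused_variance

# Two temperature sensors measuring the same room:
# sensor A reads 21.8 with variance 0.4, sensor B reads 22.4 with variance 0.1.
temp, var = fuse_competitive([21.8, 22.4], [0.4, 0.1])
print(temp, var)  # fused estimate 22.28, variance 0.08
```

Note that the fused variance (0.08) is lower than that of even the better sensor (0.1), which is the payoff of a competitive configuration: redundancy improves confidence as well as accuracy.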
A fusion algorithm is the key component in sensor fusion because it takes the sensor data as input, then outputs refined data that is often more informative, accurate, and useful than the data from any individual input. Algorithms can vary by the number of inputs and outputs and can be chained together such that each successive algorithm further refines the data.
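The chaining idea can be sketched as follows. In this hedged Python example (the function names, window size, and blending constant are illustrative choices, not a reference implementation), a simple moving-average smoother first cleans noisy accelerometer tilt angles, and its output then feeds a complementary filter that fuses them with integrated gyroscope rates; each stage further refines the data:

```python
def smooth(samples, window=3):
    """Stage 1: moving-average smoother to suppress high-frequency noise."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i + 1 - lo))
    return out

def complementary_filter(accel_angles, gyro_rates, dt=0.01, alpha=0.98):
    """Stage 2: fuse accelerometer tilt (drift-free but noisy) with
    integrated gyro rate (smooth but drifting) via a weighted blend."""
    angle = accel_angles[0]
    fused = []
    for a, g in zip(accel_angles, gyro_rates):
        angle = alpha * (angle + g * dt) + (1 - alpha) * a
        fused.append(angle)
    return fused

# Chain the algorithms: raw accelerometer -> smoother -> fusion with gyro.
raw_accel = [0.0, 0.5, -0.3, 0.2, 0.1]   # noisy tilt angles (radians)
gyro = [0.0, 0.0, 0.0, 0.0, 0.0]         # angular rates (rad/s)
angles = complementary_filter(smooth(raw_accel), gyro)
print(angles)
```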
An algorithm may involve one or more processing phases, such as smoothing past states, filtering the current state, or predicting future states.
Fusion algorithms are sometimes based on formal fusion models, such as the JDL model from the US Joint Directors of Laboratories, and can be implemented on platforms like the Qualcomm Snapdragon 845 and its Qualcomm Kryo CPU or Qualcomm Hexagon DSP. Implementation on the Qualcomm Adreno GPU through the Adreno GPU SDK is also a good option, since fusion often involves matrix math.
A common sensor fusion algorithm is the Kalman Filter. It uses data measurements from multiple sources (e.g., sensors) acquired over time that are often noisy and inaccurate, and estimates values for variables (e.g., future GPS locations) that are more accurate than would be possible using any single measurement. It does this in two steps:
Predict: project the current state estimate and its uncertainty forward in time.
Update: blend the next measurement into that prediction, weighting each by its relative certainty.
Developers use the Kalman Filter to extract relatively accurate information when there is uncertainty (e.g., in scenarios where things are constantly changing), and also to reduce noise, bias, and accumulation errors. For example, a Kalman Filter can be used to estimate the position of an object over time when the GPS signal is lost using other sources such as the accelerometer, gyroscope, and compass sensors, along with historical data.
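The predict and update steps can be seen in a minimal one-dimensional Kalman Filter. This Python sketch tracks a stationary value from noisy readings; the constant-position model and noise values are illustrative assumptions, not tuned for any real sensor:

```python
def kalman_1d(measurements, meas_var, process_var, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman Filter: x is the state estimate, p its variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: constant-position model, so uncertainty simply grows
        # by the process noise.
        p = p + process_var
        # Update: the Kalman gain k balances confidence in the prediction
        # against confidence in the new measurement z.
        k = p / (p + meas_var)
        x = x + k * (z - x)
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Noisy readings of an object sitting at position 10.
noisy = [9.4, 10.6, 10.2, 9.8, 10.1]
print(kalman_1d(noisy, meas_var=1.0, process_var=0.01, x0=noisy[0]))
```

Because each iteration needs only the new measurement and the previous estimate and variance, the filter's memory footprint stays constant no matter how long it runs.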
Since the algorithm runs recursively and requires only the current measurement, the last estimated state, and the known uncertainties, it’s well suited for implementation as a run-time process on platforms like Snapdragon 845, and it can be further optimized for power efficiency using the Snapdragon Power Optimization SDK. The Kalman Filter can also play a role in Machine Learning as described in this whitepaper and can be implemented using our Qualcomm Snapdragon Math Libraries. Machine Learning and associated algorithms can also be accelerated on Snapdragon-based edge devices using the Qualcomm Neural Processing SDK for AI.
Since data collected from sensors approximates real-world phenomena and is often imperfect, it must be fused with other data to be meaningful in digital systems. Sensor fusion allows us to combine data from multiple sources to generate refined data. With algorithms such as a Kalman Filter running on platforms like the Snapdragon 845, developers can overcome imperfect data and even provide accurate predictions on IoT and mobile devices.
What are some areas where you think sensor fusion can help your development? Be sure to let us know!