According to Mamms Consulting, sensor fusion integrates data from multiple sensors for analysis and is rapidly being adopted in smartphones, wearables, automotive systems and the Internet of Things (IoT). Sensor fusion also injects vitality into emerging applications such as augmented reality (AR), virtual reality (VR) and autonomous vehicles.
First, it is important to understand the difference between sensor fusion and a sensor hub. Sensor fusion takes data from multiple sensors and cross-references the sources using software algorithms to create a consistent picture. For example, it combines data from accelerometers and gyroscopes to provide motion context awareness for fitness-tracking wearables.
A sensor hub, on the other hand, applies sensor fusion to raw readings and transforms them into meaningful information. It is usually a microcontroller (MCU) or microprocessor (MPU) that performs a specialized task, such as the step-counting function of a pedometer.
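As a minimal sketch of the accelerometer-plus-gyroscope fusion described above (not any vendor's actual algorithm), a complementary filter blends the gyroscope's smooth but drifting integrated angle with the accelerometer's noisy but drift-free tilt estimate. The function name and blend factor below are illustrative assumptions:

```python
import math

def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into one pitch estimate.

    pitch     : previous pitch estimate in radians
    gyro_rate : angular rate about the pitch axis (rad/s) from the gyroscope
    accel     : (ax, ay, az) accelerometer reading in g
    dt        : time step in seconds
    alpha     : blend factor; higher values trust the gyro more
    """
    ax, ay, az = accel
    # Gyro integration: smooth and responsive, but drifts over time.
    gyro_pitch = pitch + gyro_rate * dt
    # Accelerometer tilt from gravity: noisy, but drift-free.
    accel_pitch = math.atan2(-ax, math.hypot(ay, az))
    # Blend: high-pass the gyro path, low-pass the accelerometer path.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Raising `alpha` favors the gyro's responsiveness; lowering it favors the accelerometer's long-term stability.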
In the past few years, sensor fusion technology and sensor hubs have changed the design of smartphones, tablets, wearables, game controllers and IoT devices. Together they have become essential in optimizing sensor architectures and creating new experiences for mobile users.
It all started when smartphones began fusing data from sensors such as accelerometers, gyroscopes and magnetometers for navigation and activity-monitoring applications in the post-iPhone era. Since then, sensor fusion has enabled distinct user experiences by coordinating different sensor combinations across mobile platforms.
Smartphones and wearable devices
Almost all high-end Android smartphones now incorporate a sensor hub that connects accelerometers, gyroscopes and other sensors. In buildings where GPS signals are weak or unavailable, smartphone manufacturers use sensor fusion to improve location tracking and context awareness.
So it's no surprise that most new smartphones are designed with multiple sensors for accurate indoor positioning.
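One simplified way such indoor positioning can work without GPS is pedestrian dead reckoning: steps detected from the accelerometer advance a position estimate along a heading derived from the magnetometer and gyroscope. The function and fixed stride length below are illustrative assumptions, not a production algorithm:

```python
import math

def dead_reckon(position, heading_rad, num_steps, step_length_m=0.7):
    """Advance a 2-D position estimate by pedestrian dead reckoning.

    position      : (x, y) in metres
    heading_rad   : heading from magnetometer/gyro fusion, radians from east
    num_steps     : steps detected from the accelerometer
    step_length_m : assumed stride length in metres
    """
    x, y = position
    distance = num_steps * step_length_m
    # Project the walked distance onto the current heading.
    return (x + distance * math.cos(heading_rad),
            y + distance * math.sin(heading_rad))
```

Real systems refine this with map matching and periodic corrections (e.g. Wi-Fi fixes), since dead-reckoning error grows with every step.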
In the mobile space, the next frontier for sensor fusion technology is wearable devices. Here, sensor fusion is becoming a key technology for wearables across the fitness, healthcare and consumer markets.
Take Qualcomm's Snapdragon Wear 2500 chip as an example. This chip is designed for children's smart watches. It uses sensor fusion technology to provide position tracking instead of relying on a separate GPS device. The chip has a built-in sensor hub and is pre-integrated with sensor algorithms that allow wearable device manufacturers to add additional sensors.
Power consumption is a key constraint in compact wearable designs and continues to drive the development of energy-efficient algorithms. At the same time, sensor fusion algorithms play a vital role in enabling wearables for activity tracking, clinical trials, and AR/VR electronics with unprecedented precision and accuracy.
In sensor fusion design, algorithms that maintain position, orientation and attitude awareness lay the foundation for more complex data analysis. For example, positioning and tracking algorithms can fill in incomplete data, introduce redundancy and fault tolerance, and infer relevant information much as humans do.
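The redundancy and fault tolerance mentioned above are typically achieved with statistical estimators such as the Kalman filter. The scalar measurement update below is an illustrative sketch, not any product's implementation: each new reading pulls the estimate toward the measurement in proportion to how trustworthy it is, and the estimate's uncertainty shrinks after every update.

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman measurement update.

    x : current state estimate (e.g. position)
    p : variance (uncertainty) of that estimate
    z : new sensor measurement
    r : measurement noise variance of the sensor
    """
    k = p / (p + r)          # Kalman gain: how much to trust the measurement
    x_new = x + k * (z - x)  # move the estimate toward the measurement
    p_new = (1 - k) * p      # uncertainty shrinks after incorporating data
    return x_new, p_new
```

Fusing a second, redundant sensor is just another call to the same update, which is why a failed sensor can simply be skipped without losing the estimate.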
Mamms Consulting notes that designers working on tracking and navigation algorithms often build in-house software tools that are difficult to maintain and reuse. As a result, companies such as MathWorks offer toolsets that allow engineers to design, simulate and analyze systems that fuse data from multiple sensors.
MathWorks' Sensor Fusion and Tracking Toolbox lets engineers explore multiple designs without writing custom libraries. It helps them correlate data and evaluate fusion architectures using both real and synthetic data. The toolbox also includes multi-object trackers, sensor fusion filters, and motion and sensor models.
Multi-platform radar detection generation with MathWorks Sensor Fusion and Tracking Toolbox
The toolbox facilitates data synthesis for active and passive sensors, including radio frequency (RF), acoustic, infrared, GPS and inertial measurement unit (IMU) sensors, and provides tools for generating scenarios and trajectories. It also extends MATLAB-based workflows to help engineers develop accurate perception algorithms for sensor fusion systems.
Sensor chip supplier TDK InvenSense also offers sensor fusion algorithms and runtime calibration firmware. This, in turn, not only eliminates the need for discrete components, but also ensures that the calibration procedure and the sensor fusion algorithm complement each other to provide accurate and absolute positioning.
Similarly, sensor supplier Bosch Sensortec has partnered with InterDigital's software company Hillcrest Labs to provide a one-stop sensor fusion solution. For example, Bosch's BNO080 and BNO085 modules integrate a three-axis accelerometer, a three-axis gyroscope and a magnetometer, as well as a 32-bit Arm Cortex-M0+ microcontroller running Hillcrest's SH-2 firmware.
Bosch BNO085 System-in-Package (SiP) Module Runs Hillcrest's SH-2 Firmware
The SH-2 firmware includes MotionEngine software, whose advanced signal-processing algorithms turn sensor data into accurate real-time 3D orientation, heading, calibrated acceleration and calibrated angular velocity. MotionEngine uses advanced calibration and sensor fusion techniques to convert individual sensor readings into data for motion applications such as motion tracking, context awareness, advanced gaming and head tracking.
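To give a feel for the orientation output such a fusion stack produces: 3D orientation is commonly reported as a unit quaternion, from which heading (yaw) can be extracted. The helper below is an illustrative sketch of that standard conversion, not part of the SH-2 API:

```python
import math

def quaternion_to_heading(w, x, y, z):
    """Extract heading (yaw, rotation about the vertical axis) in degrees
    from a unit orientation quaternion (w, x, y, z)."""
    # Standard quaternion-to-yaw formula (Z axis pointing up).
    yaw = math.atan2(2.0 * (w * z + x * y),
                     1.0 - 2.0 * (y * y + z * z))
    return math.degrees(yaw)
```

Host software typically consumes this heading directly for compass, gaming or head-tracking features rather than re-deriving it from raw sensors.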
Sensors are almost ubiquitous, making sensor fusion an important part of mobile device design. As a result, sensor fusion technology will continue to evolve with the advent of new applications for smartphones, wearables and other mobile devices.
Sensor fusion has been part of smartphone and mobile-device design for more than a decade, but it can still play a larger role. For example, sensor fusion technology helps reduce noise and inaccuracy in the data sensors return. For today's popular voice-activated smart sensor platforms, this can greatly reduce processing time. In addition, Bluetooth-enabled smart headsets and wearables are adding end-to-end support for the Alexa Voice Service.