Although the concept of sensor fusion has been around for a while, we’re only now beginning to see real-world delivery on its promise. Indeed, sensor fusion is rapidly becoming a hot trend: with roots in mobile phone and portable technology, it is now spreading into context awareness for IoT sensors, the next generations of autonomous road vehicles, and drones.
However, this explosive growth brings a number of challenges as well as opportunities, not only from a purely technical standpoint, but also touching on privacy, safety, and even wider implications for future infrastructure development.
By definition, sensor fusion is relatively simple: essentially software that intelligently combines data from a range of sensors, then uses the result to improve performance. This could mean either using arrays of the same (or similar) type of sensor to obtain an extremely accurate measurement, or collating different types of sensor input to achieve something more complex.
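To make the redundant-array case concrete, here is a minimal sketch of fusing several noisy measurements of the same quantity by inverse-variance weighting, so that noisier sensors contribute less. The readings and noise figures are hypothetical illustration values, not from any real device:

```python
# Minimal sketch of redundant-sensor fusion; all values are hypothetical.

def fuse_redundant(readings, variances):
    """Inverse-variance weighted average: noisier sensors count for less."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * r for w, r in zip(weights, readings)) / total
    fused_variance = 1.0 / total  # always tighter than any single sensor
    return estimate, fused_variance

# Three temperature sensors measuring the same point, in deg C.
estimate, var = fuse_redundant([21.3, 21.9, 21.5], [0.04, 0.25, 0.09])
print(f"fused estimate: {estimate:.2f} degC (variance {var:.3f})")
```

The fused variance comes out smaller than that of the best individual sensor, which is precisely the accuracy gain the array buys.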
It doesn’t take an enormous amount of imagination to envision the potential applications of sensor fusion, and indeed the analysts are bullish. One recent report predicted that demand for sensor fusion systems will grow at a CAGR of roughly 19.4 percent over the next five years, from US$2,620 million in 2017 to US$7,580 million in 2023. Perhaps predictably, North America is the largest producer, holding 32.84 percent of the market in 2016, while Europe accounted for a further 31.51 percent.
While traditional uses of sensor fusion tended to lie in industrial applications, recent years have seen a major shift in the customer base: in 2016, a massive 54.86 percent of sensor fusion system demand was generated by the consumer electronics industry.
This enormous demand is being driven by the increasing utility of the sensor hub: a piece of MCU hardware that implements a specific sensor fusion algorithm for a specific set of sensors, as opposed to pure-play software sensor fusion, which can involve a vast range of sensor types and algorithms. The hardware-based sensor hub lifts a significant burden from the system CPU, which in itself is desirable in a host of modern devices, from smartphones to wearables. Indeed, lowering CPU load can extend battery life and reduce heat, both key enemies of the wearable and smartphone designer.
For instance, Google introduced the Android Sensor Hub, which is designed to hook directly into phone sensors such as biometric readers, accelerometers, and gyroscopes. This tiny processor, running Google’s custom algorithms, can independently interpret gestures and activities without consuming resources from the main CPU.
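The kind of algorithm such a hub runs can be surprisingly compact. As an illustration (a generic textbook technique, not Google’s actual implementation), a complementary filter fuses a gyroscope’s fast but drifting rate signal with an accelerometer’s noisy but drift-free gravity reference to track a tilt angle:

```python
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One update step fusing a gyro and an accelerometer for a tilt angle.

    The gyro integrates smoothly but drifts over time; the accelerometer's
    gravity vector is noisy but absolute. Blending the two cancels both flaws.
    """
    gyro_angle = angle + gyro_rate * dt          # fast response, drifts
    accel_angle = math.atan2(accel_x, accel_z)   # noisy, but drift-free
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Hypothetical sample stream at 100 Hz: (gyro rad/s, accel x g, accel z g).
angle = 0.0
for gyro_rate, ax, az in [(0.02, 0.05, 0.99), (0.01, 0.06, 0.99)]:
    angle = complementary_filter(angle, gyro_rate, ax, az, dt=0.01)
print(f"estimated tilt: {angle:.4f} rad")
```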
Sensor hubs of this kind have been integrated into vast numbers of Android and Apple phones to date and, as part of Qualcomm Snapdragon chipsets, have made their way into a myriad of wearables and, more recently, smart home devices, where in many cases battery life concerns are also paramount.

Figure 1: RoboSense lidar.
Another key market for sensor fusion is the automotive industry, for example in car collision systems, where a range of different sensors (pressure sensors, accelerometers, gyroscopes, and ultrasonic sensors, among others) may be employed. If combinations of sensors in this group breach thresholds, then corresponding responses, such as deploying the relevant airbags, can be triggered automatically. Most current Level 3 vehicles rely on sensor fusion to integrate lidar with visual and IR cameras, ultrasonic sensors, and radar arrays, to name but a few. These sensors generate up to tens of millions of data points per second, which must be processed in real time for obvious safety reasons.
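In highly simplified form, that threshold-combination logic might look like the sketch below. The thresholds, sensor names, and responses are purely illustrative, not drawn from any production system:

```python
# Purely illustrative threshold-combination logic; real collision systems
# are vastly more sophisticated. All thresholds and values are hypothetical.

DECEL_THRESHOLD_G = 40.0     # sudden frontal deceleration, in g
PRESSURE_SPIKE_RATIO = 1.5   # door-cavity pressure spike (side impact)

def airbag_response(decel_g, door_pressure_ratio):
    """Map combinations of breached sensor thresholds to responses."""
    frontal = decel_g > DECEL_THRESHOLD_G
    side = door_pressure_ratio > PRESSURE_SPIKE_RATIO
    if frontal and side:
        return "deploy front and side airbags"
    if frontal:
        return "deploy front airbags"
    if side:
        return "deploy side airbags"
    return "no action"

print(airbag_response(decel_g=55.0, door_pressure_ratio=1.1))
```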
As the industry progresses toward Level 4 and Level 5 vehicles, which require less driver interaction (and none at all at Level 5), the reliability of the sensors, the sensor fusion hardware and software, and the processors must be improved to a far higher level than is acceptable in mobile phones and wearables; a smartwatch rebooting itself is a different matter from the collision system of a Level 5 autonomous vehicle on a motorway. System-wide reliability is a complex challenge: while sensor fusion offers faster, more effective monitoring of environmental variables, it also means that rogue inputs from a single sensor can potentially trigger safety systems, forcing automotive designers to ensure that corroboration and redundancy are baked into every part of the system.
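One standard defence against a rogue input is simple redundancy with voting: with three sensors measuring the same quantity, a single faulty reading cannot drag the fused value outside the range of the two healthy ones. A minimal sketch, with hypothetical values:

```python
import statistics

def vote(readings, max_spread=2.0):
    """Median-of-N voting: one rogue sensor cannot steer the result.

    If the spread across the redundant sensors exceeds max_spread, the
    disagreement is flagged for a higher-level fault handler rather than
    acted on directly.
    """
    fused = statistics.median(readings)
    agree = (max(readings) - min(readings)) <= max_spread
    return fused, agree

# Two healthy range readings and one rogue value (all hypothetical).
fused, agree = vote([12.1, 11.9, 97.4])
print(f"fused: {fused}, sensors agree: {agree}")
```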
Perhaps the most vital benefit of sensor fusion for the future is the context it provides, taking disparate inputs and interpreting their meaning within the environment. One example is drone systems, where lidar and radar inputs are combined with barometric pressure sensors, a vital tool for flight control and positioning in situations where GPS signals are unreliable.
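The pressure-to-altitude step relies on the international barometric formula, which holds to a good approximation in the lower atmosphere under standard conditions; a minimal sketch:

```python
def pressure_altitude_m(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude from static pressure using the international
    barometric formula (standard-atmosphere assumptions)."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# A drone's barometer reading of 1001.2 hPa (hypothetical) puts it roughly:
print(f"{pressure_altitude_m(1001.2):.1f} m above sea level")
```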
This sensor fusion-derived context is particularly important in the world of IoT and smart home/smart city development, where thousands of ‘dumb’ sensors, once networked and successfully fused, can be built into city-scale responsive systems. There are, of course, privacy and security concerns here if in-home data is used in an identifiable way, along with wider public security concerns for city-wide systems. However, early test networks designed to monitor city air pollution (such as benzenes and particulates) through vehicle-mounted, building-integrated, and purpose-built monitoring stations, and then issue automated alerts and modify traffic flows, have shown great promise.
In just one example, a July 2018 project to monitor pollution levels across London brought together 100 fixed sensors in the worst-affected areas with two specially modified Google Street View cars that tracked pollution levels as they travelled the streets. The cars take air quality readings every 30 m, with the aim of flagging pollution “hotspots” by overlaying the data collected over the course of a year.
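The “overlaying” step essentially amounts to binning geotagged readings into grid cells and flagging cells whose long-run averages stand out. A toy sketch of that idea follows; the grid size, threshold, and readings are all hypothetical:

```python
from collections import defaultdict

CELL_M = 100  # bin mobile readings into 100 m grid cells (illustrative)

def flag_hotspots(readings, threshold):
    """Average geotagged pollutant readings per grid cell and flag cells
    whose mean exceeds the threshold. readings: (x_m, y_m, value) tuples."""
    cells = defaultdict(list)
    for x, y, value in readings:
        cells[(int(x // CELL_M), int(y // CELL_M))].append(value)
    return {cell: sum(v) / len(v) for cell, v in cells.items()
            if sum(v) / len(v) > threshold}

# Hypothetical NO2 readings (micrograms/m^3) from a mobile sensor pass.
readings = [(30, 40, 38.0), (55, 42, 62.0), (60, 45, 71.0), (310, 80, 25.0)]
print(flag_hotspots(readings, threshold=40.0))
```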
There are many challenges here, of course, not least the difficulty of responding reliably to variations in environmental conditions, on top of the challenge of extracting meaningful data from a wide range of sensors in varying implementations, any of which can introduce device error, noise, and flaws into the data-gathering process.
Smoothing out these errors was a near-impossible task just a few years ago, but with the rise of relatively affordable machine learning and AI toolsets, the potential to deliver real benefits from sensor fusion has grown exponentially. The promise of AI technology goes much further, of course, generating new use cases and therefore new markets for sensor suppliers and designers. In the shorter term, AI and sensor fusion can minimize security risks by providing enhanced local processing of data, significantly reducing the requirements to securely transmit, process, and store personal data offsite. This may well become a vital value proposition, reducing business risk and overhead costs as well as providing a clear benefit to the end user.
It is clear that the future will see increasing numbers of connected sensors embedded throughout our vehicles, homes, and cities. The need to add context to these burgeoning streams of data will only become more pressing. Once given that context, however, the data will enable a whole new world of incumbent and startup services, from consumer-facing health and leisure enhancements through to improved efficiency in supply chain management and quicker, easier, less polluting transport networks.