As sensors and cloud computing become more pervasive, our mobile devices and environments will become context-aware, accessing public and private information such as e-mails, GPS locations, weather, and transit schedules to deliver smarter mobile services to consumers.
Imagine a world where your phone has really become your personal assistant, able to detect your needs and anticipate your next move. Let’s say your friend Eva sends you a text message reminding you that you’re having dinner with her after work. It’s 7 p.m. and you’re in a rush to leave the office to make it to your 7:30 p.m. restaurant reservation.
By having access to your messages and schedule, and detecting that you’re headed for the train station, your phone determines that you’re on your way to meet Eva. What you didn’t know was that there are massive delays in the subway. But because your phone also has access to your favorite public transit app, it alerts you that your train is running late so taking a cab is your best bet. Phew! A mini-crisis is averted. Now you can focus on the things that really matter.
This scenario requires your phone to have access to a lot of different information and types of data. This growing demand for applications that understand our context drives a need to analyze a wider variety of data sources. This, in turn, requires a shift from platforms that offer “sensor fusion” to those that feature “data fusion.” And as we’ll see, data fusion solutions will power the next generation of smarter devices and services.
Sensor Fusion Falls Short
Sensor fusion intelligently combines and processes data streams from multiple sensors so that the result is greater than the sum of what each sensor provides alone. It compensates for the deficiencies of individual devices and produces a synthesized, smart output from a combination of accelerometers, gyroscopes, and magnetometers. The signals are consumed and processed simultaneously to detect device orientation and enable compelling mobile applications such as games, tilt-compensated e-compasses, augmented reality, and more (see “Augmented Reality Will Be Here Sooner Than You Think”).
Sensor fusion can provide a very accurate 3D orientation based on inertial sensor data, and many other features can then be built on top of that orientation. The technology raises awareness of the power of using sensors in combination and drives the market for mobile devices that combine multiple microelectromechanical systems (MEMS) sensors.
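To make the idea concrete, here is a minimal sketch of one classic sensor-fusion technique, a complementary filter that estimates device pitch by blending gyroscope and accelerometer readings. The function name, sample format, and the `alpha` blending weight are illustrative assumptions, not taken from any particular platform's API; real sensor-fusion stacks typically use more sophisticated filters (e.g., Kalman filters) over all three sensor types.

```python
import math

def complementary_filter(samples, dt, alpha=0.98):
    """Estimate pitch (radians) by fusing gyroscope and accelerometer data.

    Illustrative sketch: `samples` is a list of (gyro_pitch_rate, accel_x,
    accel_z) tuples, where gyro_pitch_rate is in rad/s and accel_* are the
    measured acceleration components in m/s^2. `dt` is the sample period.
    """
    pitch = 0.0
    estimates = []
    for gyro_rate, ax, az in samples:
        # Integrating the gyro rate is accurate short-term but drifts over time.
        gyro_pitch = pitch + gyro_rate * dt
        # The gravity vector gives an absolute pitch, noisy but drift-free.
        accel_pitch = math.atan2(ax, az)
        # Blend: trust the gyro for fast motion, the accelerometer to
        # correct long-term drift. This is the "fusion" step.
        pitch = alpha * gyro_pitch + (1 - alpha) * accel_pitch
        estimates.append(pitch)
    return estimates
```

Each sensor alone falls short: the gyroscope drifts and the accelerometer is noisy and confused by linear motion. Blending them yields an estimate better than either source, which is exactly the "greater than the sum of its parts" property described above.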