
Sensors on Android Gingerbread

Android 2.3 (codename Gingerbread) was officially released amidst huge hype and fanfare last week and BOY O BOY!! people sure are queuing up to have a peek. Sensors were the most hyped-about sub-system.

Just google "Android Gingerbread" and you will see a host of results following this pattern:

The big news in Android 2.3 Gingerbread
Support for new sensors

Android fans sure are blogging everywhere about the enhanced support for "NEW" sensors on Gingerbread. But having worked on Android sensors since the days of Cupcake, I beg to differ...

Gingerbread does NOT support any NEW sensors!
Here is what Android has to say in the official Gingerbread release notes.
Native input and sensor events
Applications that use native code can now receive and process input and sensor events directly in their native code, which dramatically improves efficiency and responsiveness.
Native libraries exposed by the platform let applications handle the same types of input events as those available through the framework. Applications can receive events from all supported sensor types and can enable/disable specific sensors and manage event delivery rate and queueing.
Gyroscope and other new sensors, for improved 3D motion processing
Android 2.3 adds API support for several new sensor types, including gyroscope, rotation vector, linear acceleration, gravity, and barometer sensors. Applications can use the new sensors in combination with any other sensors available on the device, to track three-dimensional device motion and orientation change with high precision and accuracy. For example, a game application could use readings from a gyroscope and accelerometer on the device to recognize complex user gestures and motions, such as tilt, spin, thrust, and slice.


That IS quite a mouthful. But stripping off the marketing spiel, we can say:
  1. Gingerbread provides sensor support to native C/C++ apps.
  2. Gingerbread provides more accurate and precise sensor data.
  3. Gingerbread provides APIs to recognise complex user gestures.
  4. Gingerbread supports gyroscope and barometer. (A small SDK sketch of these sensor types follows this list.)
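
For reference, here is a minimal sketch (my own, not from the release notes) of what requesting these sensor types looks like through the Gingerbread SDK. The activity name and the registerIfPresent() helper are made up for illustration:

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    // Hypothetical activity, for illustration only.
    public class SensorPeekActivity extends Activity implements SensorEventListener {
        private SensorManager mSensorManager;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // The gyroscope and pressure constants pre-date Gingerbread;
            // gravity, linear acceleration and rotation vector are the
            // "new" (virtual) types added in the 2.3 SDK.
            registerIfPresent(Sensor.TYPE_GYROSCOPE);
            registerIfPresent(Sensor.TYPE_PRESSURE);            // barometer
            registerIfPresent(Sensor.TYPE_GRAVITY);             // virtual
            registerIfPresent(Sensor.TYPE_LINEAR_ACCELERATION); // virtual
            registerIfPresent(Sensor.TYPE_ROTATION_VECTOR);     // virtual
        }

        // getDefaultSensor() returns null when the device (or its HAL)
        // does not export that sensor type.
        private void registerIfPresent(int type) {
            Sensor s = mSensorManager.getDefaultSensor(type);
            if (s != null) {
                mSensorManager.registerListener(this, s, SensorManager.SENSOR_DELAY_GAME);
            }
        }

        @Override
        protected void onPause() {
            super.onPause();
            mSensorManager.unregisterListener(this); // save battery when not visible
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // The layout of event.values depends on event.sensor.getType(),
            // e.g. TYPE_GYROSCOPE delivers angular speed (rad/s) around x/y/z.
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }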

While points [1], [2] and [3] do describe tremendous improvements over FroYo, none of them has anything to do with new sensors. Moving on to [4], we see the first mention of the supposedly new sensors. But here are two things that most people overlook:
- Both the gyroscope and the barometer (i.e. pressure) sensors were already available in previous releases of Android.

- The 4 "NEW" sensors (shown below) are just wrapper-APIs around existing hardware. They just provided "easy-to-digest" data.

Real sensors map 1-to-1 to actual hardware. The data from virtual sensors, on the other hand, is exported to apps by performing calculations on one (or more) real sensors.
These newly introduced wrapper APIs process the raw sensor data into a format that is ready to use by Android apps. This proves especially useful for apps doing advanced 3D math ( read Games ;) ).
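
To make the virtual-sensor idea concrete, here is a rough sketch of one classic trick: synthesizing a gravity / linear-acceleration stream from nothing but the raw accelerometer, using a simple low-pass filter. This is purely my own illustration (the class name and the smoothing constant are made up); the platform's actual implementation can do far better, e.g. by folding the gyroscope into the estimate:

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;

    // Illustration only: one classic way to derive TYPE_GRAVITY- and
    // TYPE_LINEAR_ACCELERATION-style data from the raw accelerometer.
    // Not the platform's actual implementation.
    public class FakeGravitySensor implements SensorEventListener {
        private static final float ALPHA = 0.8f; // low-pass smoothing factor (assumed)

        private final float[] gravity = new float[3];
        private final float[] linearAccel = new float[3];

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() != Sensor.TYPE_ACCELEROMETER) {
                return;
            }
            for (int i = 0; i < 3; i++) {
                // Low-pass: keep the slowly changing component (gravity).
                gravity[i] = ALPHA * gravity[i] + (1 - ALPHA) * event.values[i];
                // High-pass: whatever is left over is the "linear acceleration".
                linearAccel[i] = event.values[i] - gravity[i];
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }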

The gyroscope sensor was already supported in FroYo, and so was the barometer. Since those were early days for Android sensors, not much attention was paid to them; maybe they were even added as an afterthought to the existing array of accelerometer, compass and orientation sensors.

With Android apps ( again read Games ;) ) really pushing the sensors to the limit, the limitations of a "pure" accelerometer device became evident.

Enter INVENSENSE...

Founded in 2003 and based in Sunnyvale, California, InvenSense is a market leader in advanced MEMS gyroscope design. Their latest offering, based on sensor-fusion technology, is a Motion Processing Library (MPL) for Android Gingerbread.

Going beyond the rudimentary API that Android provides, the InvenSense Motion Processing Library (MPL) sits alongside the sensor HAL and provides a feature-rich API to obtain gesture, glyph and pedometer data from the sensors.

All this data is derived from a combination of the accelerometer, gyroscope and compass hardware modules. The MPL processes the individual data streams and combines them appropriately to overcome each sensor's individual limitations, providing an overall better stream of precise and accurate (processed) samples. Advanced operations such as pattern matching and counting are also done inside the MPL, so any app can directly obtain data pertaining to gestures, step counts (pedometer) and so on using the MPL APIs.
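
The MPL itself is proprietary, so the sketch below is NOT its API. It is just the textbook complementary filter, shown to illustrate why fusing the gyroscope with the accelerometer beats either one alone: the gyroscope is smooth and responsive but drifts over time, while the accelerometer is drift-free but noisy. The class name and the weighting constant are my own:

    // Generic complementary filter for one tilt axis; not the MPL API.
    public class TiltFusion {
        private static final float GYRO_WEIGHT = 0.98f; // trust the gyro short-term (assumed value)

        private float angle; // estimated tilt around one axis, in radians

        // gyroRate:   angular speed around the axis, rad/s (from the gyroscope)
        // accelAngle: tilt computed from the gravity direction, rad (from the accelerometer)
        // dt:         time since the last sample, seconds
        public float update(float gyroRate, float accelAngle, float dt) {
            // Integrate the gyro for a smooth, responsive estimate...
            float gyroAngle = angle + gyroRate * dt;
            // ...and gently pull it toward the accelerometer to cancel drift.
            angle = GYRO_WEIGHT * gyroAngle + (1 - GYRO_WEIGHT) * accelAngle;
            return angle;
        }
    }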

Here is a "short" video by David Sachs(Invensense Tech) which explains the advantages of INVENSENSE MPL extensions on Android...



To conclude, one can say that the sensor sub-system has undergone a huge overhaul in Gingerbread. One can only hope that what it delivers is well worth all the hyped-up expectations.
