http://indure.chinaunix.com/space.php?uid=20723576&do=blog&id=1887152


Sensors on Android 2.3 - Gingerbread

Android 2.3 (codename Gingerbread) was officially released amidst huge hype and fanfare last week and BOY O BOY!! people sure are queuing up to have a peek. Sensors were the most hyped-about sub-system.

Just do a Google search for "Android Gingerbread" and you will see a host of results following this pattern:

The big news in Android 2.3 Gingerbread
Support for new sensors

Android fans sure are blogging everywhere about the enhanced support for "NEW" sensors on Gingerbread. But having worked on Android sensors since the days of Cupcake, I beg to differ...

Gingerbread does NOT support any NEW sensors!
Here is what Android has to say in the official Gingerbread release notes.
Native input and sensor events
Applications that use native code can now receive and process input and sensor events directly in their native code, which dramatically improves efficiency and responsiveness.
Native libraries exposed by the platform let applications handle the same types of input events as those available through the framework. Applications can receive events from all supported sensor types and can enable/disable specific sensors and manage event delivery rate and queueing.
Gyroscope and other new sensors, for improved 3D motion processing
Android 2.3 adds API support for several new sensor types, including gyroscope, rotation vector, linear acceleration, gravity, and barometer sensors. Applications can use the new sensors in combination with any other sensors available on the device, to track three-dimensional device motion and orientation change with high precision and accuracy. For example, a game application could use readings from a gyroscope and accelerometer on the device to recognize complex user gestures and motions, such as tilt, spin, thrust, and slice.


That IS quite a mouthful. But stripping off the marketing spiel, we can say:
  1. Gingerbread provides sensors-support to native C/C++ apps.
  2. Gingerbread provides more accurate and precise sensor-data.
  3. Gingerbread provides APIs to recognise complex user gestures.
  4. Gingerbread supports gyroscope and barometer.

Real sensors map 1-to-1 to actual hardware. The data of virtual sensors, on the other hand, is exported to apps by performing calculations on two (or more) real sensors.
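
To make the distinction concrete: the familiar orientation reading is exactly such a virtual sensor, typically derived by combining the accelerometer and the compass. Below is a minimal sketch of that combination using the long-standing SensorManager.getRotationMatrix() / getOrientation() helpers; the class name is just for illustration and the listener registration is omitted for brevity.

    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class VirtualOrientation implements SensorEventListener {
        private final float[] accel = new float[3];    // real sensor #1: accelerometer
        private final float[] magnetic = new float[3]; // real sensor #2: compass
        private final float[] rotation = new float[9];
        private final float[] orientation = new float[3];

        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                System.arraycopy(event.values, 0, accel, 0, 3);
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                System.arraycopy(event.values, 0, magnetic, 0, 3);
            }
            // Combine the two real sensors into one "virtual" reading:
            // azimuth, pitch and roll (in radians) end up in orientation[0..2].
            if (SensorManager.getRotationMatrix(rotation, null, accel, magnetic)) {
                SensorManager.getOrientation(rotation, orientation);
            }
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }
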
While points [1], [2], [3] do mention tremendous improvements over FroYo, none of them has anything to do with new sensors. Moving on to [4], we see the first mention of supposedly new sensors. But here are two things that most people overlook:

- Both the gyroscope and the barometer (i.e. pressure) sensor were already available in previous releases of Android.

- The four "NEW" sensors are just wrapper APIs (virtual sensors) around existing hardware. They merely provide the data in an "easy-to-digest" form.

These newly introduced wrapper APIs process the raw sensor data into a format that is ready for Android apps to use. This proves especially useful to apps doing advanced 3D math (read: Games ;)).
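
Here is a minimal sketch of how an app picks up these wrapper types through the regular SensorManager API. The activity name and the delay rate are purely illustrative; getDefaultSensor() may return null on devices whose hardware or HAL cannot back a given type.

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    public class MotionDemoActivity extends Activity implements SensorEventListener {
        private SensorManager sm;

        protected void onResume() {
            super.onResume();
            sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
            // The "new" Gingerbread wrapper types; each may be unavailable on a given device.
            Sensor linear   = sm.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
            Sensor gravity  = sm.getDefaultSensor(Sensor.TYPE_GRAVITY);
            Sensor rotation = sm.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
            if (linear != null)   sm.registerListener(this, linear,   SensorManager.SENSOR_DELAY_GAME);
            if (gravity != null)  sm.registerListener(this, gravity,  SensorManager.SENSOR_DELAY_GAME);
            if (rotation != null) sm.registerListener(this, rotation, SensorManager.SENSOR_DELAY_GAME);
        }

        protected void onPause() {
            super.onPause();
            sm.unregisterListener(this);
        }

        public void onSensorChanged(SensorEvent event) {
            switch (event.sensor.getType()) {
                case Sensor.TYPE_LINEAR_ACCELERATION: /* acceleration with gravity removed, m/s^2 */ break;
                case Sensor.TYPE_GRAVITY:             /* isolated gravity vector, m/s^2 */          break;
                case Sensor.TYPE_ROTATION_VECTOR:     /* attitude as unit-quaternion components */  break;
            }
        }

        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }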

The gyroscope sensor was supported in FroYo, and so was the barometer. Since those were early days for Android sensors, not much attention was paid to them; they were perhaps even added as an afterthought to the existing array of accelerometer, compass and orientation sensors.
With Android apps (again, read: Games ;)) really pushing them to the limit, the limitations of a "pure" accelerometer device became evident.
Enter INVENSENSE...

Founded in 2003 and based in Sunnyvale, California, InvenSense is a market leader in advanced MEMS gyroscope design. Their latest offering (based on sensor-fusion technology) is a Motion Processing Library, i.e. the MPL, on Android Gingerbread.

Apart from the rudimentary API that Android provides, the InvenSense Motion Processing Library (MPL) sits alongside the sensor HAL and provides a feature-rich API to obtain gesture, glyph and pedometer data from the sensors.

All this data is derived from a combination of the accelerometer, gyroscope and compass hardware modules. The MPL processes the individual readings and combines them appropriately to overcome the individual limitations of each sensor, providing an overall better stream of precise and accurate (processed) samples. Advanced operations such as pattern matching and counting are also performed by the MPL, so any app can directly obtain data pertaining to gestures, step counts (pedometer), etc. using the MPL APIs.
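
The MPL itself is proprietary, so its actual API is not shown here, but the core idea behind this kind of sensor fusion is easy to illustrate. The gyroscope is smooth but drifts over time, while the accelerometer is drift-free but noisy; each update therefore integrates the gyro rate and then nudges the estimate toward the accelerometer's value. The toy complementary filter below is only a sketch of that idea: the class name, method and the 0.98 weight are illustrative assumptions, not InvenSense code.

    /** Toy complementary filter; NOT the InvenSense MPL API, just an illustration. */
    public final class ComplementaryFilter {
        private static final float ALPHA = 0.98f; // trust the gyro short-term, the accel long-term
        private float pitch; // current estimate, in radians

        /**
         * @param gyroRate   angular rate around the pitch axis from the gyroscope (rad/s)
         * @param accelPitch pitch computed from the accelerometer, e.g. atan2(ay, az) (rad)
         * @param dt         time since the previous update (s)
         */
        public float update(float gyroRate, float accelPitch, float dt) {
            // Integrate the gyro (smooth, drifts), then pull toward the accel (noisy, drift-free).
            pitch = ALPHA * (pitch + gyroRate * dt) + (1 - ALPHA) * accelPitch;
            return pitch;
        }
    }

Gesture recognition and step counting are then layered on top of this kind of fused, drift-corrected stream.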

Here is a "short" video by David Sachs(Invensense Tech) which explains the advantages of INVENSENSE MPL extensions on Android...



To conclude, one can say that the sensor sub-system has undergone a huge overhaul in Gingerbread. And one can only hope that what it delivers lives up to all the hyped-up expectations.
