
Sensors on Google Glass

Recently Google shared the Linux kernel source for the firmware running on Google Glass. Having grown tired of watching online reviews about the monotonous "Glass this, glass that" voice commands, I was curious about support for other forms of input (read: gestures). A quick peek into the innards of the kernel source revealed quite a lot...

FACT1. Google Glass runs on Texas Instruments OMAP4430.

Nothing revolutionary. Still, a major win for TI (FWIW), considering that it has nonchalantly quit the mobile-SoC market citing a low RoI. This was already known, as someone who actually had a Google Glass ran adb on it and found out.

FACT2. Google Glass has a built-in Accel, Gyro & Compass. 

Invensense MPU6050 = 3-axis gyro + 3-axis accel.
Asahi Kasei AK8975 = 3-axis geomagnetic sensor (compass).
Combining facts 1 and 2, we can see that the SoC and sensor spec perfectly matches the popular Samsung Galaxy S2 (variant I9100G).

Rather than having independent ICs for both, the Google Glass uses the MPU9150. The Invensense MPU9150 is a single SiP which contains the MPU6050 and AK8975 ICs within. This is fully hardware-compatible with existing MPU6050 board designs, with the additional benefit of... (as Invensense quotes on its website) "...providing a simple upgrade path and making it easy to fit on space constrained boards." Perfect for Google Glass.
Refer: arch/arm/mach-omap2/board-notle.c line:1710
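As a quick illustration of working with this part, here is a minimal Python sketch of turning raw 16-bit MPU-6050/9150 register samples into physical units. The scale factors assume the default full-scale ranges from the InvenSense datasheet (+/-2 g and +/-250 deg/s); a real driver would read the configured range first.

```python
# Converting raw MPU-6050/9150 samples to physical units.
# Scale factors are the datasheet values for the DEFAULT full-scale
# ranges: +/-2 g (16384 LSB/g) and +/-250 dps (131 LSB/dps).

ACCEL_LSB_PER_G = 16384.0   # +/-2 g full-scale range
GYRO_LSB_PER_DPS = 131.0    # +/-250 deg/s full-scale range

def raw_to_signed(raw):
    """Interpret a 16-bit register value as a signed integer."""
    return raw - 65536 if raw > 32767 else raw

def accel_g(raw):
    """Raw 16-bit accelerometer sample -> acceleration in g."""
    return raw_to_signed(raw) / ACCEL_LSB_PER_G

def gyro_dps(raw):
    """Raw 16-bit gyro sample -> angular rate in deg/s."""
    return raw_to_signed(raw) / GYRO_LSB_PER_DPS
```

So a raw accelerometer reading of 16384 comes out as exactly 1 g, and values above 0x7FFF wrap around to negative accelerations.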

FACT3. Google Glass has a "glasshub"

I stumbled upon this by accident while searching for the sensor drivers. The glasshub appears to be an external micro-controller that communicates with the OMAP4430 over I2C. This is the hardware that supports the "wink" command. Strangely enough, it supports up to 20 winks! Looks like someone didn't learn their lesson from triple- and quadruple-mouse-click designs. On the other hand, this will be most essential when someone attempts to write a Google Glass app to detect seizures. Forward thinking as always, Google.

The glasshub also reports IR data and proximity (not sure about the underlying hardware though).
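To make the multi-wink idea concrete, here is a small sketch of how wink events could be grouped into gestures, capped at 20 like the glasshub. The timing constant is purely illustrative, not something taken from the kernel source.

```python
# Sketch of multi-wink detection: wink events that arrive within
# MAX_GAP seconds of each other count as one gesture, capped at 20
# like the glasshub. MAX_GAP is an assumed value for illustration.

MAX_GAP = 0.5   # max seconds between winks in one gesture (assumed)
MAX_WINKS = 20  # the glasshub's reported upper limit

def count_winks(timestamps):
    """Group wink event timestamps into gestures; return gesture sizes."""
    gestures = []
    count = 0
    last = None
    for t in sorted(timestamps):
        if last is None or t - last <= MAX_GAP:
            count = min(count + 1, MAX_WINKS)  # another wink in this gesture
        else:
            gestures.append(count)             # gap too long: close gesture
            count = 1
        last = t
    if count:
        gestures.append(count)
    return gestures
```

Three quick winks would register as a single 3-wink gesture, while two winks a second apart would be two separate single winks.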


FACT4. Google Glass has a "Proximity" sensor.

Not to be confused with the "glasshub", there is another independent module, the LiteON LTR-506ALS. A very good sign: this IC is extremely customisable when it comes to thresholds, IR pulse frequency, pulse count and poll rate. Maybe, just maybe, we could hack the whole setup into a rudimentary IR remote. While used primarily for ambient light sensing, it also supports proximity sensing. This means we could have the Google Glass detecting finger/hand swipes in front of our face. Quite the most exciting tech of the lot, as it would provide the illusion of actually being able to handle the projected images.
Refer: arch/arm/mach-omap2/board-notle.c line:1727
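A swipe detector over such a proximity stream could be as simple as counting threshold crossings. This is only a sketch: the threshold value below is made up, whereas the real LTR-506ALS lets you program thresholds in its registers.

```python
# Illustrative swipe detector over a stream of proximity readings,
# e.g. from an ALS/proximity part like the LTR-506ALS. A "swipe" is
# one rise above THRESHOLD followed by a drop back below it.
# THRESHOLD is an arbitrary value chosen for the example.

THRESHOLD = 100  # proximity counts; purely illustrative

def count_swipes(readings):
    swipes = 0
    near = False
    for r in readings:
        if not near and r > THRESHOLD:
            near = True            # hand entered the sensing zone
        elif near and r <= THRESHOLD:
            near = False           # hand left the zone: one swipe
            swipes += 1
    return swipes
```

A hand passing in front of the sensor twice would produce two rises and falls in the readings, hence two swipes.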

Overall, quite a good amount of "sensory" tech inside for me to play with.
Me so excited. ;-) ;-) wink wink
Hey Google, can i haz a Google Glass?


  1. Think the cute cat will work? :)

  2. I like the cat also! Please excuse my lack of knowledge of this technology. So the Google Glass can pick up orientation, correct? How accurate is the orientation? Example: Let's say I walked into a large castle and Google Glass already had a 3D model of the castle programmed into it. At the entrance of the castle there would be an X on the ground, so I would stand on the X and let Google Glass know that this was my 0,0,0 co-ordinates. As I walked through the castle, how accurately would Google Glass know what I am looking at within the castle? If I tilted my head half an inch, would Google Glass know exactly what I am looking at within the 3D model? Thank you, Charles

    1. A single accelerometer is used to determine the orientation of the Glass (just like in current-gen Android smartphones).

      The orientation can be determined accurate to a few degrees.

      Expect an error of +/-10 degrees between individual Glass devices, and additional error due to "mounting" :P .

      Higher accuracy can be obtained by calibrating the values prior to starting the orientation-tracking exercise. For example, ask the user to look straight ahead and use the accelerometer values at that precise moment to determine the delta (if any) between the expected and actual readings of the accelerometer. This delta can then be used to compensate each subsequent reading while tracking the orientation.

      This is similar to the calibration carried out in motion-based games on Android phones/tablets, where the user is asked to hold the device flat and tap when ready.
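The calibration step described above could be sketched roughly like this; the "expected" vector assumes gravity falls entirely along one axis while the device is held in the reference pose, which is a simplification for illustration.

```python
# Sketch of the zero-offset calibration described above: sample the
# accelerometer while the user holds the reference pose, store the
# delta from the expected reading, and subtract it from later samples.
# EXPECTED assumes gravity entirely along the Z axis at calibration
# time (an illustrative simplification).

EXPECTED = (0.0, 0.0, 9.8)  # m/s^2, device stationary in reference pose

def compute_offset(calibration_sample):
    """Delta between the measured rest reading and the expected one."""
    return tuple(m - e for m, e in zip(calibration_sample, EXPECTED))

def compensate(sample, offset):
    """Apply the stored calibration delta to a new reading."""
    return tuple(s - o for s, o in zip(sample, offset))
```

Every reading captured afterwards is corrected by the same fixed offset, which removes per-device bias but not noise.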

      Working Principle:
      The accelerometer hardware, when stationary and placed in the flat position, reads 9.8 m/s^2 vertically downwards (-ve Z axis). This is the earth's gravitational field. Now if the Glass is oriented in any different direction, this 9.8 m/s^2 will be observed by the accelerometer hardware as different components along its 3 axes. Finding the orientation is now just a simple matter of finding the ratios of each of the observed readings with respect to the known magnitude of 9.8 and applying inverse trigonometric functions. This can be experimented with on current-gen Android smartphones as well.
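The inverse-trig step can be sketched as follows. Note this uses one common tilt convention (atan2 rather than raw ratios, so the signs come out right in every quadrant); axis directions and signs vary between devices, so treat it as illustrative.

```python
# Tilt from a stationary accelerometer reading, per the principle
# above: gravity's components along the 3 axes determine pitch/roll.
# One common convention is assumed; axis signs differ per device.

import math

def tilt_angles(ax, ay, az):
    """Pitch and roll in degrees from a stationary accel reading (m/s^2)."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll
```

With the device flat (all of gravity on the Z axis) both angles read zero; tip it fully onto the X axis and the pitch reads 90 degrees. Yaw (heading) cannot be recovered from gravity alone, which is exactly why the compass is there.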

  3. Hi Chinmay, do you know what the margin of error is after calibration? Do you know to within what fraction of a degree the accuracy of the 3-axis gyroscope is? Can the 3-axis gyroscope register pitch, roll and yaw? If so, does the yaw work as accurately as the pitch and roll?

    Thank you for your help,

    1. Factors like accuracy, sensitivity and response time depend upon the accel/gyro hardware in use. In the case of the Google Glass, the Invensense MPU-9150 product sheet should help you understand the same. (Check out pages 7 and 11 for details of the accuracy.)

      AFAIK the accuracy along all axes of the gyroscope is the same. And the accuracy on a properly calibrated device is easily within a range of 10 degrees, i.e. the actual orientation is within +/-5 degrees of the readings from the sensor hardware.

    2. PS: The above accuracy numbers are from what I have observed on Android smartphones running similarly spec-ed hardware. As the Glass runs on the Android stack and uses the same hardware underneath, the results should be the same.

  4. Does the Google Glass use a magnet, or have a magnet in it?