I am a fan of motion detection, or measuring movement in space over time, as a trigger for computing and as a new method of interaction for mobile applications; gestures are one example.

There are two approaches to detecting motion. The first is to use gyroscopes, which measure changes in orientation by exploiting "gyroscopic inertia".

(Gyroscope illustration; image source: Wikipedia)

The above is the same principle used to help guide aerospace vehicles such as airplanes and space rockets, and it is also used in the ongoing Gravity Probe B experiment that is testing Einstein's general theory of relativity.

From the perspective of mobility, companies such as InvenSense are creating integrated circuit (IC)-based gyroscopes. Last year InvenSense announced its Integrated Dual-Axis Gyroscope, which measures rotation about the X and Y axes. But obviously, to take advantage of this technology, these chipsets and the related software and APIs must be resident on the handset, which is not the case today. I'm counting on such IC-based gyroscopes becoming prevalent on handsets at some point; thanks to the iPhone, it is easy to see and explain the power of motion detection for things such as automatic image rotation and for other uses such as gestures.
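To make the idea concrete, here is a rough sketch in plain Java of how angular-rate readings from a dual-axis gyroscope could be integrated into an orientation estimate and used to trigger automatic screen rotation. Since no such handset API is available today, everything here is an illustrative assumption: the simulated readings, the 50 Hz sample rate, and the 80-degree threshold are mine, not any vendor's actual interface.

```java
// Hypothetical sketch: integrate dual-axis gyro readings (angular rate in
// degrees/second) into roll and pitch angles, and flag when the handset has
// been rotated far enough to justify flipping the screen orientation.
// The sample data and the thresholds are illustrative assumptions.
public class GyroRotationSketch {

    public static void main(String[] args) {
        // Simulated angular-rate samples at 50 Hz: a steady roll of
        // ~120 deg/s for one second, as if the user turned the phone
        // on its side.
        double dt = 0.02; // 20 ms between samples
        double[] rollRate = new double[50];
        double[] pitchRate = new double[50];
        for (int i = 0; i < 50; i++) {
            rollRate[i] = 120.0; // deg/s about the X axis
            pitchRate[i] = 0.0;  // deg/s about the Y axis
        }

        double roll = 0.0, pitch = 0.0;
        boolean rotateScreen = false;

        for (int i = 0; i < rollRate.length; i++) {
            // Integrate angular rate over the sample interval.
            roll += rollRate[i] * dt;
            pitch += pitchRate[i] * dt;

            // Simple decision rule: once roll exceeds ~80 degrees,
            // treat it as a landscape rotation.
            if (Math.abs(roll) > 80.0 && !rotateScreen) {
                rotateScreen = true;
                System.out.printf("Rotate to landscape at t=%.2fs (roll=%.1f deg)%n",
                        (i + 1) * dt, roll);
            }
        }

        System.out.printf("Final orientation estimate: roll=%.1f, pitch=%.1f%n",
                roll, pitch);
    }
}
```

The same integrated angles could just as easily feed a gesture recognizer, for example by watching for a quick back-and-forth roll.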

The second approach is to use computer vision and image processing to detect optical flow (i.e., motion) using the camera on the mobile handset. Detecting motion this way is nothing new; there are lots of research papers on the topic, and the algorithms are pretty advanced, not to mention awesome. A company called GestureTek has a solution called EyeMobile that does just that: it uses computer vision and image processing to detect motion, which they call "Shake, Rock and Roll". They support the BREW, Symbian and Linux platforms; because of the platform hooks required, they have no solution for Java ME. I've played with the technology, and it is pretty remarkable. I will say, though, that it has its limitations; for example, it won't work that well in low-light or dark places. See their video here.
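For those curious what detecting optical flow actually involves, below is a simplified sketch of the general technique (my own illustration, not GestureTek's algorithm): compare two consecutive camera frames by block matching and take the shift that minimizes the sum of absolute differences as the dominant motion. The tiny synthetic frames here stand in for real camera input.

```java
// Illustrative sketch: estimate camera motion by block matching between two
// consecutive grayscale frames. The trial shift that minimizes the sum of
// absolute differences (SAD) gives the dominant displacement -- a crude
// form of optical flow.
public class OpticalFlowSketch {

    // SAD between frame a and frame b shifted by (dx, dy), over the overlap.
    static long sad(int[][] a, int[][] b, int dx, int dy) {
        long total = 0;
        int h = a.length, w = a[0].length;
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int bx = x + dx, by = y + dy;
                if (bx >= 0 && bx < w && by >= 0 && by < h) {
                    total += Math.abs(a[y][x] - b[by][bx]);
                }
            }
        }
        return total;
    }

    public static void main(String[] args) {
        int h = 32, w = 32;
        int[][] frame1 = new int[h][w];
        int[][] frame2 = new int[h][w];

        // Synthetic scene: a bright square that moves 3 pixels right and
        // 1 pixel down between frames, as if the handset panned.
        for (int y = 10; y < 18; y++) {
            for (int x = 10; x < 18; x++) {
                frame1[y][x] = 255;
            }
        }
        for (int y = 11; y < 19; y++) {
            for (int x = 13; x < 21; x++) {
                frame2[y][x] = 255;
            }
        }

        // Exhaustive search over small shifts for the best match.
        int bestDx = 0, bestDy = 0;
        long bestScore = Long.MAX_VALUE;
        for (int dy = -5; dy <= 5; dy++) {
            for (int dx = -5; dx <= 5; dx++) {
                long score = sad(frame1, frame2, dx, dy);
                if (score < bestScore) {
                    bestScore = score;
                    bestDx = dx;
                    bestDy = dy;
                }
            }
        }
        System.out.println("Estimated motion: dx=" + bestDx + ", dy=" + bestDy);
    }
}
```

A real implementation would work on subsampled camera preview frames and smooth the estimates over time. This sketch also hints at why low light hurts: with less image detail, the matching has little to latch onto.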

I personally prefer the gyroscopic approach, as it is cheaper from the perspective of computing requirements and it is a dedicated function. Gyroscope-based solutions are, or should be, the biggest concern for GestureTek's EyeMobile. In the meantime, optical-based motion detection rules.

Related to this, see Gyros on Mobile Handsets – A Very Powerful Tool.

ceo