Google's Project Soli to Bring Radar-Based Gesture Recognition to Wearables

One of the big problems with wearable devices right now is input - there's no simple way to control these devices. At Google I/O 2015, the company unveiled Project Soli, a radar-based sensing technology that can be used to control all kinds of devices. Developed by Google's Advanced Technology and Projects (ATAP) team, Project Soli can be incorporated into a range of different devices.

It's a gesture-based system that can track small movements like waving your fingers - it could be an easy way to control wearables, or even give you hands-off control of your phone. It could also allow you to enter text on a smartwatch without restricting you to the small screen. Essentially, Project Soli is a radar system that's small enough to fit into a smartwatch. It picks up movements in real time, and the movements you make alter its signal. It can detect swipes, a clenched fist, or crossed fingers.
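To give a rough sense of the idea, here is a toy sketch of how a recognizer might turn radar motion readings into gestures. Google has not published Soli's pipeline in this article, so the function name, features, and thresholds below are purely illustrative assumptions:

```python
# Hypothetical sketch only - not the actual Project Soli pipeline.
# A radar sensor reports Doppler (velocity) samples; simple statistics
# over a short burst can separate broad gesture classes.

def classify_gesture(doppler_samples):
    """Classify a short burst of radar Doppler readings into a gesture.

    doppler_samples: list of velocity readings (arbitrary units);
    positive values mean motion toward the sensor.
    """
    if not doppler_samples:
        return "none"
    mean_v = sum(doppler_samples) / len(doppler_samples)
    spread = max(doppler_samples) - min(doppler_samples)
    # Sustained one-directional motion reads as a swipe;
    # small, rapidly alternating motion reads as a finger rub.
    if abs(mean_v) > 0.5 and spread < 1.0:
        return "swipe"
    if abs(mean_v) < 0.2 and spread > 1.0:
        return "finger_rub"
    return "unknown"

print(classify_gesture([0.8, 0.9, 0.7, 0.85]))   # sustained motion -> swipe
print(classify_gesture([0.6, -0.7, 0.5, -0.6]))  # alternating -> finger_rub
```

A real system would of course work on raw radar returns with far richer signal processing and machine learning, but the principle is the same: the hand's motion alters the reflected signal, and the device classifies that change.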

Using your hand to interact with a device is typically much more accurate than voice recognition - and according to this video, Project Soli is sensitive enough to track "micro-motions". Unlike systems such as Microsoft's Kinect that rely on cameras, Soli uses radar, which offers much higher sensitivity, so it could be used for gestures like pressing a button, moving a slider, or turning a knob.
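The "virtual knob" idea above can be sketched in code: a recognized turning gesture drives a widget's value, just as a physical knob would. The gesture names and widget model here are assumptions for illustration, not an actual Soli API:

```python
# Hypothetical sketch of a gesture-driven virtual control.
# Gesture names ("turn_clockwise", etc.) are invented for this example.

class VirtualKnob:
    """A knob whose value (0.0-1.0) is driven by turn gestures."""

    def __init__(self, value=0.0, step=0.1):
        self.value = value
        self.step = step

    def on_gesture(self, gesture, magnitude=1.0):
        # magnitude could come from how far the fingers rotated.
        if gesture == "turn_clockwise":
            self.value = min(1.0, self.value + self.step * magnitude)
        elif gesture == "turn_counterclockwise":
            self.value = max(0.0, self.value - self.step * magnitude)
        return self.value

knob = VirtualKnob()
knob.on_gesture("turn_clockwise", magnitude=3)  # e.g. turn volume up
print(round(knob.value, 2))
```

The appeal of this model is that users already know how buttons, sliders, and knobs behave, so mapping micro-motions onto them requires no new mental model.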
