The secret to accurate gesture control might lie in sonar technology


Mouse, keyboard and touch screens are all great ways to interact with computers and devices. However, when it comes to the user interface I really want, it's gesture control: that Minority Report-like ability to control things with the movement of your hands. There is certainly technology moving us in the right direction, like augmented reality and the Xbox Kinect, which could identify voice commands and some hand movements reasonably accurately, but it is still expensive and has floundered because it was too focused on gaming applications rather than on UI, which is where its real beauty lies. The other peripherals have their uses, but when it comes to the best ways to interact with technology, I think gesture control is the way of the future.

This type of technology needs to become a lot cheaper and more integrated into a wider variety of devices before it starts making any significant impact. It also needs to ensure that gestures are intuitive and make sense for what a user is trying to do.

One of the interesting technologies to come out of the ongoing Mobile World Congress was gesture control that uses ultrasound to identify hand gestures more accurately. A company called Elliptic Labs has designed technology based on ultrasound that monitors movement in a 360-degree dome surrounding your smartphone. It essentially works like sonar: sending out inaudible frequencies from your phone's speaker and listening for their return with the microphone. The company's algorithms time how long each wave takes to return, and then use these times to estimate distance.
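
To make the idea concrete, here is a minimal sketch of that time-of-flight principle in Python: emit a short ultrasonic chirp, cross-correlate the microphone signal with it to find the echo delay, and convert that delay into a distance using the speed of sound. The echo is simulated here, and the sample rate, frequencies and function names are my own illustrative choices, not Elliptic Labs' actual algorithm.

```python
# Sketch of time-of-flight distance estimation: chirp out, listen for the
# echo, and turn the round-trip delay into a distance. Single reflector,
# simulated echo -- an illustration of the principle, not a real driver.
import numpy as np

SAMPLE_RATE = 48_000          # Hz, typical for phone audio hardware
SPEED_OF_SOUND = 343.0        # m/s at room temperature

def make_chirp(duration=0.005, f0=19_000, f1=21_000):
    """Short ultrasonic sweep, above the ~18 kHz limit of human hearing."""
    t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * duration)))

def estimate_distance(emitted, recorded):
    """Cross-correlate the mic signal with the emitted chirp and convert
    the lag of the strongest echo into a one-way distance."""
    corr = np.correlate(recorded, emitted, mode="valid")
    delay_samples = int(np.argmax(np.abs(corr)))
    round_trip_s = delay_samples / SAMPLE_RATE
    return SPEED_OF_SOUND * round_trip_s / 2      # halve for one-way distance

if __name__ == "__main__":
    chirp = make_chirp()
    # Simulate a hand 0.30 m away: the echo arrives after the round trip.
    true_distance = 0.30
    delay = int(2 * true_distance / SPEED_OF_SOUND * SAMPLE_RATE)
    recorded = np.zeros(delay + len(chirp))
    recorded[delay:] += 0.2 * chirp                     # attenuated echo
    recorded += 0.01 * np.random.randn(len(recorded))   # microphone noise
    print(f"Estimated distance: {estimate_distance(chirp, recorded):.2f} m")
```

Running the script prints a distance close to the simulated 0.30 m, which is all the "sonar" trick really is: accurate timing of an echo you can't hear.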

It sounds a little bizarre, considering most companies are using camera technology to try to develop gesture control. Sonar, however, adds a whole different dimension: it can make a device more aware of its immediate surroundings and better able to measure movement accurately than a camera can.

The result is seamless gesture control that uses hardware already available in every smartphone on the market. Elliptic doesn't have any consumer products available at the moment, but it was able to show off a handful of demo interactions at MWC that put this technology to great use.

Picture a ringing phone that mutes instantly the moment it realizes you are moving your hand towards it. Or volume controls that instinctively pop up on screen before you even touch the volume button. Even better, the ability to zoom in and out on your phone simply by using your hand. These might all sound like subtle improvements that are likely to make you look like the office fruitcake when walking around, but they can actually make operating a device a whole lot easier. And more importantly, move us closer to a world of truly gesture-driven UI.
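
For illustration, a hypothetical sketch of how a stream of proximity readings could be mapped onto those interactions might look something like this; the thresholds, handler names and print statements are made up for the example and don't come from any shipping product.

```python
# Hypothetical mapping from per-frame distance readings to simple UI actions,
# in the spirit of the demos described above. Thresholds are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass
class GestureState:
    previous_distance: Optional[float] = None

def on_distance_reading(distance_m: float, state: GestureState, phone_ringing: bool):
    """Turn a single proximity reading into an action, using the previous
    reading to decide whether the hand is approaching the device."""
    approaching = (
        state.previous_distance is not None
        and distance_m < state.previous_distance - 0.02   # moved in by >2 cm
    )
    if phone_ringing and approaching and distance_m < 0.15:
        print("Hand approaching a ringing phone -> mute ringer")
    elif distance_m < 0.10:
        print("Hand hovering close -> show volume overlay")
    state.previous_distance = distance_m

state = GestureState()
for reading in [0.40, 0.30, 0.20, 0.12]:   # simulated hand moving closer
    on_distance_reading(reading, state, phone_ringing=True)
```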


