Ari Grobman, Lumus' director of business development, demonstrates gesture control to run his company's smartglasses.
(Credit: Stephen Shankland/CNET)
BARCELONA, Spain — One of the difficulties with wearable computing is controlling devices that don’t have a handy keyboard or touch screen attached. And that’s how gesture control company EyeSight Mobile won a place in Lumus’ smartglasses.
With the technology, a person can hold out a finger to tap on icons or swipe away notifications in the virtual view the Lumus glasses present. Later, EyeSight plans to add the ability to drag items around the display, too.
“You can actually touch the icons in the air with your fingers,” EyeSight Chief Executive Gideon Shmuel told CNET.
The companies revealed the partnership here at the Mobile World Congress show. EyeSight expects its gesture recognition software to ship with the glasses, Shmuel said.
EyeSight CEO Gideon Shmuel at Mobile World Congress 2014.
(Credit: Stephen Shankland/CNET)