At the IFA Berlin trade show, Elliptic Labs revealed its Touchless Gesture User Interface technology (video). Much like Microsoft's Kinect, it lets users control a device through simple hand movements (gestures).
The company had been working on a prototype built around a reference-design tablet, which has now evolved into a touchless dock prototype for the Apple iPad.
The technology is ultrasound-based and sits in a special iPad dock. The dock creates a "touchless zone" covering approximately one foot in front of and to the side of the iPad's screen. Elliptic Labs CEO Stian Aldrin explained the vision for the technology on the iPad and its real-world applications:
The idea is that you use touchless gestures to operate the primary functions of a docked tablet in situations like when you have wet or greasy hands in the kitchen. In general, tablets are made to be handheld. When a tablet is docked you're often walking or standing further away, and then using a finger on the screen involves a change of modality. Rather than bending down, leaning forward or picking it up, you can use larger movements a little bit further away to do things like volume up or next song without changing modality.
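Elliptic Labs hasn't published the dock's internals, but ultrasonic sensing of this kind typically works by emitting an inaudible tone and analyzing its reflection off a nearby hand. A minimal sketch of the underlying idea, echo time-of-flight ranging with a zone check, is below; the function names and the one-foot threshold are illustrative assumptions, not Elliptic Labs' actual API:

```python
# Hypothetical sketch of ultrasonic time-of-flight ranging, the basic
# physics behind ultrasound gesture sensing. The constants and names
# below are illustrative, not Elliptic Labs' implementation.

SPEED_OF_SOUND_M_S = 343.0   # in air at roughly 20 degrees C
TOUCHLESS_ZONE_M = 0.3048    # ~1 foot, matching the dock's described range

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to a reflecting hand, given an echo's round-trip time.
    The sound travels out and back, so divide the path by two."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def in_touchless_zone(round_trip_s: float) -> bool:
    """True if the reflection came from inside the gesture zone."""
    return echo_distance_m(round_trip_s) <= TOUCHLESS_ZONE_M

# A hand 15 cm from the dock returns an echo after ~0.87 ms:
rt = 2 * 0.15 / SPEED_OF_SOUND_M_S
print(echo_distance_m(rt))       # 0.15
print(in_touchless_zone(rt))     # True
print(in_touchless_zone(0.004))  # echo from ~0.69 m away: False
```

Tracking how such distances change over successive pings is what would turn raw echoes into gestures like a swipe toward or away from the screen.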
[tags]ces,consumer electronics show,gesture,touchless,ultrasonic,ultrasound,virtual reality,human,screen[/tags]