ACM Symposium on UIST: Microsoft Researchers Demonstrate New Touch Experiments Including 'OmniTouch,' 'PocketTouch' and More!

Microsoft Research has announced that it has been working on two different touch interfaces: OmniTouch and PocketTouch. Researchers are showing off both projects this week at the ACM Symposium on User Interface Software and Technology (UIST) in Santa Barbara, California.

OmniTouch gives users the ability to turn an entire wall into a touch surface, while PocketTouch lets users interact with a smartphone, a much smaller touch surface, without taking it out of a pocket or purse. Both projects will be unveiled during UIST 2011, the Association for Computing Machinery's 24th Symposium on User Interface Software and Technology, being held Oct. 16-19 in Santa Barbara, Calif.

OmniTouch: Make Every Surface a Touch Screen - Microsoft's Hrvoje Benko said, "The surface area of one hand alone exceeds that of typical smart phones. Tables are an order of magnitude larger than a tablet computer. If we could appropriate these ad hoc surfaces in an on-demand way, we could deliver all of the benefits of mobility while expanding the user's interactive capability."

The prototype is a wearable device that pairs a PrimeSense depth camera with a laser-based pico projector. The projector displays the user interface on a nearby surface, and the camera tracks the user's fingers as they interact with it. While the current prototype is fairly bulky to wear, the project's website says the projector-and-camera combination could eventually be made as small as a matchbox.
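Microsoft hasn't published sample code for this, but the core trick, using depth data to decide whether a fingertip is pressing against or merely hovering over a projected surface, can be sketched roughly. The function name and millimeter thresholds below are illustrative assumptions, not details from the OmniTouch prototype.

```python
# Hypothetical sketch: depth-based "click" detection on an arbitrary surface.
# None of these names or thresholds come from OmniTouch itself; they only
# illustrate the general idea of comparing fingertip depth to surface depth.

TOUCH_THRESHOLD_MM = 10   # fingertip within ~1 cm of the surface counts as a touch
HOVER_THRESHOLD_MM = 50   # within ~5 cm counts as hovering

def classify_finger(finger_depth_mm: float, surface_depth_mm: float) -> str:
    """Classify a fingertip as touching, hovering over, or away from the surface."""
    gap = surface_depth_mm - finger_depth_mm   # how far the finger sits above the surface
    if gap <= TOUCH_THRESHOLD_MM:
        return "touch"
    if gap <= HOVER_THRESHOLD_MM:
        return "hover"
    return "away"

# Example: a finger 8 mm above a wall measured at 600 mm from the camera.
print(classify_finger(592.0, 600.0))  # -> "touch"
```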

OmniTouch prototype (image: Microsoft Research)

PocketTouch: Through-Fabric Capacitive Touch Input, a paper written by Saponas, Harrison, and Benko, describes a prototype that consists of a custom multitouch capacitive sensor mounted on the back of a smartphone.

Microsoft says, "It uses the capacitive sensors to enable eyes-free multitouch input on the device through fabric, giving users the convenience of a rich set of gesture interactions, ranging from simple touch strokes to full alphanumeric text entry, without having to remove the device from a pocket or bag."
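As a rough illustration of the kind of eyes-free stroke input described above, the sketch below classifies a swipe direction from a sequence of (x, y) touch samples. The sample format and the minimum-length threshold are assumptions made for illustration; they are not taken from the PocketTouch recognizer.

```python
# Hypothetical sketch: classify a simple stroke from capacitive touch samples.
# The (x, y) sample format and the minimum-length threshold are assumptions;
# they are not taken from the PocketTouch prototype.

from typing import List, Tuple

def classify_stroke(samples: List[Tuple[float, float]], min_length: float = 20.0) -> str:
    """Label a stroke as a left/right/up/down swipe, or a tap if it barely moved."""
    (x0, y0), (x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    if max(abs(dx), abs(dy)) < min_length:
        return "tap"
    if abs(dx) >= abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"

# Example: a mostly horizontal drag recorded through a trouser pocket.
print(classify_stroke([(10, 40), (30, 42), (65, 45)]))  # -> "swipe-right"
```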

PocketTouch prototype (image: Microsoft Research)

Access Overlays, also debuting this week, is a set of three touchscreen add-ons that make touchscreen navigation more accessible for visually impaired users. The add-ons, called Edge Projection, Neighborhood Browsing, and Touch-and-Speak, each approach touchscreen use in a different way.

The edge projection overlay treats the 2D touch interface as an X-Y coordinate map. It places a frame around the outside of the screen with "edge proxies" that line up with the X and Y coordinates of the on-screen targets; touching an edge proxy highlights the corresponding target and reads its name aloud. Neighborhood browsing replaces small touch targets with larger generalized areas outlined by white-noise audio feedback, filtered at different frequencies to differentiate the touch "neighborhoods." Touch-and-Speak combines touch gestures with spoken voice commands to manipulate the interface.
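To make the edge projection idea concrete, here is a hedged sketch of how on-screen targets might be projected onto the screen edges as proxies that share a target's X or Y coordinate. The data structures, screen dimensions, and target names are invented for illustration and do not reflect the Access Overlays implementation.

```python
# Hypothetical sketch of "edge projection": every on-screen target gets a proxy
# on the bottom edge (same x) and on the right edge (same y), so a user can
# slide along the screen border to find targets. All names here are invented.

SCREEN_W, SCREEN_H = 1024, 768

targets = {
    "Maps":   (200, 150),
    "Music":  (700, 300),
    "Photos": (450, 600),
}

def edge_proxies(targets):
    """Return bottom-edge and right-edge proxy positions for each target."""
    proxies = {}
    for name, (x, y) in targets.items():
        proxies[name] = {
            "bottom_edge": (x, SCREEN_H),  # shares the target's x coordinate
            "right_edge": (SCREEN_W, y),   # shares the target's y coordinate
        }
    return proxies

for name, p in edge_proxies(targets).items():
    # In the real overlay, touching a proxy would highlight the target and speak its name.
    print(f"{name}: bottom proxy at {p['bottom_edge']}, right proxy at {p['right_edge']}")
```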

Portico, another interface being shown at UIST, augments a touchscreen with multiple cameras. It lets users touch the screen as they normally would or use the space around the outside of the tablet, and it even lets them place physical objects on top of, or near, the screen to interact with running applications.
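As a loose illustration of how an interface like Portico might distinguish touches on the screen from objects placed in the camera-tracked area around it, the sketch below checks whether a tracked point falls inside the screen rectangle or in the surrounding workspace. The coordinates and region names are assumptions, not details from the Portico prototype.

```python
# Hypothetical sketch: decide whether a camera-tracked point is on the tablet
# screen or in the surrounding workspace. Dimensions and names are invented.

SCREEN = (0, 0, 250, 175)           # tablet screen rectangle (x0, y0, x1, y1), in mm
WORKSPACE = (-150, -100, 400, 275)  # larger camera-tracked area around the tablet

def locate(point, screen=SCREEN, workspace=WORKSPACE):
    """Return which region a tracked (x, y) point falls into."""
    x, y = point

    def inside(rect):
        x0, y0, x1, y1 = rect
        return x0 <= x <= x1 and y0 <= y <= y1

    if inside(screen):
        return "on-screen"
    if inside(workspace):
        return "around-screen"
    return "outside"

# Example: a toy placed to the left of the tablet, within camera view.
print(locate((-80, 50)))  # -> "around-screen"
```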

Portico: Tangible Interaction on and around a Tablet

Microsoft researchers are also showing off at UIST this week Pause-and-Play, a way to link screencast tutorials with the applications they're demonstrating; KinectFusion, a 3D object scanner based on a moving Kinect camera; and Vermeer, an interaction model for a 360-degree "viewable 3D" display.