Kinect Hack: Occlusion Detection and Virtual Object Manipulation in AR; World in 3D; Ubi-Interactive Turns Wall into a Touchscreen

In the new Kinect and Kinect for Windows SDK augmented reality project "Occlusion detection and virtual object manipulation in augmented reality with the Microsoft Kinect," the creators used the Microsoft Kinect sensor to let the user virtually touch and manipulate the displayed object.

"In order to do this, we use the skeletal tracking of the sensor to allow the user to rotate and resize the object with his hands. While doing so, the user hands will inevitably get between the virtual object and the camera. To prevent the virtual object from being displayed wrongly over the hand, the depth map of the Kinect is used to find any obstruction and block rendering of pixels under occlusion. This is realized at the GPU level, using modified Shaders," explains.

The AR system uses the Unity3D game engine to display the 3D models, together with a custom plugin created at the CIMMI research center to enable the augmented reality overlay.

"The plugin computes in real time the pose of the Kinect using the color image while the depth map allow occlusion detection and hands tracking. The plugin was developed with the OpenCV library which allows easy analysis of the images."

This work was realized at the CIMMI research center (cimmi.qc.ca) by two interns, Alexis Legare-Julien and Renaud Simard, under the supervision of Jean-Nicolas Ouellet, Ph.D., Eng.



Ubi-Interactive, one of the 11 startups participating in the Kinect Accelerator program in Seattle, housed at Microsoft's Westlake offices in South Lake Union, uses Kinect to turn any surface into a touchscreen.

"We can turn any surface into a 3D touchscreen," explained Anup Chathoth, one third of Munich-based startup Ubi Interactive. Such claims typically conjure up images of floating Minority Report-style touchscreens made from curved glass, but that's exactly what this three-person team has developed.

Ubi's system uses a Microsoft Kinect sensor to turn a regular projector into a multi-touch PC projection system, where PowerPoint presentations, web pages and even games no longer need clickers or wireless mice to navigate. By using the motion-tracking and depth-perception cameras in the Kinect, Ubi is able to detect where a user is pointing, swiping and tapping on a surface and interpret these gestures as if they were being performed on a giant touchscreen or interactive whiteboard.
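Ubi hasn't published its algorithm, but a plausible way to detect a "tap" on a projected wall with a depth camera, sketched below under that assumption, is to model the wall as a calibrated plane and treat depth points that come within a small threshold of it as contact points; those points would then be mapped into projector/screen coordinates to drive the UI.

```cpp
#include <cmath>
#include <vector>

// Illustrative sketch, not Ubi's actual method: flag 3D points (e.g. from a
// tracked hand) that come close enough to the projection surface to count as
// touches. Plane coefficients would come from a one-time calibration step.
struct Plane  { float a, b, c, d; };  // a*x + b*y + c*z + d = 0, (a,b,c) unit normal
struct Point3 { float x, y, z; };     // point reconstructed from a Kinect depth pixel

// Perpendicular distance from a point to the surface plane, in metres.
inline float distanceToPlane(const Plane& p, const Point3& q)
{
    return std::fabs(p.a * q.x + p.b * q.y + p.c * q.z + p.d);
}

// Return the points close enough to the wall to count as touches.
// The 1 cm threshold is an assumed value chosen to tolerate depth noise.
std::vector<Point3> detectTouches(const Plane& wall,
                                  const std::vector<Point3>& handPoints,
                                  float touchThresholdMeters = 0.01f)
{
    std::vector<Point3> touches;
    for (const Point3& pt : handPoints)
        if (distanceToPlane(wall, pt) < touchThresholdMeters)
            touches.push_back(pt);  // map these into screen coordinates for the UI
    return touches;
}
```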



Skanect, from the French duo of Nicolas Burrus and Nicolas Tisserand, is a "low-cost 3D scanner" based on Kinect for Windows.