Microsoft Research’s “Augmented Projectors” essentially combines the results of several other experimental projects – Kinect-based environment modelling, projection-based gesture interaction and spatially aware projection – into one super project.
At the center of this project is a handheld projector which gives the user a small window into a digitally augmented environment around them. Used like a torch, it lets users see and interact with virtual objects in 3D space. In one example, a user kicked and moved blocks that bounced realistically around the environment. In another, a user could write in the air and see the virtual text cast realistic shadows onto the wall.
“This paper seeks to better understand the interactive possibilities this type of awareness affords mobile projector interaction. Previous work in this area has predominantly focused on infrastructure-based spatial-aware projection and interaction. Handheld projector systems have the potential to enable users to dynamically augment environments with digital graphics. We explore new parts of the design space for interacting using handheld projection in indoor spaces, in particular systems that are more “aware” of the environment in which they are used. This is defined broadly as either spatial awareness, where the projector system has a sense of its location and/or orientation; or geometry awareness, where the system has a sense of the geometry of the world around it, which can encompass the user’s hands and body as well as objects, furniture and walls. Awareness like this can be enabled through the use of sensors built into the handheld device, through infrastructure-based sensing, or a combination of the two,” reads the project description.
A video demo is embedded below: