Remote controls are breaking new ground in Human-Robot Interaction (HRI) by leveraging Natural User Interfaces (NUIs), which let users perform relatively natural motions, movements, or gestures that in turn control computer applications, manipulate on-screen content, or, in this case, drive a robot.
Gershon’s demo of the ‘Motion Tracking Robot Controller’ is a great example of how a NUI can be used to control a robot, but it’s just the tip of the iceberg.
How does it work?
“EDDIE, the reference platform used for this demo, comes with an 8-core Propeller microcontroller to directly control two 12 V motors. These motors can be controlled remotely, or EDDIE can roam autonomously by leveraging several sensors around the robot and seeing in 3D using a Microsoft Kinect,” revealed the Microsoft Robotics team blog.
“Gershon Parent, a developer with the Microsoft Robotics group, has added a new twist on how EDDIE can be wirelessly controlled, which he’s dubbed the ‘Motion Tracking Robot Controller’. By leveraging skeletal tracking through a Kinect sensor, Gershon can control the two 12 V motors with arm gestures, navigating EDDIE through his environment.
When standing in front of the Kinect sensor, Gershon’s right hand controls the right motor and his left hand controls the left motor; it’s kind of like driving a tank. When he raises both arms simultaneously, the robot moves forward in a straight line, and the higher he raises his hands, the faster the robot goes. To put the robot in reverse, he simply lowers both arms at the same time. To turn the robot, all he has to do is tilt his hands, one above the other, to give the robot the desired degree of turn,” added the Microsoft Robotics team.
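The tank-style mapping described above can be sketched in a few lines of code. This is not Gershon’s actual implementation (which runs against the Kinect SDK and the EDDIE motor controller); it is a hypothetical simplification that assumes skeletal tracking already gives us the vertical (Y) positions of each hand and shoulder, and that each motor accepts a normalized speed in [-1, 1]:

```python
def hands_to_motor_speeds(left_hand_y, right_hand_y, shoulder_y, reach=0.5):
    """Map hand heights (relative to shoulder level) to left/right motor speeds.

    Each hand independently drives one motor, tank-style: a hand above the
    shoulder drives its motor forward, below the shoulder drives it in
    reverse, and the distance from shoulder level sets the speed.
    `reach` is an assumed arm-reach scale (in the same units as the Y
    coordinates) used to normalize the height difference.
    """
    def normalize(hand_y):
        # Height above the shoulder as a fraction of reach, clamped to [-1, 1].
        return max(-1.0, min(1.0, (hand_y - shoulder_y) / reach))

    return normalize(left_hand_y), normalize(right_hand_y)


# Both hands raised above the shoulders: straight ahead, full speed.
print(hands_to_motor_speeds(1.5, 1.5, shoulder_y=1.0))   # (1.0, 1.0)

# Both hands lowered: reverse.
print(hands_to_motor_speeds(0.5, 0.5, shoulder_y=1.0))   # (-1.0, -1.0)

# Right hand low, left hand high: the robot pivots to the right.
print(hands_to_motor_speeds(1.25, 0.75, shoulder_y=1.0))  # (0.5, -0.5)
```

Because the two motors are driven independently, raising both hands equally yields straight-line motion, while any height difference between the hands produces a proportional turn, exactly the differential-drive behavior the demo describes.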
Here is the video: