Tuesday, March 19, 2013

2-8 Basic Skeletal Following without SLAM

In parallel with the design and construction of the rocker-bogie prototype, I've been investigating navigation and mapping techniques using visual SLAM.  SLAM (Simultaneous Localization and Mapping) is a technique that uses landmarks and probabilistic estimation to:
  1. discover landmarks and build a map of the robot's environment, and
  2. combine that map with current sensor readings to estimate the robot's position and orientation on the map.
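To make step 2 a little more concrete, here's a toy sketch (this is illustrative Python, not porterbot code, and the hallway/landmark setup is entirely made up): given a handful of known landmark positions along a 1D hallway, we weight each candidate robot position by how well the distance to its nearest landmark matches what the range sensor actually measured.

```python
import math

# Hypothetical 1D hallway: landmark (say, doorway) positions already on the map.
LANDMARKS = [2.0, 5.0, 9.0]  # meters along the hallway

def gaussian(x, mu, sigma):
    """Probability density of measuring x when the true value is mu."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def localize(candidates, measured_range, sigma=0.5):
    """Score each candidate position by how well the distance to its
    nearest landmark agrees with the measured range, then normalize
    into a probability distribution over candidates."""
    weights = []
    for p in candidates:
        expected = min(abs(p - lm) for lm in LANDMARKS)
        weights.append(gaussian(measured_range, expected, sigma))
    total = sum(weights)
    return [w / total for w in weights]

# The robot sees a landmark 1.0 m away; candidates whose nearest landmark
# is about 1.0 m off get the most probability mass.
belief = localize([0.5, 1.0, 2.5, 7.0], measured_range=1.0)
```

Real SLAM also has to *discover* the landmarks (step 1) and cope with ambiguity when several positions explain the reading equally well, but this is the flavor of the probabilistic update.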

As a precursor to training porterbot to map out and navigate around household environments, I wanted to use a Microsoft Kinect to experiment with visual SLAM.  The Kinect has an API for getting vision and skeleton data, so my goal was to take this data and try to perform exploration and mapping on a single level of a house.  I looked around for a reasonably affordable platform based on the Kinect and found TurtleBot, a platform that uses the Kinect for vision, an iRobot Create for movement, and a user's laptop for processing and control.
TurtleBot ready to roam

I already had a Kinect, so I ordered the TurtleBot platform parts and purchased an iRobot Create separately.  The TurtleBot chassis is very straightforward to assemble; it took me less than an hour to put it all together.  It comes with a custom build of ROS (Robot Operating System) to get you jump-started.  Instead of using ROS, I decided to use Microsoft Robotics Developer Studio because it has a very detailed API for making full use of the Kinect.  It also has built-in support for communicating with the Create, so I thought things would go very smoothly.  I had some difficulty working with Robotics Studio, but eventually managed to get a Service talking to both the Create and the Kinect simultaneously.



For my first goal, I wanted to get skeleton data from the Kinect, find the distance to the tracked person's hip-center joint, and then send commands to the Create to rotate so the person stayed centered.  I won't post code at this point since the Robotics Studio code is largely throw-away while I learn the process.  I used my kids (Ethan, Zoe, and Phoebe) as subjects for the initial experiment, which, as you can see in the above video, worked pretty well.  The USB cables connecting the Create and Kinect to my laptop were pretty annoying, but I'll figure out some way to secure the laptop for further testing.  On to SLAM mapping!
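Since I'm not posting the Robotics Studio code, here's a rough Python sketch of the centering logic (the function name, gains, and the Create's sign convention are all my own assumptions, not Kinect or Robotics Studio APIs): take the hip-center joint's position in Kinect camera space, compute the bearing to the person, and turn proportionally toward them, with a small deadband so the robot doesn't jitter when they're already roughly centered.

```python
import math

def follow_command(hip_x, hip_z, deadband_deg=5.0, gain=0.5, max_turn=0.3):
    """Turn toward the tracked person's hip-center joint.

    hip_x, hip_z: joint position in Kinect camera space (meters);
    x is lateral offset, z is distance straight out from the sensor.
    Returns (distance_m, turn_rate), where turn_rate is a signed,
    clamped rotation command for the Create (sign convention assumed).
    """
    distance = math.hypot(hip_x, hip_z)
    bearing = math.degrees(math.atan2(hip_x, hip_z))  # 0 deg = dead center
    if abs(bearing) < deadband_deg:
        turn = 0.0  # close enough to centered; don't chase noise
    else:
        # Proportional control on the bearing, clamped to a safe turn rate.
        turn = max(-max_turn, min(max_turn, gain * math.radians(bearing)))
    return distance, turn

# Person standing 2 m straight ahead: no rotation needed.
# Person off to the side: turn toward them at the clamped rate.
```

The deadband plus clamping is the key design choice here: skeleton joint positions are noisy frame to frame, so commanding raw proportional output makes the base oscillate around the target.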