Robot localization using AR tags

asked 2014-12-19 07:29:50 -0500

updated 2014-12-21 00:51:10 -0500

Hello,

I am trying to make a robot localization system using AR tags positioned on the ceiling. For that I am using the ar_track_alvar package, a Turtlebot2 robot, and a Kinect pointing upwards. The idea is to compute and publish the map->odom transformation, using the fixed map->ar_marker transformation and the camera->ar_marker transformation obtained from the ar_track_alvar node.
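
To make the idea explicit, the composition I am after is essentially the one below (just a sketch, with T_a_b meaning the pose of frame b expressed in frame a; the function name is only for illustration):

#include <tf/transform_datatypes.h>

// T_map_odom = T_map_marker * T_marker_camera * T_camera_odom
//            = T_map_marker * (T_camera_marker)^-1 * (T_odom_camera)^-1
tf::Transform computeMapToOdom(const tf::Transform& map_to_marker,
                               const tf::Transform& camera_to_marker,
                               const tf::Transform& odom_to_camera)
{
  return map_to_marker * camera_to_marker.inverse() * odom_to_camera.inverse();
}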

This is the tf tree of the system I am using.

This is part of the code I'm using to get and publish the map->odom transformation. The full code is here.

const tf::Quaternion quaternion(qr_positions_vector_[i].pose.orientation.x,
                                qr_positions_vector_[i].pose.orientation.y,
                                qr_positions_vector_[i].pose.orientation.z,
                                qr_positions_vector_[i].pose.orientation.w);

const tf::Vector3 position(qr_positions_vector_[i].pose.position.x,
                           qr_positions_vector_[i].pose.position.y,
                           qr_positions_vector_[i].pose.position.z);

tf::Transform map_to_qr(quaternion, position);


// 1. get the inverse of map->qr to obtain qr->map
tf::Stamped<tf::Pose> qr_to_map_stamped(map_to_qr.inverse(), qr_pose_time, qr_frame);

// 2. get the odom->map transform by transforming qr->map into the odom frame
tf::Stamped<tf::Pose> odom_to_map;

try{
  bOK = true;

  tf::StampedTransform odom_to_qr;
  tf_listener_->lookupTransform(odom_frame, qr_frame, qr_pose_time, odom_to_qr);

  // get the required transform (qr->map expressed in the odom frame)
  tf_listener_->transformPose(odom_frame, qr_to_map_stamped, odom_to_map);

}
catch (tf::TransformException& ex){
  ROS_ERROR("%s", ex.what());
  //ros::Duration(1.0).sleep();
  bOK = false;
}

if(bOK)
{
  // 3. get the inverse of odom->map to obtain map->odom
  // (copy the stamped pose into a plain tf::Transform, then invert it)
  tf::Transform map_to_odom(tf::Quaternion(odom_to_map.getRotation()), tf::Point(odom_to_map.getOrigin()));

  tf::StampedTransform map_to_odom_stamped(map_to_odom.inverse(), qr_pose_time, map_frame, odom_frame);


  // 4. ignore the z position
  map_to_odom_stamped.setOrigin(tf::Vector3(map_to_odom_stamped.getOrigin().x(),
                                            map_to_odom_stamped.getOrigin().y(),
                                            0.0));

  // 5. keep only the yaw (set roll and pitch to 0)
  double yaw = tf::getYaw(map_to_odom_stamped.getRotation());
  map_to_odom_stamped.setRotation(tf::createQuaternionFromYaw(yaw));

  // publish map->odom as a geometry_msgs/PoseWithCovarianceStamped
  qr_pose_msg_ = get_msg_from_tf(map_to_odom_stamped);
  ar_pose_pub_.publish(qr_pose_msg_);

  // publish transform map->odom on the tf tree
  if(publish_tf)
  {
    this->tf_broadcaster_->sendTransform(map_to_odom_stamped);
  }

} // end if(bOK)
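
(For completeness, get_msg_from_tf is just a small helper that wraps the stamped transform into the message; roughly something like this, with the covariance left at zero:)

geometry_msgs::PoseWithCovarianceStamped get_msg_from_tf(const tf::StampedTransform& transform)
{
  geometry_msgs::PoseWithCovarianceStamped msg;
  msg.header.stamp = transform.stamp_;
  msg.header.frame_id = transform.frame_id_;
  tf::poseTFToMsg(transform, msg.pose.pose);
  return msg;
}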

The problem I have is that I only get an approximation of the robot's localization. Sometimes it works relatively well, but most of the time the map->odom transformation varies by approximately 1 meter, or even more. I would expect more accurate results, because the camera->ar_marker (and odom->camera) transformations are very accurate and only vary by 1 or 2 cm.

Here is a short video of rviz with the problem I'm having. And here is a bag file with the topics used.

My guess is that something weird is happening with the tf timestamps when requesting the transformation with transformPose(...), but I haven't found a solution to the problem yet.
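
One thing I was thinking of trying, to rule out a timing problem, is to explicitly wait for the transform to become available at the marker's timestamp before doing the lookup (sketch only, using the same variables as above; the timeout value is arbitrary):

if (tf_listener_->waitForTransform(odom_frame, qr_frame, qr_pose_time, ros::Duration(0.5)))
{
  tf_listener_->transformPose(odom_frame, qr_to_map_stamped, odom_to_map);
}
else
{
  ROS_WARN("odom->%s not available at t=%f", qr_frame.c_str(), qr_pose_time.toSec());
}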

Any help or hint would be appreciated.

Thanks.
