The robot_localization package is just a Kalman filter that fuses sensor data to provide a filtered estimate of your robot's state. The package can currently accept messages of type:

  • nav_msgs/Odometry
  • sensor_msgs/Imu
  • geometry_msgs/PoseWithCovarianceStamped
  • geometry_msgs/TwistWithCovarianceStamped

from a plethora of sources. Where those sources come from is up to you. The package supports 3D localisation, and 2D localisation with two_d_mode set to true; a minimal configuration is sketched after the list below. Please see the documentation for how to use this. Specifically referring to your picture,

  • Odometry would be supplied by a tachometer or wheel encoders, with the robot's kinematic constraints relating those sensor readings to the motion of the base
  • IMU data would come from an Inertial Measurement Unit, for which the manufacturer most likely provides a driver
  • GPS provides a pose in a global reference frame, which is transformed into a frame the filter can use by navsat_transform_node, also part of the package (see the second sketch after this list)
  • Camera odometry would be produced by a visual odometry package such as rtabmap or viso
  • 3D sensing appears to represent an Xbox Kinect in this image, with which you could perform the visual odometry above, or even Iterative Closest Point (ICP) odometry by interpreting the depth data as a point cloud

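To make the above concrete, below is a minimal sketch of a parameter file for ekf_localization_node that fuses wheel odometry and an IMU with two_d_mode enabled. The topic names (wheel/odometry, imu/data) and the choice of which fields to fuse are assumptions for this example; the parameter names themselves (two_d_mode, odom0, imu0, odom0_config, ...) are from the package documentation.

```yaml
# Minimal ekf_localization_node parameter sketch (2D, wheel odometry + IMU).
# Topic names are placeholders; point them at your own sensors.
frequency: 30
two_d_mode: true              # fuse everything in the XY plane only

map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom             # continuous (non-GPS) data is fused in the odom frame

# Wheel odometry: fuse x/y velocity and yaw velocity.
# Config order: [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az]
odom0: wheel/odometry
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]

# IMU: fuse yaw, yaw velocity and x acceleration.
imu0: imu/data
imu0_config: [false, false, false,
              false, false, true,
              false, false, false,
              false, false, true,
              true,  false, false]
imu0_remove_gravitational_acceleration: true
```

Load this file onto the node's private namespace (e.g. via rosparam in your launch file) and the filter will start publishing once the input topics are active.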
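
For the GPS input in particular, navsat_transform_node converts sensor_msgs/NavSatFix fixes into a nav_msgs/Odometry message in your world frame, which you then feed back into the filter as an additional odometry source (e.g. odom1). A hedged parameter sketch, with placeholder values you would need to determine for your own robot:

```yaml
# navsat_transform_node parameter sketch; all values are illustrative only.
# Default subscriptions: gps/fix (sensor_msgs/NavSatFix), imu/data, odometry/filtered.
# Default publication: odometry/gps, which can be fed back into the EKF.
frequency: 30
magnetic_declination_radians: 0.0   # look this up for your location
yaw_offset: 0.0                     # IMU-specific yaw offset; see the package docs
zero_altitude: true                 # pairs well with two_d_mode
broadcast_utm_transform: false
publish_filtered_gps: false
wait_for_datum: false
```
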
Based on the sources you supply to the state estimation nodes, they will provide a 3D (or 2D) estimate of your robot's pose, published by default on the odometry/filtered topic. This package is often used in collaboration with gmapping for Simultaneous Localisation And Mapping (SLAM).

Thanks, Grant.