
Robot localization fusion

asked 2020-06-19 12:33:27 -0500 by Aquas
updated 2020-06-22 02:17:56 -0500

Hello!!

I have to fuse two sources: IMU data (accelerometer and gyroscope) and another source that provides the difference between the previous position and the current one, given as a translation vector and a quaternion.

The IMU data is in the base_link frame and the other source is in the reference frame (I selected odom). I thought of using a PoseWithCovarianceStamped for the second source, but I'm not sure whether robot_localization works with changes in pose, so my questions are:

Is there any way I can fuse them with robot_localization? Since the second source does not give a complete pose but a change in pose, do I have to modify this data? Should I add up all the previous estimations plus the current one to get the "global" pose (instead of the variation) and send that to the EKF along with the IMU?
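
Just to make the second source concrete, this is roughly the kind of message I had in mind (a sketch only, not my real code; the delta variables and the covariance are placeholders):

#include <ros/ros.h>
#include <geometry_msgs/PoseWithCovarianceStamped.h>
#include <tf2/LinearMath/Vector3.h>
#include <tf2/LinearMath/Quaternion.h>

// Sketch: pack the translation vector and quaternion that describe the change
// since the previous position into a PoseWithCovarianceStamped.
geometry_msgs::PoseWithCovarianceStamped makeDeltaPoseMsg(
    const tf2::Vector3& delta_translation,
    const tf2::Quaternion& delta_rotation,
    const ros::Time& stamp)
{
  geometry_msgs::PoseWithCovarianceStamped msg;
  msg.header.stamp = stamp;
  msg.header.frame_id = "odom";   // the reference frame I selected

  // Note that this "pose" is really a change in pose, not an absolute pose.
  msg.pose.pose.position.x = delta_translation.x();
  msg.pose.pose.position.y = delta_translation.y();
  msg.pose.pose.position.z = delta_translation.z();
  msg.pose.pose.orientation.x = delta_rotation.x();
  msg.pose.pose.orientation.y = delta_rotation.y();
  msg.pose.pose.orientation.z = delta_rotation.z();
  msg.pose.pose.orientation.w = delta_rotation.w();
  // msg.pose.covariance would be filled with my (rough) covariance estimate.

  return msg;
}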

Thank you.

EDIT:

I was trying to fuse just my estimation to see how it performed. This is the YAML config:

frequency: 5
sensor_timeout: 2

two_d_mode: false

transform_time_offset: 0.0
transform_timeout: 0.0

print_diagnostics: false
debug: true
debug_out_file: /home/javier/pruebas/MasPruebas/Data/file.txt
publish_tf: true
publish_acceleration: false

map_frame: map              # Defaults to "map" if unspecified
odom_frame: odom            # Defaults to "odom" if unspecified
base_link_frame: erlecopter/imu_link  # Defaults to "base_link" if unspecified
world_frame: odom           # Defaults to the value of odom_frame if unspecified

odom0: /Camera_estimation
odom0_config:  [false,  false,  false,
                false,  false,  false,
                true,  true,   true,
                true,  true,  true,
                false,  false,  false]
odom0_differential: false
odom0_relative: true

odom0_queue_size: 2


#imu0: /erlecopter/imu
#imu0_config:  [false, false, false,
#               false, false, false,
#               false, false, false,
#               true,  true,  true,
#               true, true, true]
#imu0_relative: true
#imu0_queue_size: 100


process_noise_covariance:  [diag(small values)]


initial_estimate_covariance: [diag(1e-6)]

And the main code is something like this:

#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <nav_msgs/Odometry.h>

// stamp, seq, tras, angles, Prev_angles, Prev_time, Covariance_default and
// Pub_estimation are globals set elsewhere; imuCallback and ScaleCallback are
// also omitted here.

void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
   /*Estimate displacements*/

      double interval=(stamp-Prev_time).toSec();

      nav_msgs::Odometry pub_msg;
      pub_msg.header.seq=seq;
      pub_msg.header.stamp=stamp;
      pub_msg.header.frame_id="odom";

      pub_msg.child_frame_id="odom";

      // Velocities: displacement between images divided by the time interval
      pub_msg.twist.twist.linear.x=tras.at<double>(0)/interval;
      pub_msg.twist.twist.linear.y=tras.at<double>(1)/interval;
      pub_msg.twist.twist.linear.z=tras.at<double>(2)/interval;

      pub_msg.twist.twist.angular.x=(angles.roll-Prev_angles.roll)/interval;
      pub_msg.twist.twist.angular.y=(angles.pitch-Prev_angles.pitch)/interval;
      pub_msg.twist.twist.angular.z=(angles.yaw-Prev_angles.yaw)/interval;

      pub_msg.twist.covariance=Covariance_default;

      Pub_estimation.publish(pub_msg);

      Prev_time=stamp;
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "image2angles");
  ros::NodeHandle nh;
  image_transport::ImageTransport it(nh);
  image_transport::Subscriber sub_img = it.subscribe("/erlecopter/bottom/image_raw", 1, imageCallback);
  ros::Subscriber sub_h = nh.subscribe("/erlecopter/imu", 1, imuCallback);
  ros::Subscriber sub_h2 = nh.subscribe("/odometry/filtered", 1, ScaleCallback);
  Pub_estimation = nh.advertise<nav_msgs::Odometry>("Camera_estimation", 1);
  ros::spin();
  return 0;
}

I've checked the Camera_estimation topic and the messages are being published, but odometry/filtered isn't publishing anything.


1 Answer


answered 2020-06-19 12:40:17 -0500 by parzival

Assuming your second source is giving the change in pose, which divided by the time interval is a velocity, you can use an odom input in the robot_localization package along with your IMU source.

This should work:

odom0_config: [false, false, false, 
              false, false, false,
              true, true, false,
              false, false, true,
              false, false, false]

odom0_differential: false
odom0_queue_size: 2
odom0_relative: true

imu0: /imu/data

imu0_config: [false, false, false,
              false, false, true,
              false, false, false,
              false, false, true,
              true, false, false]

imu0_differential: false
imu0_queue_size: 5
imu0_relative: true
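
One detail worth double-checking against the node that publishes the odom source: robot_localization takes the twist part of a nav_msgs/Odometry message to be expressed in the frame named by child_frame_id and transforms it into the base_link_frame, so child_frame_id should name the frame the velocities are really expressed in, and a transform to the base_link frame has to exist. A minimal sketch, assuming the camera velocities are body-frame (the frame names follow the configs above and are assumptions about this particular setup):

#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <geometry_msgs/Twist.h>

// Sketch only: stamp a twist-only Odometry message so the filter knows which
// frame the velocities are expressed in.
nav_msgs::Odometry makeTwistOnlyOdom(const geometry_msgs::Twist& body_twist,
                                     const ros::Time& stamp)
{
  nav_msgs::Odometry msg;
  msg.header.stamp = stamp;
  msg.header.frame_id = "odom";                 // frame of the (unused) pose part
  msg.child_frame_id = "erlecopter/imu_link";   // frame the twist is expressed in
  msg.twist.twist = body_twist;
  return msg;
}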

Comments

Then I should divide the translations/rotations by the time between the measurements and store those values in the TwistWithCovariance field of the odom message?

Aquas ( 2020-06-20 04:06:13 -0500 )

"another source that provides the difference between the previous position and the current one"

If you're getting the difference already, then you can simply divide it by the time interval to obtain the velocities.
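
For example, something along these lines (just a sketch with placeholder names, using tf2 to turn the incremental rotation into roll/pitch/yaw rates; fine as long as the rotation between consecutive measurements stays small):

#include <geometry_msgs/Twist.h>
#include <tf2/LinearMath/Matrix3x3.h>
#include <tf2/LinearMath/Quaternion.h>
#include <tf2/LinearMath/Vector3.h>

// Sketch: convert a pose increment (translation + quaternion) measured over
// an interval dt into the twist that goes into the odom message.
geometry_msgs::Twist deltaToTwist(const tf2::Vector3& delta_translation,
                                  const tf2::Quaternion& delta_rotation,
                                  double dt)
{
  geometry_msgs::Twist twist;

  // Linear velocity: translation increment over the interval.
  twist.linear.x = delta_translation.x() / dt;
  twist.linear.y = delta_translation.y() / dt;
  twist.linear.z = delta_translation.z() / dt;

  // Angular velocity: incremental rotation as roll/pitch/yaw over the interval.
  double roll, pitch, yaw;
  tf2::Matrix3x3(delta_rotation).getRPY(roll, pitch, yaw);
  twist.angular.x = roll / dt;
  twist.angular.y = pitch / dt;
  twist.angular.z = yaw / dt;

  return twist;
}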

parzival ( 2020-06-20 08:08:04 -0500 )

I have tried it, and looking at the debugging file it says that it cannot transform into the base_link frame. The velocities are given in the odom frame, and I'd like to know where the base_link (attached to my UAV) is with respect to odom.

Aquas ( 2020-06-21 09:36:00 -0500 )

Share your config file for robot_localization and the code which is publishing the odom source

parzival ( 2020-06-21 09:38:22 -0500 )

I've added the relevant parts of the code to the main question

Aquas ( 2020-06-22 02:18:45 -0500 )

Have you tried changing the odom config to as I suggested? What was the result?

parzival ( 2020-06-22 02:29:45 -0500 )

If I use yours (taking into account that mine is not a 2D movement) it does work, but it returns an output whose displacement is completely wrong: I get ~100 m when it should be around 1 m (probably because the accelerometer measurement is not the best thing to fuse). But I think it is not fusing the other measurement (the non-IMU one). If it were, it should also work when no IMU is available.

Aquas ( 2020-06-22 09:24:53 -0500 )

My config was for 2D or planar movement; I don't have experience with 3D. But I guess you can set the corresponding entries to true based on the above and get it working. As for the wrong displacement, using my config without any changes can be an issue. Another cause might be that the odometry calculation itself isn't right. Have you checked just the odometry data and verified how accurate it is? You can check http://wiki.ros.org/navigation/Tutori... for verifying.

parzival ( 2020-06-26 00:28:49 -0500 )
