
robot_localization problems

asked 2018-09-28 21:02:13 -0600 by JakeSheng

updated 2018-09-29 15:37:55 -0600

I'm using the robot_localization package on Ubuntu 16.04. I want to fuse pose data (from orb_slam2), IMU data, and a GPS signal, all collected on a DJI Matrice 100, but the result looks strange and I'm not sure whether my configuration is right. The GPS odometry (the green arrows in the picture) does not follow the ground-truth path (a rectangular path at a fixed height), and when GPS is fused, odometry/filtered_map (the red arrows) fluctuates badly along all axes and its path doesn't make sense. I'll put my configuration file here; please help me check it for errors. Thanks!

I'm sorry that I can't upload files or images because I don't have enough points. Please see my Google Drive folder instead: google drive. The bag file is also in that folder. The pose output of orb_slam is recorded as well and can be fed directly into the ekf node.

The tf between base_link and the IMU or GPS is set to [0 0 0 0 0 0] because, for now, I'm not sure how the IMU and GPS are mounted on the drone. This might be a problem.

/orb_slam/posewithcov is the pose information given by orb_slam2.

The IMU reads 0 when the drone is facing northeast, so I set yaw_offset to 0.7853 (π/4 rad, i.e. 45°). I'm not sure this is right, because the heading of the drone may not be the heading of the IMU.

<launch>
  <arg name="output_final_position" default="false"/>
  <arg name="output_location" default="test.txt"/>

<param name="/use_sim_time" value="true" />

<node pkg="tf2_ros" type="static_transform_publisher" name="imu_link" args="0 0 0 0 0 0 base_link body_FLU" />
<node pkg="tf2_ros" type="static_transform_publisher" name="bl_gps" args="0 0 0 0 0 0 base_link gps" />

<node name="test_ekf_localization_node_ekf" pkg="robot_localization" type="ekf_localization_node" clear_params="true">

  <param name="frequency" value="30"/>
  <param name="sensor_timeout" value="0.1"/>
  <param name="two_d_mode" value="false"/>

  <param name="map_frame" value="map"/>
  <param name="odom_frame" value="odom"/>
  <param name="base_link_frame" value="base_link"/>
  <param name="world_frame" value="odom"/>

  <param name="transform_time_offset" value="0.0"/>
  <param name="transform_timeout" value="0.0"/>

  <param name="pose0" value="/orb_slam/posewithcov"/>
  <param name="imu0" value="/dji_sdk/imu"/>

  <rosparam param="pose0_config">[true , true , true ,
                                  false, false, false,
                                  false, false, false,
                                  false, false, false,
                                  false, false, false]</rosparam>

  <rosparam param="imu0_config">[false, false, false,
                                 true , true , true ,
                                 false, false, false,
                                 true , true , true ,
                                 true , true , true ]</rosparam>

  <param name="pose0_differential" value="false"/>
  <param name="imu0_differential" value="false"/>

  <param name="pose0_relative" value="false"/>
  <param name="imu0_relative" value="false"/>

  <param name="imu0_remove_gravitational_acceleration" value="false"/>

  <param name="print_diagnostics" value="true"/>

  <param name="pose0_queue_size" value="10"/>
  <param name="imu0_queue_size" value="10"/>

  <param name="debug"           value="false"/>
  <param name="debug_out_file"  value="debug_ekf_localization.txt"/>

  <rosparam param="process_noise_covariance">[0.05, 0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                                              0,    0.05, 0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                                              0,    0,    0.06, 0,    0,    0,    0,     0,     0,    0,    0,    0 ...

Comments

@JakeSheng, did you manage to solve this? I'm doing something similar, but using only the IMU and GPS on the Matrice 100. I find that my filtered odometry estimates just jump around.

marinePhD (2018-11-20 10:36:20 -0600)

1 Answer


answered 2018-11-09 06:14:56 -0600 by Tom Moore

There's quite a bit going on here!

If you can update your question and add some sample messages from each sensor source, I'd appreciate it.

I'd be very careful about fusing multiple sources of absolute pose information. I don't know enough about the SLAM package in question to comment on how it behaves, but if it diverges at all or closes a loop poorly (is it running live SLAM, or just providing a pose estimate in your map?), then the SLAM position estimate and the GPS position estimate are going to differ wildly, and the filter will just jump back and forth between those state estimates.

The same goes for your IMU data. You are fusing absolute orientation data from your IMU, but there's no guarantee that it will match your SLAM positions. For example, if you start your robot in a pose where the IMU reads pi/4 radians and then drive straight forward one meter, your SLAM package, if it isn't using your IMU's absolute orientation, is probably going to report an XY position of (1, 0). That disagrees with the IMU orientation, which suggests you should be at (0.707, 0.707) after driving forward one meter.
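
(Those numbers are just x = d·cos(θ), y = d·sin(θ) for driving distance d = 1 m: a heading of θ = 0 gives (1, 0), while θ = π/4 gives (cos(π/4), sin(π/4)) ≈ (0.707, 0.707).)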

In any case, if you are fusing multiple absolute pose sources, you need to make sure they are in the same coordinate frame, or that they have a transform defined that will make them so.
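
For example, if the SLAM map frame turned out to be rotated relative to the frame your GPS data is reported in, a static transform along these lines would declare that relationship (the orb_slam_map frame name and the 1.57 rad yaw here are made up for illustration; use whatever your setup actually requires):

  <node pkg="tf2_ros" type="static_transform_publisher" name="slam_map_to_map"
        args="0 0 0 1.57 0 0 map orb_slam_map" />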

My suggestion to you is to start with just the "tier 1" (odom frame) EKF. Get that looking the way you want, then move on to the second tier (map frame) EKF. If I were you, I'd probably fuse my full 3D pose from the SLAM package in the odom-frame EKF, and maybe just fuse the angular velocities from the IMU. If you lack a velocity measurement source, I'd stop fusing the linear acceleration data from the IMU, too.
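
Something along these lines for the first tier, reusing the topic names from your launch file (a sketch to adapt, not a drop-in config):

  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_odom" clear_params="true">
    <param name="frequency" value="30"/>
    <param name="two_d_mode" value="false"/>

    <param name="map_frame" value="map"/>
    <param name="odom_frame" value="odom"/>
    <param name="base_link_frame" value="base_link"/>
    <param name="world_frame" value="odom"/>

    <!-- full 3D pose (x, y, z, roll, pitch, yaw) from orb_slam2 -->
    <param name="pose0" value="/orb_slam/posewithcov"/>
    <rosparam param="pose0_config">[true,  true,  true,
                                    true,  true,  true,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false]</rosparam>

    <!-- angular velocities only from the IMU: no absolute orientation,
         no linear acceleration -->
    <param name="imu0" value="/dji_sdk/imu"/>
    <rosparam param="imu0_config">[false, false, false,
                                   false, false, false,
                                   false, false, false,
                                   true,  true,  true,
                                   false, false, false]</rosparam>
  </node>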


Comments

Hi Tom, thank you very much for the answer and suggestion! Recently, I decided to write a simple sensor fusion node for our drone, and now I have some questions.

  1. My IMU topic type is sensor_msgs/Imu. If I want to get the x, y, z velocity, do I just need to integrate the x, y, z acceleration?

JakeSheng (2018-11-14 00:56:59 -0600)

  2. The EKF updates its prediction using one input and one observation, right? I've decided to use the IMU as the input. What about the SLAM position and the GPS position? Do I need to combine them into one observation?

JakeSheng (2018-11-14 01:01:20 -0600)

In robot_localization, the EKF's predict step uses the system model to compute the predicted state, and all of the sensor messages are treated as measurements in the correction step. In that respect it is similar to robot_pose_ekf.
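
(For reference, that's the standard EKF cycle: predict x̂ = f(x), P̂ = F P Fᵀ + Q with the kinematic motion model f, then for each queued measurement z correct with K = P̂ Hᵀ (H P̂ Hᵀ + R)⁻¹, x = x̂ + K (z − H x̂), P = (I − K H) P̂.)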

nlwmode (2019-12-12 20:02:00 -0600)
