Cheers for adding your config and sensor data. Very helpful.
Right off the bat, you stated yourself that

> Pose from the visual SLAM. This is W.R.T. the starting position. Also, this is using the classic "camera frame" with Z facing forward.

However, you have the frame_id of that pose data set to odom, and you also told the EKF that your world_frame was odom. ROS assumes a right-handed coordinate frame with +X forward, +Y left, and +Z up. While it's common for visual odometry sources to produce data in the camera frame, you need to account for that by (a) changing the frame_id in the visual odometry to be something like camera (you can name it whatever you want) and then (b) publishing a static transform (using, e.g., a static_transform_publisher) from odom to camera that captures the rotation from the coordinate frame I described to your camera frame.
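For reference, that static transform can be published with a one-liner like the one below. This is only a sketch: it assumes the camera follows the usual optical convention (X right, Y down, Z forward) and sits at the frame origin with no translation, so adjust the offsets, angles, and frame names for your actual mounting.

rosrun tf2_ros static_transform_publisher 0 0 0 -1.5708 0 -1.5708 odom camera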
Secondly, you said you want to fuse the absolute orientation from your IMU with the position from your visual SLAM, but that's not going to work. Imagine that you start your robot facing 90 degrees in the ENU frame, so your initial pose in the world frame is (0, 0) with a heading of 90 degrees. Then you push the robot forward one meter. The visual odometry says your (x, y) position is now (1, 0), but your IMU will say that your heading is still 90. The robot really ended up at (0, 1) in the world frame, so it will look like your robot moved laterally in the EKF output.
If having a heading that is accurate w.r.t. the ENU frame is critical, then you need to change your visual SLAM input. You should either make it so that the visual SLAM node publishes velocity data and fuse that, or you should turn on differential mode for the visual SLAM data (all of it). In other words, fix the frame_id on your visual odometry, then do this:
imu0: "imu/data"
imu0_differential: false
imu0_relative: false
imu0_queue_size: 100
imu0_remove_gravitational_acceleration: true
imu0_config: [false, false, false,
true, true, true,
false, false, false,
true, true, true,
true, true, true]
pose0: "odometry/visual"
pose0_differential: true
pose0_relative: false
pose0_queue_size: 10
pose0_config: [true, true, true,
true, true, true,
false, false, false,
false, false, false,
false, false, false]
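If you go the velocity route instead of differential mode, the idea is to fuse the linear velocity from the visual SLAM node as a twist input. A rough sketch, assuming your node can publish a geometry_msgs/TwistWithCovarianceStamped on a topic like odometry/visual/twist (the topic name is just an example):

twist0: "odometry/visual/twist"
twist0_queue_size: 10
twist0_config: [false, false, false,
                false, false, false,
                true,  true,  true,
                false, false, false,
                false, false, false]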