
Robot_localization for fusing model pose and SLAM odometry

asked 2018-04-04 09:03:32 -0500

JSandu

updated 2018-04-05 08:45:52 -0500

Hi there,

I'm using the robot_localization package to fuse several data inputs. I have a theoretical model that estimates an ideal odometry for my robot based only on my commands and the robot model. On the other hand, I also feed a SLAM pose estimate into the EKF.

My robot is a quadcopter and it has yaw drift: its yaw value changes slowly over time, and that drift is not captured by the theoretical model estimate. As a result, the model and SLAM orientations diverge very quickly (within a minute or so). Given enough time there can even be 90° between the two orientations, so it is as if one of them were not complying with REP 103.

The question is: what would be the best way to fuse these estimates? Should I associate the "model yaw" with the estimated pose, or is the EKF robust against orientation differences? Or maybe I should periodically reset the theoretical pose estimate?

Thank you in advance, Juan.


1 Answer


answered 2018-05-22 04:08:12 -0500

Tom Moore

Please see this section in the wiki. If I were you, I'd fuse the SLAM orientation as an absolute pose variable, and fuse the quadcopter yaw with differential set to true. That, or I'd make your quadcopter odometry message also produce velocities, and just fuse those instead.
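For reference, a minimal sketch of what that ekf_localization_node parameter file could look like for the first option. The topic names (`/slam/odometry`, `/model/odometry`) and the choice of which variables to fuse are assumptions for illustration; adapt them to your own setup:

```yaml
frequency: 30
two_d_mode: false

# SLAM pose: fuse position and yaw as absolute values
odom0: /slam/odometry
odom0_config: [true,  true,  true,    # x, y, z
               false, false, true,    # roll, pitch, yaw
               false, false, false,   # vx, vy, vz
               false, false, false,   # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az
odom0_differential: false

# Model odometry: fuse its drifting yaw differentially, i.e. as the
# change between consecutive measurements, so its absolute offset
# from the SLAM yaw does not fight the filter
odom1: /model/odometry
odom1_config: [false, false, false,
               false, false, true,
               false, false, false,
               false, false, false,
               false, false, false]
odom1_differential: true
```

With `odom1_differential: true`, only the incremental yaw change from the model is used, so the slow absolute drift never accumulates into a conflict with the SLAM orientation.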



Thank you so much, Tom.

JSandu ( 2018-10-01 13:42:29 -0500 )
