
Robot_localization for fusing model pose and SLAM odometry

asked 2018-04-04 09:03:32 -0600

JSandu

updated 2018-04-05 08:45:52 -0600

Hi there,

I'm using the robot_localization package to fuse several data inputs. I have a theoretical model that estimates an ideal odometry for my robot based only on my commands and the robot model. On the other hand, I also feed a SLAM estimate into the EKF.

My robot is a quadcopter and it has yaw drift: its yaw value changes slowly over time, and that drift is not captured by the theoretical model estimate. As a result, the model and SLAM orientations diverge very quickly (in a minute or so). Given enough time there can be as much as 90º between the two orientations, so it is as if one of them were not complying with REP 103.

The question is: what would be the best way to fuse these estimates? Should I associate the "model yaw" with the estimated pose, or is the EKF robust to orientation differences? Or maybe I should periodically reset the theoretical pose estimate?

Thank you in advance, Juan.


1 Answer


answered 2018-05-22 04:08:12 -0600

Tom Moore

Please see this section in the wiki. If I were you, I'd fuse the SLAM orientation as an absolute pose variable, and fuse the quadcopter yaw with differential set to true. That, or I'd make your quadcopter odometry message also produce velocities, and just fuse those instead.
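For concreteness, here is a minimal sketch of what that setup might look like in an ekf_localization_node parameter file. The topic names are assumptions; the `odomN_config` matrix is the package's standard 15-value boolean layout (x, y, z, roll, pitch, yaw, their velocities, then linear accelerations):

```yaml
# Sketch only -- topic names are placeholders, not from the original thread.
frequency: 30

# SLAM output: fuse yaw as an absolute pose variable.
odom0: /slam/odometry            # assumed topic name
odom0_config: [false, false, false,
               false, false, true,    # absolute yaw from SLAM
               false, false, false,
               false, false, false,
               false, false, false]
odom0_differential: false

# Model-based odometry: fuse yaw differentially, so its slow
# drift contributes only incremental changes, not an absolute heading.
odom1: /model/odometry           # assumed topic name
odom1_config: [false, false, false,
               false, false, true,    # yaw, but differentiated below
               false, false, false,
               false, false, false,
               false, false, false]
odom1_differential: true
```

With `odom1_differential: true`, robot_localization converts successive poses from that source into velocities before fusing them, which is what keeps the drifting absolute yaw from fighting the SLAM estimate. The alternative Tom mentions is equivalent: publish yaw velocity in the model's odometry message and fuse the velocity field directly.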



Thank you so much, Tom.

JSandu (2018-10-01 13:42:29 -0600)
