Ask Your Question

Start odometry with bias depending on Pose

asked 2018-04-11 04:26:38 -0500 by kwint

Hi, I want to fuse two localization methods: odometry and a PoseWithCovariance sensor. When the robot starts, it already has a position and orientation from the PoseWithCovariance sensor (e.g., x=4 and y=3), but the odometry starts at (0, 0, 0). How do I make sure that both systems start at the same position?

Of course, the two estimates will drift apart as time goes on, which is expected. To get an accurate position I'm going to use an EKF from robot_localization, so I don't need a constant transform between the two, only an alignment at the start.

My odometry uses the odom and base_link frames (as is standard), and the Pose messages have frame_id odom.

What is the best way to do this?


1 Answer


answered 2018-04-11 07:57:33 -0500 by stevejp

In the robot_localization yaml config file you can set the relative parameter for your pose sensor (e.g., pose0_relative: true). This makes the filter fuse all measurements from that sensor relative to the sensor's first measurement. For example, if your first and second pose0 measurements were (4, 3) and (5, 1), they would be fused as (0, 0) and (1, -2).
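A minimal sketch of what that might look like in the EKF config. The topic name and the choice of fusing x, y, and yaw are assumptions for illustration, not taken from the question; the 15-element boolean vector follows robot_localization's (x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az) ordering:

```yaml
# ekf_localization_node config fragment (hypothetical topic name)
frequency: 30
two_d_mode: true

pose0: /tracking_sensor/pose      # your PoseWithCovariance source (assumed name)
pose0_config: [true,  true,  false,   # fuse x, y
               false, false, true,    # fuse yaw
               false, false, false,
               false, false, false,
               false, false, false]
pose0_relative: true               # measurements fused relative to the first one
```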

On another note - be careful when fusing absolute position measurements from multiple sources. You might consider fusing only the velocity measurements from your odometry, or setting the differential parameter for your odometry input (e.g., odom0_differential: true), which converts successive pose measurements into velocities before fusing them.
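The velocity-only alternative could look like the fragment below. Again, the topic name is an assumption; here only vx, vy, and yaw velocity from the odometry are fused, so the odometry contributes no absolute position at all:

```yaml
odom0: /wheel_odometry             # assumed topic name
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,   # fuse vx, vy
               false, false, true,    # fuse yaw velocity
               false, false, false]
# Or keep the pose fields in odom0_config and set:
# odom0_differential: true
```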



So if I set both the pose and the odometry to relative, they both start at (0, 0), right? I'll look into the velocity option. Thank you!

kwint (2018-04-11 08:17:38 -0500)

