
How to tell if odom localization is "good enough" for the nav2 controller

asked 2022-07-22 08:51:36 -0600

hermanoid

We're using the Nav2 stack for ROS 2 Galactic, and we've been having some serious problems with navigation. We're using VSLAM for our global positioning and wheel encoders for local positioning, and we're seeing very glitchy behavior from the controller (lots of jerking around, and it has a hard time deciding that it's pointing in the right direction). Furthermore, the robot dramatically deviates from the path our SMAC planner finds around corners, by up to 12 inches, often causing it to collide with obstacles. We are not using the robot_localization package because we are not sure it is necessary.

Using RVIZ, we can tell that the position the global planner is getting seems quite good, but we don't know how to verify whether the odometry position we're feeding into the controller is "good enough" for it to follow the path smoothly and accurately. Setting the fixed frame in RVIZ to "odom" doesn't really give enough information - visually, the position looks fine, but we can't see how the controller is reacting to it.

RVIZ allows us to "see" what the global planner is seeing to help diagnose issues with it. Is there a setting within RVIZ or even a dedicated tool we can use to "see" what the controller is seeing? Or does the situation I described scream out some obvious problem with our configuration?


Comments


What controller? Also, if you get your robot and make it follow a path around a known space (e.g. start from a pose, drive around for a few minutes, then end back at the pose) how much odometric drift are you seeing?

Without IMU, your orientation might not be great. Fusing IMU + encoders is the most common use of R_L for mobile robots, since that helps significantly. (A sketch of that loop-drift check follows this comment.)

stevemacenski  ( 2022-07-22 14:41:08 -0600 )
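For anyone who wants to run the loop test Steve describes above, here is a minimal sketch. It assumes wheel odometry is published as nav_msgs/Odometry on /odom at roughly 30 Hz; both the topic name and the rate are assumptions, so adjust them for your robot. The node records the first pose it sees and keeps reporting how far the current odometry estimate has drifted from it; after driving a closed loop back to the physical start point, the reported offset is your accumulated odometric error.

```python
#!/usr/bin/env python3
# Minimal odometric-drift check: record the first odometry pose, then report how far
# the current estimate is from it. Drive a closed loop back to the start and read off
# the residual offset. Topic name (/odom) and ~30 Hz message rate are assumptions.
import math

import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry


def yaw_of(q):
    # Planar yaw from a quaternion (yaw component of the standard conversion).
    return math.atan2(2.0 * (q.w * q.z + q.x * q.y),
                      1.0 - 2.0 * (q.y * q.y + q.z * q.z))


class OdomDriftCheck(Node):
    def __init__(self):
        super().__init__('odom_drift_check')
        self.start = None
        self.count = 0
        self.create_subscription(Odometry, 'odom', self.on_odom, 10)

    def on_odom(self, msg):
        pose = msg.pose.pose
        if self.start is None:
            self.start = pose
            self.get_logger().info('Start pose recorded; drive a loop back to it.')
            return
        self.count += 1
        if self.count % 30:          # print roughly once per second at ~30 Hz odom
            return
        dx = pose.position.x - self.start.position.x
        dy = pose.position.y - self.start.position.y
        dyaw = yaw_of(pose.orientation) - yaw_of(self.start.orientation)
        dyaw = math.atan2(math.sin(dyaw), math.cos(dyaw))  # wrap to [-pi, pi]
        self.get_logger().info(
            f'offset from start: {math.hypot(dx, dy):.3f} m, {math.degrees(dyaw):.1f} deg')


def main():
    rclpy.init()
    rclpy.spin(OdomDriftCheck())


if __name__ == '__main__':
    main()
```

As a rough rule of thumb (not from the original thread): a few centimeters and a degree or two of residual drift over a short indoor loop is usually workable for a local controller, while tens of centimeters or several degrees will make it fight the odometry the whole way.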

Thanks for the tip, Steve! Getting our robot to navigate in a stable fashion has been an ongoing struggle, unfortunately. We decided to go down the sensor fusion route, but we've yet to get it fully stable, mostly because most of our odometry sources have issues like not following REP-105, not producing covariance matrices, etc. In simulation, where we did get fusion working (because the simulated sensors have none of the issues of real life), we had much more success using DWB rather than RPP when turning (it's much less prone to jerking around or missing the target). But DWB still struggles on the tight corners we must make into narrow lanes, probably because of the elongated footprint of our robot. I'll likely be making a separate question for that soon.

hermanoid  ( 2022-08-04 09:39:33 -0600 )

If anyone does have a tip for viewing/debugging the controller, I'm still interested. The controllers seem to have all sorts of options for producing more information than just the trajectory you can view in RVIZ, but I'm not sure how to use them productively. (One way to switch those options on is sketched after this comment.)

hermanoid  ( 2022-08-04 09:40:40 -0600 )
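On the question of seeing what the controller sees: DWB does expose a set of debug publishers that are off by default and can be enabled through parameters (publish_evaluation, publish_trajectories, publish_local_plan, publish_transformed_plan, publish_cost_grid_pc, debug_trajectory_details, from the nav2_dwb_controller parameter list; verify the names against your Nav2 release). A sketch of where the keys attach, assuming the controller plugin is named FollowPath and with your_nav2_params.yaml standing in as a placeholder for your existing Nav2 configuration:

```python
# Sketch: enable DWB's debug/visualization publishers on the Nav2 controller server.
# Parameter names come from nav2_dwb_controller; 'your_nav2_params.yaml' is a placeholder
# for your existing Nav2 configuration, and 'FollowPath' is the usual plugin name.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    debug_overrides = {
        'FollowPath': {
            # Candidate trajectories and their per-critic scores.
            'publish_evaluation': True,
            'publish_trajectories': True,
            # The plan as the controller actually sees it.
            'publish_transformed_plan': True,
            'publish_local_plan': True,
            # Local cost grid as a point cloud, plus per-trajectory detail logging.
            'publish_cost_grid_pc': True,
            'debug_trajectory_details': True,
        },
    }
    return LaunchDescription([
        Node(
            package='nav2_controller',
            executable='controller_server',
            parameters=['your_nav2_params.yaml', debug_overrides],
        ),
    ])
```

If the controller server is already running, the same switches can be flipped live, e.g. `ros2 param set /controller_server FollowPath.publish_evaluation true` (assuming the node is named /controller_server). The marker and point-cloud topics can then be added in RViz, while the evaluation topic can be echoed to inspect critic scores per candidate trajectory.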

1 Answer


answered 2022-08-16 16:26:25 -0600

hermanoid

updated 2022-08-16 16:28:54 -0600

For anyone looking here for a really nice way to debug the controller... Sorry, I've got nothing for you. Here's what I ended up doing.

Our global position data was provided by ORB VSLAM 2, and it was good enough to be fused directly with IMU and wheel odometry data in robot_localization and used as both our global and local positioning provider. We set up a static zero transform from map -> odom and pointed robot_localization at odom -> base_link. By doing this, we could see from the global frame everything that was going on in the local frame, because there was no difference. That let us figure out what was causing the controller to act so erratically.
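A minimal launch sketch of that arrangement, assuming two_d_mode, wheel odometry on /wheel_odom, an IMU on /imu/data, and the VSLAM pose as a PoseWithCovarianceStamped on /vslam/pose (all of these names and the fused fields are assumptions, not values from the original setup):

```python
# Sketch of the setup described above: an identity map->odom transform plus a single
# robot_localization EKF owning odom->base_link and fusing wheel odometry, IMU and the
# VSLAM pose. Topic names, two_d_mode and the *_config selections are assumptions.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # Static zero transform so the map and odom frames coincide.
        Node(
            package='tf2_ros',
            executable='static_transform_publisher',
            arguments=['0', '0', '0', '0', '0', '0', 'map', 'odom'],
        ),
        # EKF publishing odom -> base_link.
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',
            parameters=[{
                'frequency': 30.0,
                'two_d_mode': True,
                'map_frame': 'map',
                'odom_frame': 'odom',
                'base_link_frame': 'base_link',
                'world_frame': 'odom',
                # Each *_config is the usual 15-element robot_localization mask:
                # [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az]
                # Wheel odometry: fuse forward/lateral velocity and yaw rate.
                'odom0': '/wheel_odom',
                'odom0_config': [False, False, False, False, False, False,
                                 True, True, False, False, False, True,
                                 False, False, False],
                # IMU: fuse absolute yaw and yaw rate.
                'imu0': '/imu/data',
                'imu0_config': [False, False, False, False, False, True,
                                False, False, False, False, False, True,
                                False, False, False],
                # VSLAM: fuse absolute x, y and yaw.
                'pose0': '/vslam/pose',
                'pose0_config': [True, True, False, False, False, True,
                                 False, False, False, False, False, False,
                                 False, False, False],
            }],
        ),
    ])
```

Note that this deliberately collapses the usual REP-105 split (normally the SLAM node or a second EKF owns map -> odom); the identity transform is what let the local and global views coincide for debugging.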

Basically, our localization was awful because of a weird orientation glitch (and, because I know what you're all thinking: no, it was not related to incorrect quaternion math. Probably). Because of this, it was subject to errors of ±0.25 meters. That was the core of the issue, but fusing in an IMU as Steve recommended was also necessary before we started to see decent results when navigating in any scenario other than a straight line. The Adafruit BNO055 worked swimmingly for this.

