
move_base local planner deviates significantly from global planner path

asked 2020-12-17 11:48:48 -0500 by tootyboi95, updated 2020-12-17 23:36:52 -0500

Hi ROS community,

I need some help with a project that uses the move_base package on Ubuntu 16.04 with ROS Kinetic. I have looked through most of the move_base issues that others have faced, but I did not find anything similar in the forum. Help is greatly appreciated!

I am running this on a robot dog that I recently acquired (Unitree A1). The robot has an internal IMU, which lets me derive the base_link → odom tf and publish the /odom message, following this ROS tutorial (link). I used this, together with laser scans from a 2D lidar, for the HectorSLAM and AMCL packages, and it worked well. I can also teleop the robot dog using the teleop_twist_keyboard package. Here is a rough image of the system's tf tree (note: this is not the latest diagram; I have tried to keep all nodes publishing at 10 Hz).

tf frames: https://drive.google.com/file/d/1mC1M...

Successful AMCL video: https://youtu.be/BET4nThOA5E
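For reference, the odometry setup is roughly the sketch below; get_pose_from_imu() is a stub standing in for the IMU integration from the tutorial, not a real Unitree API:

    import rospy
    import tf
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import Quaternion

    def get_pose_from_imu():
        # Placeholder: return (x, y, yaw) integrated from the A1's internal
        # IMU. This is a stub, not a real Unitree API call.
        return 0.0, 0.0, 0.0

    rospy.init_node("a1_odom_publisher")
    odom_pub = rospy.Publisher("odom", Odometry, queue_size=10)
    br = tf.TransformBroadcaster()
    rate = rospy.Rate(10)  # everything at 10 Hz, as in the question

    while not rospy.is_shutdown():
        x, y, yaw = get_pose_from_imu()
        q = tf.transformations.quaternion_from_euler(0.0, 0.0, yaw)
        now = rospy.Time.now()
        # Broadcast odom -> base_link, matching the tf tree in the image.
        br.sendTransform((x, y, 0.0), q, now, "base_link", "odom")
        # Publish the matching /odom message for move_base's local planner.
        odom = Odometry()
        odom.header.stamp = now
        odom.header.frame_id = "odom"
        odom.child_frame_id = "base_link"
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        odom.pose.pose.orientation = Quaternion(*q)
        odom_pub.publish(odom)
        rate.sleep()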

I have spent a long time tuning the move_base parameters. While the global planner does its job of planning a relatively straight path from the start pose to the goal pose, the local planner makes the robot move in a very unpredictable manner, often deviating from the global path. Here are some RViz videos showing the issue. I mapped an office environment using an RPLidar and initialized the AMCL package with origin x=0, y=0, theta=0 (at the origin where I started mapping, facing east). In all of these videos, the arrows correspond well to the real (observed) location of the robot. There is some lag because I am viewing RViz on a server that is remote from the robot.
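For reference, that AMCL initialization corresponds to the following parameters (these are standard amcl parameters; the /amcl node name assumes a default launch, and the values match the origin pose above):

    import rospy

    # Set AMCL's starting pose estimate before the amcl node launches
    # (equivalently, put these in the amcl launch file).
    rospy.set_param("/amcl/initial_pose_x", 0.0)
    rospy.set_param("/amcl/initial_pose_y", 0.0)
    rospy.set_param("/amcl/initial_pose_a", 0.0)  # theta, facing east (+x)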

Failed Attempt 1: https://youtu.be/zjztpArHyRk As you can see, the robot does weird rotations at the starting pose before heading into a dead end, deviating even further from the goal pose.

Failed Attempt 2: https://youtu.be/CUEi0Uyg1lg You will notice that the robot spins off the path and ends up facing a wall again.

Successful Attempt 3: https://youtu.be/IDlVw3i-xfI Out of many attempts today, only this one worked relatively well. Even so, the robot deviates from the goal pose once it gets close to it.

I have looked through most of the other posts about move_base issues. One post explains how to resolve robots moving in random circles (link). However, in the second video (failed attempt) you can see that I displayed the axes of the map coordinate system. I have checked the coordinate systems of odom, base_link, and laser, and all of them face the same way (x facing east, y facing north, z out of the plane). So my robot's base_link tf is oriented in the right ... (more)


3 Answers


answered 2021-04-05 16:28:58 -0500 by buzzport93, updated 2021-04-06 23:04:59 -0500 by jayess

Hello!

I am also working with the Unitree A1's Slamtec package and have been struggling to get their navigation stack to work with 2D Nav Goal! Every time I send a command via 2D Nav Goal in RViz, the robot seems confused (often failing to reach the destination I asked for), and when it does work, it takes quite a while to get to the goal (it spins left and right and readjusts itself far too much).

I am assuming you have been having similar issues, and I am starting to wonder whether the generated cmd_vel is designed for wheeled robots like the TurtleBot. I want to write my own cmd_vel publisher for a quadruped, but I am not sure how to integrate it with the output of the global/local planner. (See the sketch below for what I have in mind.)
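For example, a thin adapter node that subscribes to the planner's cmd_vel and clamps it to what the quadruped gait can actually track; send_to_a1() below is a placeholder, not a real Unitree API, and the velocity limits are illustrative:

    import rospy
    from geometry_msgs.msg import Twist

    def send_to_a1(vx, vy, wz):
        # Placeholder: forward velocities to the A1's high-level walking
        # controller (e.g. via the Unitree SDK). Not a real API call.
        pass

    def cmd_vel_cb(msg):
        # move_base publishes the local planner's chosen velocity here;
        # clamp it to what the quadruped gait can actually follow.
        vx = max(-0.3, min(0.3, msg.linear.x))
        vy = max(-0.2, min(0.2, msg.linear.y))
        wz = max(-0.5, min(0.5, msg.angular.z))
        send_to_a1(vx, vy, wz)

    rospy.init_node("a1_cmd_vel_adapter")
    rospy.Subscriber("cmd_vel", Twist, cmd_vel_cb)
    rospy.spin()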

I would really appreciate hearing back from you about your progress; knowing someone else who is working on the A1 would be a big help going forward. I am at Georgia Tech, and I am in the process of building a SLAM box with an Ouster lidar, a VectorNav IMU, and an Intel NUC so that I can correct for odometry drift.

Sincerely,


Comments


Hi buzzport93,

No problem at all.

I would need more information. Which part of the Slamtec package are you using? I believe it comes with both a SLAM package and a planner (or navigator) package.

If you are using the ROS Navigation Stack instead of the Slamtec navigation stack, debug using these steps:

1) Is the global planner giving a reasonable path to the goal? If so, move on.

2) Is the local planner of the navigation stack failing to plan properly? Verify by sending a 2D Nav Goal and visualizing the short line output by the local planner in RViz (see the sketch after this comment). You can make it more visible by increasing sim_time and min_vel. If the local plans look very inappropriate, you will need to refine the parameters.

3) If (2) is OK, it might indeed be the cmd_vel sent by the planner to the robot. You might ...(more)

tootyboi95 (2021-04-06 21:02:05 -0500)
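To make step (2) above concrete: the stock planners publish their plans as nav_msgs/Path topics, so a small watcher node can show how far the local plan strays from the global one. A rough sketch, assuming the default navfn + TrajectoryPlannerROS topic names; adjust the namespaces for your configuration:

    import rospy
    from nav_msgs.msg import Path

    def report(name):
        def cb(msg):
            if msg.poses:
                end = msg.poses[-1].pose.position
                rospy.loginfo("%s: %d poses, ends at (%.2f, %.2f)",
                              name, len(msg.poses), end.x, end.y)
        return cb

    rospy.init_node("plan_watcher")
    # Default topic names for the stock planners; change to match your setup.
    rospy.Subscriber("/move_base/TrajectoryPlannerROS/local_plan",
                     Path, report("local plan"))
    rospy.Subscriber("/move_base/NavfnROS/plan", Path, report("global plan"))
    rospy.spin()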

Hi tootyboi95. Good to hear from you!

I am not sure how your A1 was configured, but we got it with the SLAMTEC S1 lidar option. When we received the robot, catkin_ws/src already contained SLAMTEC's SDK, which uses ROS costmap_2d and navfn; that is what I am using.

So yesterday, I finally got the robot to walk to chosen locations after a critical fix. The fix was changing base_local_planner to dwa_local_planner (not the original 'inclusive' dwa = true setting; see the parameter sketch below). After adjusting some parameters (velocity and acceleration), I achieved this: https://www.youtube.com/watch?v=ibmLm....

The most critical issue in the video above is that the robot doesn't know when to stop properly with respect to heading angle. I intended the robot to move to that chair, so that part worked out nicely, but the robot seems confused about the final heading angle. Is this just going ...(more)

buzzport93 (2021-04-07 19:54:54 -0500)
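For anyone making the same switch: it boils down to pointing move_base's base_local_planner parameter at the DWA plugin before move_base starts. The velocity values below are illustrative, not the ones actually used:

    import rospy

    # Must be set before the move_base node loads its planner plugin
    # (equivalently, put these in the move_base launch/param files).
    rospy.set_param("/move_base/base_local_planner",
                    "dwa_local_planner/DWAPlannerROS")
    rospy.set_param("/move_base/DWAPlannerROS/max_vel_x", 0.4)    # illustrative
    rospy.set_param("/move_base/DWAPlannerROS/acc_lim_x", 1.0)    # illustrative
    rospy.set_param("/move_base/DWAPlannerROS/max_rot_vel", 0.8)  # illustrative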

If you have used amcl, I would like some help! I have already generated the pgm and yaml files for my environment (the place you see in the YouTube video), and I would like the robot to navigate the area autonomously with the map already in its 'brain'. I have looked at several tutorials, but they seem confusing. What changes do you need to make to your launch files and yaml parameter files? Any reference/help would be greatly appreciated. Once again, thank you :)

buzzport93 (2021-04-07 19:57:26 -0500)

Hi,

I'm not too familiar with Slamtec's API, but I think it may use something similar to HectorSLAM for mapping, since it doesn't require any odometry input. That API also only works with RPLidars. Since I had issues with my RPLidar, I swapped it out and used my own autonomy pipeline instead of the Slamtec one.

tootyboi95 (2021-04-10 23:57:36 -0500)

As for the poor convergence issue: the robot seems unable to converge accurately on the target pose, hence the continuous readjustment. Consider increasing the goal XY and yaw tolerances. I could set these using the ROS navigation stack, but I'm not sure whether you can do so through the SLAMTEC API.
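With the navigation stack, DWAPlannerROS exposes these through dynamic_reconfigure, so you can even loosen them at runtime. A sketch with illustrative values:

    import rospy
    from dynamic_reconfigure.client import Client

    rospy.init_node("goal_tolerance_tuner")
    client = Client("/move_base/DWAPlannerROS", timeout=5)
    # Loosen how close (m) and how well aligned (rad) the robot must be
    # before the goal counts as reached; values are illustrative.
    client.update_configuration({"xy_goal_tolerance": 0.25,
                                 "yaw_goal_tolerance": 0.3})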

tootyboi95 (2021-04-10 23:59:16 -0500)

As for AMCL, perhaps consider something like this: https://github.com/tue-robotics/robot...

Note that you will need a map server to load the pgm and yaml map files and publish them as a topic for AMCL to subscribe to.

As for specific parameters, definitely check the AMCL wiki page, and make sure the topics are defined correctly so subscribing and publishing work as expected: http://wiki.ros.org/amcl
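A quick sanity check that the map server side is wired up: after starting map_server with your yaml file, the map should arrive once on the latched /map topic.

    import rospy
    from nav_msgs.msg import OccupancyGrid

    rospy.init_node("map_check")
    # map_server latches /map, so this returns immediately if it is running.
    grid = rospy.wait_for_message("/map", OccupancyGrid, timeout=10.0)
    rospy.loginfo("map: %dx%d cells at %.3f m/cell",
                  grid.info.width, grid.info.height, grid.info.resolution)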

Hope it helps!

tootyboi95 (2021-04-11 00:39:56 -0500)

answered 2020-12-30 05:10:03 -0500 by miura

I'd like to comment on something that caught my attention.

You have set vx_samples and vtheta_samples to real numbers, but these parameters take integers. move_base may not be interpreting them properly.

I also think acc_lim_theta, max_vel_theta, and min_vel_theta are too small; the planner may not be able to generate the angular velocity needed to turn.
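In concrete terms, something like the following (values are illustrative; the key point is that the sample counts are integers):

    import rospy

    # Sample counts must be integers, not floats.
    rospy.set_param("/move_base/TrajectoryPlannerROS/vx_samples", 8)
    rospy.set_param("/move_base/TrajectoryPlannerROS/vtheta_samples", 20)
    # Give the planner enough angular authority to actually turn (illustrative).
    rospy.set_param("/move_base/TrajectoryPlannerROS/acc_lim_theta", 3.0)
    rospy.set_param("/move_base/TrajectoryPlannerROS/max_vel_theta", 1.0)
    rospy.set_param("/move_base/TrajectoryPlannerROS/min_vel_theta", -1.0)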


Comments


Hi miura, thanks for your feedback! I eventually realized that the robot was acting strangely because of natural drift from its IMU. Correcting that helps a lot.

I also had the chance to increase the theta limits, and that helps a lot too. Thanks again!

tootyboi (2021-01-01 00:06:39 -0500)

answered 2022-11-25 00:26:37 -0500

If any of you are still actively working on Unitree dogs, please stop by and see us on Slack; we have a small community of Unitree users struggling through development issues together. Take care. https://join.slack.com/t/robotdogs/sh...

