Improve turtlebot localization in simulation for real world experiments

asked 2019-08-01 07:54:02 -0500

Tima1995

updated 2019-08-01 08:00:05 -0500

Hey,

I am currently implementing different reinforcement learning algorithms on my Turtlebot for navigation tasks. In preparation for transferring them to the real Turtlebot after training, I am trying to make the localization "more real". Because of friction and slip effects on the real robot, the perfect simulated odometry does not deliver "realistic" coordinates for path planning. Therefore I am looking for ways to make the simulation more realistic. I already have some thoughts about it:

  1. I found the libgazebo_ros_diff_drive.so plugin, which is used as the controller in simulation. There I can switch

    <odometrySource>world</odometrySource> to <odometrySource>encoder</odometrySource>

    so that the odometry is integrated from the (simulated) encoders instead of taken directly from the Gazebo world (and is no longer perfect). This way I hope effects like slipping enter the simulation.

  2. I am also thinking about using tf. If I have a look in RViz, I can see that the origins of the map and odom coordinate frames start to drift apart over time. Here it might be useful to look at rqt_tf_tree, which shows the following structure (with the encoder setting):

    map => odom => base_footprint => base_link => wheel_left_link, wheel_right_link, imu_link, base_scan, center_back_link

    Would it be sufficient to connect map and base_link directly in order to get closer to reality?
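For intuition, the chained frames above can be composed as 2D rigid transforms. A minimal sketch (plain Python, no ROS dependencies; the transform values are invented for illustration) showing how drift in the map→odom transform propagates into the map→base_link pose:

```python
import math

def compose(a, b):
    """Compose two 2D poses (x, y, theta): result = a applied before b."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

# Hypothetical transforms along the tf chain map -> odom -> base_footprint -> base_link
map_to_odom = (0.5, -0.2, 0.1)                  # drift correction maintained by the localizer
odom_to_base_footprint = (2.0, 1.0, 0.3)        # dead-reckoned odometry pose
base_footprint_to_base_link = (0.0, 0.0, 0.0)   # typically a fixed, zero-offset transform

map_to_base_link = compose(compose(map_to_odom, odom_to_base_footprint),
                           base_footprint_to_base_link)
print(map_to_base_link)
```

Publishing a direct map→base_link transform would bypass this chain, but the standard convention (REP 105) is that a localizer such as amcl only publishes map→odom, and downstream tools expect that, so correcting map→odom is the usual approach rather than connecting map and base_link directly.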

It would be great if you have other ideas on how to make the localization of the robot more realistic.

Kind regards, Marcel


1 Answer


answered 2019-08-01 11:52:00 -0500

I would recommend modifying the plugin for better noise/slip modelling. It publishes the true position of the robot; you could add noise in accordance with the noise you see on your real encoders. Modelling accurate weight, inertia, acceleration, and coefficients of friction for the materials of the robot and environment will be critical for Gazebo to do its job and give you reasonable slipping. With those two things you can probably get something pseudo-realistic, but I doubt the slipping will be consistent, since I don't believe anything in Gazebo models multi-point-of-contact tires effectively.

http://docs.ros.org/jade/api/gazebo_p...
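As a sketch of that idea: post-process the plugin's perfect odometry with encoder-like noise and a slip term before feeding it to the planner. The noise magnitudes below are invented placeholders to tune against real encoder logs; in a ROS node this would be applied to each incoming nav_msgs/Odometry increment (plain Python here so it runs without ROS):

```python
import math
import random

def noisy_odometry(dx, dy, dtheta, slip_factor=0.05,
                   trans_sigma=0.01, rot_sigma=0.005, rng=random):
    """Corrupt one odometry increment (robot-frame dx, dy, dtheta).

    slip_factor scales translation down, mimicking wheel slip;
    trans_sigma / rot_sigma are per-step Gaussian noise levels
    (placeholder values, not measured on any real robot).
    """
    dx_n = (1.0 - slip_factor) * dx + rng.gauss(0.0, trans_sigma)
    dy_n = (1.0 - slip_factor) * dy + rng.gauss(0.0, trans_sigma)
    dtheta_n = dtheta + rng.gauss(0.0, rot_sigma)
    return dx_n, dy_n, dtheta_n

def integrate(increments, noisy=False, rng=None):
    """Dead-reckon a pose from robot-frame increments."""
    x = y = theta = 0.0
    for dx, dy, dtheta in increments:
        if noisy:
            dx, dy, dtheta = noisy_odometry(dx, dy, dtheta, rng=rng)
        x += dx * math.cos(theta) - dy * math.sin(theta)
        y += dx * math.sin(theta) + dy * math.cos(theta)
        theta += dtheta
    return x, y, theta

# Drive straight for 100 steps: the noisy track drifts away from ground truth,
# which is the behaviour you see on a real robot's odometry.
steps = [(0.02, 0.0, 0.0)] * 100
truth = integrate(steps)
noisy = integrate(steps, noisy=True, rng=random.Random(0))
print(truth, noisy)
```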


Comments

Thanks for your answer! Yes, that is definitely something I will try to improve. But I still have a problem when getting the localization data: if I take it from Gazebo, it is perfect and not comparable to what the odometry of the real Turtlebot delivers. That's why I am thinking about switching localization from odometry to something like the AMCL package, to get a pose estimate from the scan data.

Tima1995 (2019-08-02 03:12:31 -0500)

Yes, you can definitely do that. Localization is one of those fields I tend not to test in simulation because, as you mention, it's hard to model well, and robots (at least in my experience) are plentiful for collecting test data. That obviously doesn't work for your online training, though.

Best of luck!

stevemacenski (2019-08-02 13:01:30 -0500)
