
Problem visualizing a random walk in rviz and Gazebo for a Husky A200

asked 2015-08-24 04:08:40 -0500

ktiwari9

updated 2015-08-25 01:23:21 -0500

Hi all, I am trying to simulate random walk behaviour for a Husky A200 in both Gazebo and rviz, so that I can set up my world in Gazebo and see the sensor feedback in rviz. I can see the Husky perform the random walk in Gazebo, but in rviz I can only see the wheels spinning and the robot does not move. My source code is here. Please advise why the Husky moves in Gazebo but not in rviz.

My most recent tf tree is shown here: [tf tree image]



Do you have the fixed_frame in rviz set to map? If it is base_link, you'll see everything relative to the robot, hence the wheels spinning but the robot not "moving". You obviously need to have some kind of localization for this...

mgruhler (2015-08-24 05:44:46 -0500)

Hi @mig, currently my fixed_frame is set to base_link. Do you mean I should change it to map? And do I also need localization, or do I only need to change the fixed_frame to map?

ktiwari9 (2015-08-24 08:22:03 -0500)

1 Answer


answered 2015-08-25 01:01:31 -0500

mgruhler

I'll add this as an answer now:

The problem you have is (probably) that you visualize everything in rviz with respect to base_link, i.e. the robot's own frame. This is why it looks like only the wheels are spinning in rviz. Thus, you have to set the fixed_frame to a frame that is fixed in the world.

You could use the odometry frame (usually odom) to get better behaviour. There, the motion of the robot is accumulated as seen from internal sensors (wheel encoders, IMU, ...). However, those sensors are not accurate over time, so you'll see drift there as well.

The best way would be to have a localization running and use the global frame as reference (usually map). You can use e.g. amcl for this, but you require a map. You could also write (or use, if one already exists) a node that directly takes the real-world pose from Gazebo and publishes the respective transform (map -> odom). There is the fake_localization package, which might be suitable for your use case.
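The core of such a node is just transform arithmetic: given the ground-truth pose of the robot in map (e.g. from Gazebo) and the pose that odometry reports in odom, the correction to broadcast is map->odom = (map->base) ∘ (odom->base)⁻¹. A minimal 2D (SE(2)) sketch, with made-up example poses:

```python
import math

def compose(a, b):
    """SE(2) composition a . b, poses given as (x, y, theta)."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + bx * math.cos(ath) - by * math.sin(ath),
            ay + bx * math.sin(ath) + by * math.cos(ath),
            ath + bth)

def invert(p):
    """Inverse of an SE(2) pose."""
    x, y, th = p
    return (-x * math.cos(th) - y * math.sin(th),
            x * math.sin(th) - y * math.cos(th),
            -th)

# Hypothetical inputs: ground-truth pose in map (from Gazebo) and the
# drifted pose that odometry reports in odom.
map_to_base = (2.0, 1.0, math.pi / 2)
odom_to_base = (1.8, 1.1, math.pi / 2 - 0.05)

# The correction a (fake) localization node would broadcast as map -> odom:
map_to_odom = compose(map_to_base, invert(odom_to_base))

# Sanity check: map->odom composed with odom->base reproduces map->base.
check = compose(map_to_odom, odom_to_base)
```

In a real node you would feed map_to_base from Gazebo's model state and odom_to_base from tf, and broadcast map_to_odom with a tf broadcaster; fake_localization does essentially this for you.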

If you want to get information about your current transform (tf) tree, you can use rosrun tf view_frames, which produces a PDF with the respective information. Or use rosrun rqt_tf_tree rqt_tf_tree, which provides the same output, but more dynamically and not as a PDF.



So, as per your suggestion, I have set my fixed_frame to odom now, and I added my tf tree above. In order to add a link between map and odom I used the robot_pose_ekf package. Is the tree correct?

ktiwari9 (2015-08-25 01:24:17 -0500)

Now I can see the robot move, but the motion in rviz is completely different from the one in Gazebo.

ktiwari9 (2015-08-25 01:24:40 -0500)



