
Does move_base require Twist from Odometry?

asked 2019-08-19 16:39:45 -0500

parzival

I'm trying to make a TurtleBot-like robot using only an RPLidar, so I do not have a good source of odometry. I'm getting a Pose2D from laser_scan_matcher, and I can take those values and republish them as nav_msgs/Odometry.

I've also found a way to use these Pose2D values together with the /scan header timestamps to roughly calculate velocities.

But I want to know: is it mandatory to have Twist information for move_base to work, or does move_base only require the Pose from the odometry messages?
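The rough velocity calculation described above can be sketched as a finite difference over two consecutive planar poses (plain Python, no ROS dependencies; the helper name is illustrative, not from any package):

```python
import math

def twist_from_poses(prev, curr, dt):
    """Finite-difference body-frame twist from two (x, y, theta) poses.

    prev, curr: planar poses as (x, y, theta) tuples
    dt:         time between them in seconds (e.g. from /scan header stamps)
    Returns (vx, vy, wz) in the robot's body frame.
    """
    dx = curr[0] - prev[0]
    dy = curr[1] - prev[1]
    # wrap the heading difference into [-pi, pi]
    dth = math.atan2(math.sin(curr[2] - prev[2]), math.cos(curr[2] - prev[2]))
    # rotate the world-frame displacement into the robot (body) frame
    c, s = math.cos(prev[2]), math.sin(prev[2])
    vx = ( c * dx + s * dy) / dt
    vy = (-s * dx + c * dy) / dt
    return vx, vy, dth / dt

# robot moved 0.1 m forward and turned 0.05 rad over 0.1 s:
print(twist_from_poses((0.0, 0.0, 0.0), (0.1, 0.0, 0.05), 0.1))
```

In practice the poses would come from laser_scan_matcher and dt from consecutive message stamps; wrapping the heading difference matters when the robot crosses the ±pi boundary.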


1 Answer


answered 2019-08-20 00:31:05 -0500

PapaG

updated 2019-08-22 17:02:13 -0500

Hi,

move_base itself doesn't require an odometry topic, as per the diagram in the documentation (couldn't get the image link to work). However, the local planner does require information about your motion (specifically velocity, so that it can extrapolate a dynamic path). I am unsure, but move_base may also require the transform chain map -> odom -> base_link. You should publish this transform anyway if you have an odometry source, and it is definitely a prerequisite before attempting navigation. Of course, the local planner is part of move_base, but I would treat it as a separate component while implementing, so you can identify issues in a particular section.
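The tf chain referred to here is map -> odom -> base_link (each arrow parent to child). As a rough, ROS-free illustration of what chaining those transforms means for a planar robot (the function name is mine, not from tf):

```python
import math

def compose_2d(a, b):
    """Compose two planar transforms a∘b, each given as (x, y, yaw).

    E.g. composing map->odom with odom->base_link yields map->base_link,
    which is what tf computes for you when the full chain is published.
    """
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + math.cos(ath) * bx - math.sin(ath) * by,
            ay + math.sin(ath) * bx + math.cos(ath) * by,
            ath + bth)

# identity map->odom (no drift correction yet); robot 1 m forward in odom:
map_to_odom  = (0.0, 0.0, 0.0)
odom_to_base = (1.0, 0.0, math.pi / 2)
print(compose_2d(map_to_odom, odom_to_base))  # map->base_link: (1.0, 0.0, pi/2)
```

In a real setup, a localizer (amcl, gmapping, etc.) publishes map -> odom and your odometry source publishes odom -> base_link; tf composes them exactly like this.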

Perhaps you could use robot_localization to provide this transform and to produce an odometry message after accounting for noise. I have never used laser_scan_matcher myself, but perhaps you could try an Iterative Closest Point (ICP) estimate if you aren't happy with the results of the CSM (Canonical Scan Matcher) estimate?
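For reference, a minimal, untested ekf_localization_node parameter sketch for this setup (the odom0 topic name is illustrative; point it at whatever topic carries the scan-matcher odometry):

```yaml
frequency: 30
two_d_mode: true            # planar robot: ignore z, roll, pitch
map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom           # this instance publishes odom -> base_link
publish_tf: true
odom0: /scan_matcher_odom
# x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
odom0_config: [true,  true,  false,
               false, false, true,
               false, false, false,
               false, false, false,
               false, false, false]
```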


Comments

Hi, yes, robot_localization should work too. I'm currently getting good results with laser_scan_matcher and a custom odometry node, so I'll continue playing around with that for a while. Thanks a lot! You've been a great help!

parzival ( 2019-08-23 00:16:39 -0500 )

@parzival, could you provide a link to your custom odometry node? I take it you're taking odom from tf and converting it to nav_msgs/Odometry, using good-enough values for the covariance? I'm doing exactly what you're doing (with an RPLidar as well :) ) and came to the same conclusion.

Edit: odom is only published on tf when you run laser_scan_matcher with the gmapping demo :) So I suppose you're just using the PoseStamped msg and v = (x - x0)/dt to get the twist, and publishing that.

pring ( 2020-04-18 13:54:39 -0500 )

Wrote this up as a base but can't test it yet due to a broken robot :/

    # assumed imports for this node
    import rospy
    import tf
    from nav_msgs.msg import Odometry

    def pose_callback(self, data):  # data: geometry_msgs/PoseStamped
        pose = data.pose
        ts = data.header.stamp
        # https://answers.ros.org/question/69754/quaternion-transformations-in-python/
        q = pose.orientation
        euler = tf.transformations.euler_from_quaternion([q.x, q.y, q.z, q.w])
        yaw = euler[2]
        # different method:
        # odom.pose.pose.orientation.z = sin(yaw_/2.0)
        # odom.pose.pose.orientation.w = cos(yaw_/2.0)

        dx   = pose.position.x - self.prev_x
        dy   = pose.position.y - self.prev_y
        dyaw = yaw             - self.prev_yaw
        dt   = (ts - self.prev_ts).to_sec()  # rospy.Time difference -> seconds

        odom = Odometry()
        odom.header = data.header        # reuse the pose's stamp and frame
        odom.child_frame_id = 'base_link'
        odom.pose.pose = pose
        odom.pose.covariance[0]  = 0.2   # x; covariances here are likely widely off
        odom.pose.covariance[7]  = 0.2   # y
        odom.pose.covariance[35] = 0.4   # yaw
        if dt > 0:
            odom.twist.twist.linear.x  = dx / dt
            odom.twist.twist.linear.y  = dy / dt
            odom.twist.twist.angular.z = dyaw / dt  # yaw rate goes in angular.z

        # remember state for the next finite difference
        self.prev_x, self.prev_y = pose.position.x, pose.position.y
        self.prev_yaw, self.prev_ts = yaw, ts
pring ( 2020-04-18 14:55:09 -0500 )


Stats


Seen: 291 times

Last updated: Aug 22 '19