Translating between TF frames within a single robot: odometry on a moving part

Hi there everyone, I hope my question is understandable and answerable.

Before I get to my question, I'll attempt to give some background to help frame it. I have built an articulated four-wheel robot. I have designed an accurate 3D model and created a URDF for it. I have two quadrature encoders (one on each non-driven front wheel); it is rear-wheel drive via a differential. My plan is to use these encoders to calculate odometry for the base_link. The base_link is the center pin which the machine articulates around.

Now my problem is, when you articulate, all four wheels move in different directions. If you're articulating to the left for example (as shown in the images), the right-hand front wheel rolls forward and the left-hand front wheel rolls backwards, and the opposite happens for the rear section. The center pin also moves outwards a bit because the robot is bending in half.

So here's the question: since I have the URDF and a joint_state_publisher running, which is updated with the appropriate joint angles of the robot from various nodes, is it possible to translate these encoders' position/distance readings (from the front wheels) back to base_link (the center) to build odometry?
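In principle this should work: on an articulated vehicle every frame (each wheel and the center pin) orbits the same instantaneous center of rotation (ICR), so the angular rate is shared between frames and linear speeds scale with each frame's distance from the ICR. A minimal pure-Python sketch of that scaling, reusing the geometry names from the code below but with made-up constant values (substitute the dimensions from your URDF) — the exact radius expressions are assumptions based on the formulas in your snippet, not verified against your robot:

```python
from math import cos, hypot, tan

# Hypothetical geometry (metres) -- substitute values from your URDF.
centre_to_front = 0.30        # articulation pin to front-axle midpoint
centre_to_rear = 0.30         # articulation pin to rear-axle midpoint
front_centre_to_wheel = 0.25  # front-axle midpoint to each encoder wheel

def base_link_twist(dist, dt, phi, right_wheel=True):
    """Convert the arc length `dist` rolled by one front wheel over `dt`
    into an approximate base_link (center pin) linear/angular velocity.

    All frames share one angular rate about the ICR, so speed scales
    with each frame's radius from the ICR.
    """
    if abs(phi) < 1e-3:              # driving straight: ICR at infinity
        return dist / dt, 0.0
    # radius of the front-axle midpoint about the ICR (same formula as
    # in the question's code)
    ra = (centre_to_front + (centre_to_rear / cos(phi))) / tan(phi)
    # outer wheel is further from the ICR than the inner one
    r_wheel = ra + front_centre_to_wheel if right_wheel else ra - front_centre_to_wheel
    w = (dist / dt) / r_wheel        # shared angular rate about the ICR
    # the center pin sits centre_to_front behind the front-axle midpoint,
    # so its radius from the ICR follows from the right triangle
    r_base = hypot(ra, centre_to_front)
    return w * r_base, w
```

With consistent wheel readings both front wheels should report the same angular rate, which is exactly what would let the second encoder take over when one wheel slips or jams.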

I have this odometry file in which I manually calculate the odometry from the center pin angle and the encoder distance travelled. It works OK, but it only uses one encoder and is not as accurate as my new URDF model would be; if I used robot_localization I could generate two odometry estimates and combine them. Also, if one wheel gets stuck and can't turn freely, the position of the robot will be wrong, but with two encoders the other wheel's odometry would still be tracking and correctly positioning the base_link center pin.
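For the fusion step, robot_localization's EKF node can take the two wheel-odometry topics as separate inputs. A sketch of the parameter file, assuming hypothetical topic names (`/odom_front_right`, `/odom_front_left`) and fusing only velocities, which is the usual choice for wheel odometry:

```yaml
frequency: 20
two_d_mode: true
odom_frame: odom
base_link_frame: base_link
world_frame: odom

odom0: /odom_front_right
odom0_config: [false, false, false,   # x, y, z
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz
               false, false, true,    # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az
odom0_differential: false

odom1: /odom_front_left
odom1_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]
odom1_differential: false
```

Inflating the covariance on a wheel you suspect is slipping lets the filter lean on the other encoder automatically.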

odometry.py

# file not complete, only the calculation part shown for clarity
while not rospy.is_shutdown():
    last_time = current_time
    rospy.sleep(0.05)
    current_time = rospy.Time.now()

    # read current articulation angle for center pin
    angle_new, success, raw = drive.getAngle()
    if angle_new >= -0.5:
        angle = angle_new
    else:
        rospy.loginfo("angleGet failed: %s raw %s", angle_new, raw)

    # read current distance travelled from front right encoder
    # note: each time the distance is read it is automatically reset to zero by the 'simple_drive' node
    dist = drive.readDistance(True)

    # create all variables for odometry
    # first the theta angle from simple_drive
    phi = angle
    dt = (current_time - last_time).to_sec()
    vel = dist / dt

    # update our robot model
    model_joints.header.stamp = current_time
    artic_angle = -1 * phi

    # joint_state_publisher updates the model's joint positions
    model_joints.position = [-artic_angle, 0.0, vel, vel, artic_angle, vel, vel]
    pub.publish(model_joints)

    # if the articulation angle approaches zero, the turning radius approaches infinity
    if -0.01 < phi < 0.01:
        delta_x = dist * cos(th)
        delta_y = dist * sin(th)
        delta_th = 0
    else:
        # turn radius 'ra'
        turn_radius = (centre_to_front + (centre_to_rear / cos(phi))) / tan(phi)
        # thetaA angle
        phiA = atan(centre_to_front / turn_radius)
        # then angular velocity, omega 'w'
        vth = vel / (turn_radius + front_centre_to_wheel)

        # compute odometry in a typical way given the velocities of the robot
        delta_x = -vth * centre_to_front * cos(th + phiA) * dt + vth * turn_radius * sin(th + phiA ...
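Whatever velocities come out of the articulation geometry, the pose integration itself can use the standard planar constant-twist update. A generic sketch (hypothetical — not a reconstruction of the truncated expression above), exact for constant (v, w) over one timestep:

```python
from math import cos, sin

def integrate_pose(x, y, th, v, w, dt):
    """Advance a planar pose (x, y, th) by linear velocity v and
    angular velocity w over dt.

    Uses the exact constant-twist (circular arc) solution; falls back
    to a straight-line step when w is effectively zero.
    """
    if abs(w) < 1e-9:
        return x + v * cos(th) * dt, y + v * sin(th) * dt, th
    return (x + (v / w) * (sin(th + w * dt) - sin(th)),
            y - (v / w) * (cos(th + w * dt) - cos(th)),
            th + w * dt)
```

For example, a quarter-circle of radius 1 (v = w = pi/2, dt = 1) takes the robot from the origin to (1, 1), facing pi/2.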

Comments

Please attach your images to the post itself, instead of linking to a Google drive share.

If (or when) your share disappears, this question (and any potential answers) will be much more difficult to understand.

I've given you sufficient karma.

(2021-01-09 10:58:08 -0600)

I appreciate that linking to Google drive is not a preferred option but at the time I didn't have enough karma points. Thanks for allowing me to upload images.

(2021-01-09 16:02:24 -0600)

Hi everyone, I was just wondering if anybody had had any suggestions for this question? Thanks again.

(2021-01-13 04:56:16 -0600)