
Get odometry from wheels encoders

asked 2016-08-12 01:57:28 -0500 by dottant

updated 2021-10-12 09:51:11 -0500 by lucasw

Hi all, I have a problem computing odometry from wheel encoders. Actually, it isn't a real problem; there is just one step I don't understand.
I read the ticks from the encoders (about 2626 ticks per full wheel rotation) and compute the odometry as follows (this runs in a loop; I'm pasting only the lines relevant to the odometry calculation):

current_time = ros::Time::now();
double DistancePerCount = (3.14159265 * 0.13) / 2626; // wheel circumference (diameter 0.13 m) / ticks per revolution
double lengthBetweenTwoWheels = 0.25; // wheel separation in metres

// tick deltas accumulated since the last update
deltaLeft = tick_x - _PreviousLeftEncoderCounts;
deltaRight = tick_y - _PreviousRightEncoderCounts;

omega_left = (deltaLeft * DistancePerCount) / (current_time - last_time).toSec();
omega_right = (deltaRight * DistancePerCount) / (current_time - last_time).toSec();

v_left = omega_left * 0.065; //radius
v_right = omega_right * 0.065;

vx = ((v_right + v_left) / 2)*10;
vy = 0;
vth = ((v_right - v_left)/lengthBetweenTwoWheels)*10;

double dt = (current_time - last_time).toSec();
double delta_x = (vx * cos(th)) * dt;
double delta_y = (vx * sin(th)) * dt;
double delta_th = vth * dt;

x += delta_x;
y += delta_y;
th += delta_th;

geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);

geometry_msgs::TransformStamped odom_trans;
odom_trans.header.stamp = current_time;
odom_trans.header.frame_id = "odom";
odom_trans.child_frame_id = "base_link";

odom_trans.transform.translation.x = x;
odom_trans.transform.translation.y = y;
odom_trans.transform.translation.z = 0.0;
odom_trans.transform.rotation = odom_quat;

// send the transform (assuming a tf::TransformBroadcaster member named odom_broadcaster)
odom_broadcaster.sendTransform(odom_trans);
// Odometry message
nav_msgs::Odometry odom;
odom.header.stamp = current_time;
odom.header.frame_id = "odom";

// set the position
odom.pose.pose.position.x = x;
odom.pose.pose.position.y = y;
odom.pose.pose.position.z = 0.0;
odom.pose.pose.orientation = odom_quat;

// set the velocity
odom.child_frame_id = "base_link";
odom.twist.twist.linear.x = vx;
odom.twist.twist.linear.y = vy;
odom.twist.twist.angular.z = vth;

// publish the message (assuming a ros::Publisher member named odom_pub)
odom_pub.publish(odom);
_PreviousLeftEncoderCounts = tick_x;
_PreviousRightEncoderCounts = tick_y;

last_time = current_time;

What I don't understand is why I have to multiply vx and vth by 10 (a value I picked by trial and error that seems to work) to make the odometry consistent with the robot's actual movement. Without that factor, the odometry changes far more slowly than the distance the robot really covers.
Maybe I'm making an error when calculating the velocities.
Any help would be very much appreciated, thank you.



May I please know the source from which I can implement the same? I am also trying to retrieve the encoder values but haven't found anything useful so far. Regards

Ayush Sharma ( 2017-04-29 03:10:36 -0500 )

Could you please share the whole code for the encoders? I am really stuck and can't seem to find any solution...

akay97 ( 2017-05-20 02:59:56 -0500 )

Could I ask where you are getting the equations for your v and theta?

PG_GrantDare ( 2018-11-13 20:50:36 -0500 )

@dottant why have you multiplied vx and vth by 10?

sajal ( 2019-05-30 07:50:28 -0500 )

This topic is old; there's no need to do that any more. The formula was wrong because I had to fix a hardware issue related to the IMU. I completely rewrote the code, introducing PID control for the velocity commands to the robot, which now gives me smooth values and near-perfect odometry (with a little slippage whose effect only becomes visible after a long time). Just remove the *10 and make sure you get correct values from the IMU.

dottant ( 2019-06-03 03:19:26 -0500 )

3 Answers


answered 2016-08-12 13:05:55 -0500 by billy

omega_left and omega_right are already the wheel linear velocities: DistancePerCount converts ticks to metres, so (delta * DistancePerCount) / dt is a linear speed in m/s. You do not need to multiply by the radius in v_left and v_right. That is where the error is.

NOTE: the calculations are fine for a simulation, but on a real robot you want to actually measure the encoder response to real movement. The calculations will not account for slippage, and slippage can be significant depending on the wheels, the robot's weight distribution and the surface.

Spin the robot around 10 times to the right and count the ticks on each wheel, then 10 times to the left, then drive 5 meters straight forward, then 5 meters straight back. And watch the path: if you command it straight forward, does it really go straight? You may need different multipliers for each wheel, or for different directions.



A factor of 10 feels like too much for slippage, but I don't see anything wrong with the code either.

Humpelstilzchen ( 2016-08-15 03:27:56 -0500 )

Agreed that slippage can't account for 10x; I wasn't meaning to suggest it, just pointing out that for actual robots you need to measure. The 10x comes from the error I pointed out in his calculation (the error in the calcs is actually more like 15x, since 1/0.065 ≈ 15.4).

billy ( 2016-08-15 21:14:58 -0500 )

missed that one, +1

Humpelstilzchen ( 2016-08-17 12:30:48 -0500 )

Hi billy,

Do you have an example taking into account the slippage condition?

Thank you.

novak ( 2017-03-13 08:43:14 -0500 )

The term slippage captures the idea that wheel-based motion estimates are going to be inaccurate. If slippage were predictable, you could account for it in your motion calculations; so I'll say there isn't a way to do it in code. - continued next comment

billy ( 2017-03-13 12:44:35 -0500 )

Slippage can, however, be mitigated through additional sensors that provide extra position/movement data. In the ROS navigation stack, laser scans are used by the AMCL node to generate a correction that accounts for slippage and other errors. That correction is the map-to-odom transform.

billy ( 2017-03-13 12:48:57 -0500 )

Another sensor that can be used to mitigate slippage is an IMU which can help with rotational errors. Of course there are more such as GPS, cameras, ultrasonic, bumpers that can all be used to make corrections for slippage. Does that help?

billy ( 2017-03-13 12:51:30 -0500 )

Thank you! I was confused because I thought there was a specific solution. In my case I will use odometry and IMU data to obtain an estimated position and fuse them through the robot_localization package to get an odometry/filtered output.

novak ( 2017-03-13 13:47:39 -0500 )
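For reference, a minimal ekf_localization_node configuration for the fusion described above might look like the fragment below. The parameter names come from the robot_localization documentation; the topic names and the choice of which states to fuse are assumptions for illustration only.

```yaml
# Illustrative robot_localization (ekf_localization_node) parameters.
frequency: 30
two_d_mode: true          # planar robot: ignore z, roll, pitch

odom_frame: odom
base_link_frame: base_link
world_frame: odom         # publish odom -> base_link

odom0: /wheel_odom        # assumed wheel-odometry topic name
odom0_config: [false, false, false,   # x, y, z
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # vx, vy, vz  (fuse wheel velocities)
               false, false, true,    # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az

imu0: /imu/data           # assumed IMU topic name
imu0_config: [false, false, false,
              false, false, true,     # fuse absolute yaw from the IMU
              false, false, false,
              false, false, true,     # and yaw rate
              true,  false, false]    # and x acceleration
```

Fusing velocities from the wheels and orientation from the IMU is a common split, since wheel yaw estimates drift badly with slippage.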

answered 2018-07-18 06:55:42 -0500 by dottant

This is really old.
Thanks a lot for the answers, but I fixed it more than a year ago and now my robot can move autonomously without problems.




What was your solution?

jayess ( 2018-07-18 09:28:44 -0500 )

I had to fix a hardware issue related to the IMU, and I completely rewrote the code, introducing PID control for the velocity commands to the robot, which now gives me smooth values and near-perfect odometry (with a little slippage whose effect only becomes visible after a long time).

dottant ( 2018-07-19 13:39:29 -0500 )

Do you mind sharing your code? I'm hitting a bug in mine.

huang27c ( 2018-07-24 13:33:49 -0500 )

Hello sir! How did you accomplish wheel odometry so successfully? Please share your method and code as a guideline for my project, and recommend some authentic sources for wheel odometry and visual odometry. Thanks in advance.

m rehman ( 2018-09-18 15:39:52 -0500 )

Do you have the Arduino encoder code you use alongside this code for a real application? If yes, could you please share it? @huang27c @dottant

Mekateng ( 2020-02-24 14:44:06 -0500 )

answered 2017-03-10 00:16:01 -0500 by Veera Ragav

updated 2017-03-10 00:17:54 -0500

Instead of finding omega_left and omega_right to calculate v_left and v_right, you can calculate v_left and v_right directly from the tick deltas:

        double DistancePerCount = (2 * 3.14159265 * 0.065) / 2626; // (2*PI*r)/ppr
        double lengthBetweenTwoWheels = 0.25;

        deltaLeft = tick_x - _PreviousLeftEncoderCounts;
        deltaRight = tick_y - _PreviousRightEncoderCounts;

        v_left = (deltaLeft * DistancePerCount) / (current_time - last_time).toSec();
        v_right = (deltaRight * DistancePerCount) / (current_time - last_time).toSec();


Asked: 2016-08-12 01:57:28 -0500

Seen: 36,988 times

Last updated: Oct 12 '21