# Turtlebot not moving according to input (in simulation and reality)

Hello, I'm using a Turtlebot with Kobuki base and running Ros Hydro on Ubuntu 12.04.

I need the Turtlebot to move in the shape of a square. Guided by these examples: albany, turtlebot motion, and the turtlebot tutorials, I put together two scripts: one that only sends the appropriate cmd_vel messages to make the robot move at a certain speed for a chosen time, and another that uses the odom->base_footprint tf to make the robot move a chosen distance at a certain speed.

I first tested the scripts on turtlesim and they seemed to work well, but when I tested them on the Gazebo turtlebot simulator and on the real Turtlebot, this was the outcome (plotted from the /odom topic in rviz):

Gazebo Simulation - From the code that doesn't use tf:

Gazebo Simulation - From the code that uses the odom->base_footprint tf:

Real Turtlebot - From the code that doesn't use tf:

Real Turtlebot - From the code that uses the odom->base_footprint tf:

From what I've read, the odom->base_footprint tf uses the wheel odometry and IMU data, and the Kobuki base already comes with a calibrated gyro, so I don't think the problem is in the odometry tf. Also, the actual movement of the Turtlebot with the non-tf code looks similar to what the /odom topic shows.

The result on the real Turtlebot is terrible, but I was surprised that the code doesn't work well even in simulation. I don't really know how to improve these results: I expected that using the tf would be the way to improve the first script, yet the tf-based version is the one that performs worst.

Am I misusing the tfs, or am I missing something particular about the Turtlebot and the messages I'm sending? At least with the first script it looks as if the robot doesn't move as far as the commanded speed would indicate, but the tf-based script should be insensitive to that. I read the similar questions in the FAQs, but none of them helped me solve this.

Thank you very much in advance and sorry for the rviz plots, I didn't find another way to get the trajectory of the robot.

The two codes are below:

Code that only sends the appropriate cmd_vel messages to make the robot move at a certain speed for a chosen time:

#!/usr/bin/env python
# http://library.isr.ist.utl.pt/docs/roswiki/rospy%282f%29Overview%282f%29Time.html
""" Example code of how to move a robot around the shape of a square. """

import roslib
import rospy
import math
import time
from geometry_msgs.msg import Twist

class square:
    """ This example is in the form of a class. """

    def __init__(self):
        """ This is the constructor of our class. """
        # register this function to be called on shutdown
        rospy.on_shutdown(self.cleanup)

        # publish to cmd_vel
        self.p = rospy.Publisher('cmd_vel', Twist)
        #twist ...
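The tf-based script isn't fully shown here, but its heading computation boils down to converting the transform's rotation quaternion into a yaw angle. A minimal no-ROS sketch of that conversion (this is the standard Z-axis formula, the same result `tf.transformations.euler_from_quaternion` gives for yaw; the helper name is mine):

```python
import math

def yaw_from_quaternion(x, y, z, w):
    """Yaw (rotation about Z) of a quaternion, for a planar robot."""
    return math.atan2(2.0 * (w * z + x * y),
                      1.0 - 2.0 * (y * y + z * z))

# Quaternion for a 90-degree turn about Z: (0, 0, sin(45 deg), cos(45 deg))
q = (0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
print(math.degrees(yaw_from_quaternion(*q)))  # approximately 90 degrees
```

Getting this conversion wrong (or comparing angles without wrapping them into a common range) is a common source of over- or under-turning in square-driving code.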


What are you plotting as ground truth? I'm pretty sure that's the odometry, and I think it's much better than your control algorithm. Your code is doing open-loop control: it turns toward a target angle and assumes the robot will stop immediately, then drives forward at a known speed or distance and again expects the robot to stop immediately.

Your time-based turns consistently understeer because they don't account for acceleration, and your tf-based turns consistently oversteer because they don't account for deceleration. Likewise, the time-based distances come out similar to or shorter than expected, while the drive-until-distance runs go too far, due to unmodeled acceleration and latency.

To get good path-following performance you should continuously home in on your goal with a feedback controller, correcting the errors that build up as you drive.
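A minimal sketch of what that means, as a 1-D simulation (hypothetical gains and limits, not your robot's): instead of driving for a fixed time, keep commanding a velocity proportional to the remaining error until the error is small.

```python
# Simulated 1-D robot with limited acceleration, homed to a goal by a
# proportional controller. All parameters are illustrative assumptions.

def drive_to(goal, x=0.0, kp=1.0, dt=0.1, a_max=0.5, v=0.0, tol=0.01):
    """Return the final position after closed-loop homing to `goal`."""
    for _ in range(1000):
        err = goal - x
        if abs(err) < tol and abs(v) < tol:
            return x
        v_cmd = kp * err                            # P control on position error
        dv = max(-a_max * dt, min(a_max * dt, v_cmd - v))
        v += dv                                     # speed can't change instantly
        x += v * dt
    return x

print(round(drive_to(1.0), 2))  # settles close to the 1 m goal
```

Because the command shrinks with the remaining error, the controller automatically compensates for acceleration, deceleration, and timing jitter, which is exactly what the open-loop version cannot do.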


I am plotting the output of the /odom topic (published when I run turtlebot_gazebo for the simulation and minimal.launch for the real robot); I thought that topic was giving me the combined IMU + wheel odometry.

( 2015-05-28 08:44:11 -0600 )

And thank you for pointing out the control flaw, I will rethink how I'm using the tf measurements to account for deceleration.

( 2015-05-28 08:46:02 -0600 )

This sounds like a homework assignment (I say that because I had the exact same assignment in my first robotics class), and you are discovering exactly what the assignment is trying to teach you. I'm not going to give you the answer, but here are a few things to think about:

• Is it possible to create a robot that will drive at exactly the velocity you tell it to?
• What happens if there's even a 0.0001% difference between the velocity you expect, and the velocity at which the robot is actually moving? How does that change over time?
• How good is the clock on your computer? What is happening around the condition rospy.get_time() - start < 4? In one iteration it is true, in the next it is false, and a nonzero amount of time passed in between. What did your robot do in that time?
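That last point can be made concrete without ROS: a fixed-rate loop only checks the clock once per cycle, so the robot keeps the last commanded velocity until the first check after the deadline. A small simulation (illustrative speeds and rates, assumed):

```python
# Sketch of the discretization in a `while rospy.get_time() - start < T` loop:
# the deadline check happens once per cycle, and the robot then moves for a
# whole cycle before the next check.

def timed_drive(speed, duration, rate_hz):
    """Distance covered when a fixed-rate loop enforces a time deadline."""
    dt = 1.0 / rate_hz
    n, dist = 0, 0.0
    while n * dt < duration:   # the check only happens every dt seconds
        dist += speed * dt     # a full cycle of motion after each passing check
        n += 1
    return dist

ideal = 0.2 * 4.05                    # 0.81 m if stopping were instantaneous
actual = timed_drive(0.2, 4.05, 10)   # deadline falls between two checks
print(round(actual, 2), "m driven vs", round(ideal, 2), "m intended")
```

The overshoot per leg is bounded by one loop period's worth of motion, and it compounds with the acceleration error on every side of the square.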

Hello, thank you very much for the answer. My main goal is to test how the libfovis and viso2 visual odometry packages respond to different lighting conditions and environments, but for that I wanted a repeatable test, so I thought the Turtlebot was the best choice,

( 2015-05-27 11:12:33 -0600 )

since I can compare the trajectory without odometry, with IMU + wheel odometry, and with the VOs. I thought the clock and loop errors wouldn't matter that much, since they didn't affect turtlesim; maybe it got worse in Gazebo because it is much heavier,

( 2015-05-27 11:17:34 -0600 )

You may be able to adjust things in gazebo to make this work better, but you're still going to run into discretization errors which will prevent things from working perfectly. If you want to run things on a real robot, you'll need something which provides better odometry.

( 2015-05-27 11:22:57 -0600 )

and I thought the cumulative error from the velocity would affect the real Turtlebot but not the simulation. I did expect an improvement when using the tf, because then I no longer depend on time or on the accuracy of the speed, and the odometry seems to work better than what the tf code achieves.

( 2015-05-27 11:23:16 -0600 )

Sorry, I hadn't read the last comment before. I thought the Turtlebot's own odometry might be good enough, since there are examples of people successfully using the odom tf to move along predefined trajectories; so maybe there's something wrong with my hardware instead of my code?

( 2015-05-27 11:30:14 -0600 )

Odometry itself is generally not very good. Can you check if robot_pose_ekf is running? That will integrate odometry and IMU to give you a better estimate, but I still wouldn't trust it too much.

( 2015-05-27 11:35:12 -0600 )

robot_pose_ekf is listed in turtlebot_bringup/package.xml as a run_depend, but the odom_combined->base_footprint frame is not being published; the only one that appears is odom->base_footprint. I assumed the odom->base_footprint transform was coming from the EKF, since I don't have other odometry.

( 2015-05-28 08:33:44 -0600 )