Numerous navigation packages questions (including videos)

asked 2017-08-12 12:28:17 -0600

StevenCoral

Hello all. As background to my questions, I first want to share my project in a few sentences; I hope you find it interesting. I will also share the code once I feel it is free of uncertainties.

I was fascinated by the fact that the Neato XV is easily hackable, but didn't like being bound to the commands of its built-in motherboard. So, being a control engineer, I decided to write my own robot driver just to be able to "turtlebot" the platform. There's much to it, and this came out a little long, but here's the main juice:

I started by creating a full URDF/SDF, roughly measuring the platform's dimensions, wheel separation and diameter, and LIDAR position, and I also added a Kinect on top for fun. It is an intentionally low-quality 3D design; I didn't want to waste time on it. Along with it I've made a Gazebo world (the Willow Garage office), an RViz configuration, and a tailored navigation stack bundle.
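(For anyone doing the same, a quick way to sanity-check such rough measurements is to parse the URDF and print the joint origins; a minimal sketch, assuming the urdf_parser_py package is installed and with neato.urdf as a placeholder file name:)

```python
#!/usr/bin/env python
# Sanity check of a hand-measured URDF: print every joint's name,
# type, and origin so rough measurements (wheel separation, LIDAR
# position) can be eyeballed against the real robot.
# "neato.urdf" is a placeholder path, not my actual file.
from urdf_parser_py.urdf import URDF

robot = URDF.from_xml_file('neato.urdf')
for joint in robot.joints:
    xyz = joint.origin.xyz if joint.origin else None
    print('%-25s %-10s origin xyz: %s' % (joint.name, joint.type, xyz))
```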

After the POC worked out fine, I took the Neato apart and virtually emptied it of almost everything that wasn't crucial to moving the platform around. I mapped all the pins from the motors and encoders (and the LIDAR and almost everything else), wrote a Python utility for the Raspberry Pi that reads all kinds of encoders using interrupts and accepts PPM signals from RC receivers, and wrote a full differential-drive controller for the Neato XV (not exactly cutting-edge, but again, I just wanted to see it work; I have plans to migrate this into a general ROS/standalone codebase for different platform types). The Neato can then drive around using either RC or the cmd_vel topic, selected by an auxiliary channel on the RC.
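(To give an idea of the core of such a driver, here is a minimal sketch of the cmd_vel-to-wheel-speed conversion. The kinematics are the standard differential-drive equations; the wheel separation/radius values and the set_wheel_speeds stub are placeholders, not my actual numbers or motor interface:)

```python
#!/usr/bin/env python
# Minimal differential-drive sketch: converts Twist messages on
# cmd_vel into left/right wheel angular velocities.
import rospy
from geometry_msgs.msg import Twist

WHEEL_SEPARATION = 0.24  # meters (placeholder, measure your platform)
WHEEL_RADIUS = 0.035     # meters (placeholder)

def set_wheel_speeds(left, right):
    # Stand-in for the actual motor interface (PWM via RPi GPIO, etc.)
    rospy.loginfo('left: %.2f rad/s  right: %.2f rad/s', left, right)

def cmd_vel_cb(msg):
    v = msg.linear.x   # forward velocity, m/s
    w = msg.angular.z  # yaw rate, rad/s
    # Standard differential-drive inverse kinematics
    left = (v - w * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    right = (v + w * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    set_wheel_speeds(left, right)

if __name__ == '__main__':
    rospy.init_node('diff_drive_sketch')
    rospy.Subscriber('cmd_vel', Twist, cmd_vel_cb)
    rospy.spin()
```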

In conjunction with my code, I've used the node shown here to publish the laser scan topic (thanks and credits to the author; it works great). I did need to write another node that uses the "rpms" topic to keep the LIDAR spinning at the correct velocity, because mine was probably too dusty to operate smoothly (it's a second-hand vacuum cleaner I got online).
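(That rpm-correction node boils down to a simple feedback loop on the rpms topic. A minimal proportional sketch follows; the UInt16 message type matches the common XV-11 laser driver, but the PWM pin, gain, and target speed are all placeholders to be tuned on hardware:)

```python
#!/usr/bin/env python
# Minimal proportional controller keeping the LIDAR motor near its
# target speed using the rpms feedback topic.
import rospy
from std_msgs.msg import UInt16
import RPi.GPIO as GPIO

TARGET_RPM = 300.0  # nominal XV-11 LIDAR speed (placeholder target)
KP = 0.05           # proportional gain (placeholder, tune on hardware)
MOTOR_PIN = 18      # placeholder GPIO pin driving the LIDAR motor

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOTOR_PIN, GPIO.OUT)
pwm = GPIO.PWM(MOTOR_PIN, 1000)  # 1 kHz PWM
duty = 50.0                      # initial duty cycle, percent
pwm.start(duty)

def rpm_cb(msg):
    global duty
    error = TARGET_RPM - msg.data
    duty = max(0.0, min(100.0, duty + KP * error))
    pwm.ChangeDutyCycle(duty)

if __name__ == '__main__':
    rospy.init_node('lidar_rpm_control')
    rospy.Subscriber('rpms', UInt16, rpm_cb)
    rospy.spin()
```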

So first, you may watch the video of the simulation. I'll just point out that it was made for people with zero familiarity with robotics. I would appreciate any comments, and even more so professional ones (as in, you may have seen something I missed, since I am an end user of the navigation stack rather than someone who knows its code).

After this, I measured my living room and made a map of it by hand. gmapping did not perform flawlessly when I tried it with turtlebot_gazebo, and my own robot has its own problems, mainly with dead-reckoning accuracy (a bit of a chicken-and-egg problem: I want to localize better using a map, but I need reasonable localization in order to create the map with the robot). Anyway, the same simulation in the "living room" world that I also built in Gazebo went well too.
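(In case anyone wants to make a map by hand too: for map_server it is just a PGM image plus a YAML descriptor. A minimal sketch that writes an empty rectangular room from measured dimensions; the sizes and file names here are made up:)

```python
#!/usr/bin/env python
# Writes a minimal hand-made map for map_server: a PGM occupancy
# image (white = free, black = occupied) plus its YAML descriptor.
ROOM_W, ROOM_H = 5.0, 4.0  # meters (placeholder measurements)
RES = 0.05                 # meters per pixel

w, h = int(ROOM_W / RES), int(ROOM_H / RES)
with open('living_room.pgm', 'wb') as f:
    f.write(('P5\n%d %d\n255\n' % (w, h)).encode('ascii'))
    for y in range(h):
        row = [0 if (x == 0 or y == 0 or x == w - 1 or y == h - 1)
               else 254 for x in range(w)]  # black walls, white interior
        f.write(bytearray(row))

with open('living_room.yaml', 'w') as f:
    f.write('image: living_room.pgm\n'
            'resolution: %.3f\n'
            'origin: [0.0, 0.0, 0.0]\n'
            'negate: 0\n'
            'occupied_thresh: 0.65\n'
            'free_thresh: 0.196\n' % RES)
```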

Now is ...


Comments

3.) is strange. The darker color looks like your local costmap, the lighter one like the global costmap. I have no idea why the local costmap was not initialized in the first 55 s. Might it be a timing or bandwidth problem?

Humpelstilzchen ( 2017-08-13 03:39:27 -0600 )

For 1.) we can clearly see "bad" odometry at work here. I recommend calibrating the odometry a bit more for the rotation part. AMCL usually applies only small corrections; I think you can see them as little jumps in the video.

Humpelstilzchen ( 2017-08-13 03:44:41 -0600 )
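(To illustrate the rotation calibration suggested above: in differential-drive dead reckoning, the heading update depends on the effective wheel separation, so a single correction gain on the rotation term often removes most of the drift. A minimal sketch; the gain value is purely illustrative and must be fit on hardware, e.g. by commanding full turns and comparing the reported heading:)

```python
import math

WHEEL_SEPARATION = 0.24  # meters (placeholder)
ROT_GAIN = 1.03          # rotation calibration factor (illustrative)

def odom_update(x, y, theta, d_left, d_right):
    """One dead-reckoning step from left/right wheel displacements."""
    d_center = (d_left + d_right) / 2.0
    d_theta = ROT_GAIN * (d_right - d_left) / WHEEL_SEPARATION
    # Midpoint integration of the pose
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```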

Also try tuning the odom_alpha parameters of amcl so that it trusts the odometry a little less. Nice work and an awesome question, btw.

Humpelstilzchen ( 2017-08-13 03:45:21 -0600 )

Many thanks, I'll try that. It's just gonna take some time to clear the room again :D

StevenCoral ( 2017-08-13 12:04:01 -0600 )