
Navigation Stack with omni-wheel Robot

asked 2015-02-08 01:36:18 -0600

karamba

updated 2015-02-08 01:40:39 -0600

Hi everybody,

I worked through the navigation stack tutorial yesterday and now I am wondering how to get the stack running on my 3-wheeled omni-wheel robot, which is equipped with an Arduino Nano, an old laptop, and an MS Kinect. The ROS stack is already running, as well as freenect. (It may look a bit like a Robotino.)

It would be great if you could give me some hints on how to get it running. I am especially confused about how to set up the 3 omni-wheel control.

Thanks very much in advance! Karamba


2 Answers


answered 2015-02-21 15:21:24 -0600

paulbovbel

updated 2015-02-23 07:25:00 -0600

I'm assuming that by "ROS stack running" you mean you have a base driver that takes in cmd_vel and spits out odom and tf?
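If you don't have that yet, here is a minimal sketch of what such a base driver might look like in rospy. It only dead-reckons from the commanded velocity; the actual Arduino I/O (wheel commands out, encoder ticks in) is left as a placeholder, and the topic and frame names are just the common defaults, not anything from your setup:

    #!/usr/bin/env python
    # Minimal base driver sketch: subscribes to cmd_vel, publishes odom and tf.
    # Hardware I/O (sending wheel commands, reading encoders) is a placeholder.
    import math
    import rospy
    import tf
    from geometry_msgs.msg import Twist
    from nav_msgs.msg import Odometry

    class OmniBaseDriver(object):
        def __init__(self):
            self.cmd = Twist()
            self.x = self.y = self.th = 0.0
            self.odom_pub = rospy.Publisher('odom', Odometry, queue_size=10)
            self.tf_bc = tf.TransformBroadcaster()
            rospy.Subscriber('cmd_vel', Twist, self.cmd_cb)

        def cmd_cb(self, msg):
            self.cmd = msg  # latest velocity request from the navigation stack

        def update(self, dt):
            # TODO: send wheel commands to the Arduino and read encoders back.
            # Here we just dead-reckon from the commanded velocity as a placeholder.
            vx, vy, wz = self.cmd.linear.x, self.cmd.linear.y, self.cmd.angular.z
            self.x += (vx * math.cos(self.th) - vy * math.sin(self.th)) * dt
            self.y += (vx * math.sin(self.th) + vy * math.cos(self.th)) * dt
            self.th += wz * dt

            q = tf.transformations.quaternion_from_euler(0, 0, self.th)
            now = rospy.Time.now()
            self.tf_bc.sendTransform((self.x, self.y, 0), q, now, 'base_link', 'odom')

            odom = Odometry()
            odom.header.stamp = now
            odom.header.frame_id = 'odom'
            odom.child_frame_id = 'base_link'
            odom.pose.pose.position.x = self.x
            odom.pose.pose.position.y = self.y
            odom.pose.pose.orientation.x = q[0]
            odom.pose.pose.orientation.y = q[1]
            odom.pose.pose.orientation.z = q[2]
            odom.pose.pose.orientation.w = q[3]
            odom.twist.twist = self.cmd
            self.odom_pub.publish(odom)

    if __name__ == '__main__':
        rospy.init_node('omni_base_driver')
        driver = OmniBaseDriver()
        rate = rospy.Rate(20)
        while not rospy.is_shutdown():
            driver.update(1.0 / 20)
            rate.sleep()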

For converting Kinect data into laserscans for navigation, take a look at https://github.com/ros-perception/per... . For Kinect, you'll have to use openni/freenect instead of openni2.

There's some sample navigation stuff for another omni-wheel robot here: https://github.com/paulbovbel/nav2_pl...

The important distinction is that your robot is holonomic, so you can give it an x, y, theta move command. A diff-drive robot, on the other hand, would only take x and theta.
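To make that concrete for a 3 omni-wheel base: the driver has to map that (x, y, theta) velocity command onto three wheel speeds. Here is a rough sketch of the usual kinematics, assuming the wheels sit 120° apart at distance L from the center and drive tangentially; the exact angles, signs, and dimensions depend on your robot, so treat the numbers as placeholders:

    import math

    # Geometry assumptions (not from the question): three omni wheels mounted
    # 120 degrees apart, at distance L from the robot center, wheel radius R,
    # with rollers oriented so each wheel drives tangentially.
    WHEEL_ANGLES = [math.radians(a) for a in (0.0, 120.0, 240.0)]
    L = 0.15  # meters, center-to-wheel distance (example value)
    R = 0.03  # meters, wheel radius (example value)

    def cmd_vel_to_wheel_speeds(vx, vy, wz):
        """Map a holonomic (vx, vy, wz) command to three wheel angular velocities [rad/s]."""
        speeds = []
        for a in WHEEL_ANGLES:
            v = -math.sin(a) * vx + math.cos(a) * vy + L * wz  # linear speed at the wheel
            speeds.append(v / R)                               # convert to wheel rotation
        return speeds

    # Example: sideways motion combined with a slow spin
    print(cmd_vel_to_wheel_speeds(0.2, 0.0, 0.1))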

If you're going to reuse the configs there, make sure you tweak the speed/acceleration limits, the footprint, and anything else robot-specific. There are some amcl and gmapping demo bringups in https://github.com/paulbovbel/nav2_pl... as well.


Comments

Hi, I'm also trying to get the navigation stack working on a 3-wheel omni robot. Do you actually have to write a node that translates changes in motor encoder positions into an odometry message?

Cyril Jourdan (2018-11-22 09:16:01 -0600)

answered 2021-03-14 01:32:13 -0600

updated 2021-03-14 07:04:48 -0600

If you are planning to do it using only encoders, then you can refer to this approach. You can use Gazebo for getting odom by supplying real-world data to Gazebo through controllers in ROS. There are some ready-made repos for the simulation; once it is set up, you can take the real-world wheel data and publish it to the velocity controllers. That will get you fairly accurate data on the topic publishing the odom of the robot in Gazebo. https://youtu.be/ihuSkFIn7YU
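As a rough sketch of that last step (publishing the wheel velocities measured on the real robot to the Gazebo model's velocity controllers), assuming controllers of type velocity_controllers/JointVelocityController with hypothetical names:

    #!/usr/bin/env python
    # Sketch: forward wheel velocities measured on the real robot to the
    # velocity controllers of a Gazebo model. Controller names are hypothetical.
    import rospy
    from std_msgs.msg import Float64

    rospy.init_node('real_to_gazebo_bridge')
    pubs = [rospy.Publisher('/omni_bot/wheel_%d_velocity_controller/command' % i,
                            Float64, queue_size=1) for i in (1, 2, 3)]

    def read_encoder_velocities():
        # Placeholder: read the three wheel velocities [rad/s] from the Arduino.
        return [0.0, 0.0, 0.0]

    rate = rospy.Rate(50)
    while not rospy.is_shutdown():
        for pub, w in zip(pubs, read_encoder_velocities()):
            pub.publish(Float64(w))
        rate.sleep()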


Comments

Please do not post link-only answers.

gvdhoorn (2021-03-14 01:42:33 -0600)

Is it ok now?

Abhishek Ove (2021-03-14 04:27:28 -0600)

Well, if the video ever disappears, would future readers still be able to understand what they should do? As in: are there clear steps to follow which show "how to setup the 3 omni-wheel control"?

gvdhoorn (2021-03-14 06:43:53 -0600)

Is it better?

Abhishek Ove (2021-03-14 07:05:22 -0600)

I have about 9 years of experience with ROS, but I don't really understand what you're trying to say.

what does:

You can use Gazebo for getting odom by supplying real-world data to Gazebo through controllers in ROS.

mean exactly? Gazebo is a simulator; how would that use "real world data through controllers"?

gvdhoorn (2021-03-14 10:03:37 -0600)
