In short, just don't put that node in your launch file.

Assuming you already have your robot working successfully in Gazebo, the purpose of this launch file may already be covered. That file is just a very general template; it's up to you to make sure the basic functionality of your robot is in place (see Robot Setup) before trying navigation. The main requirements are:

  1. Something publishes sensor data.
  2. Something publishes odometry.
  3. Something publishes the necessary transforms.
  4. Something listens to Twist messages and moves your robot.
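For intuition about items 2 and 4, here's a minimal sketch (plain Python, no ROS dependencies; the function name is illustrative, not any particular plugin's API) of how a differential-drive controller typically turns Twist commands (linear.x, angular.z) into integrated 2D odometry:

```python
import math

def integrate_twist(pose, linear_x, angular_z, dt):
    """Dead-reckon a 2D pose (x, y, theta) forward by one time step.

    This mirrors what a differential-drive plugin does internally when
    it turns Twist commands (or wheel feedback) into odometry.
    """
    x, y, theta = pose
    x += linear_x * math.cos(theta) * dt
    y += linear_x * math.sin(theta) * dt
    theta += angular_z * dt
    return (x, y, theta)

# Drive straight along +x for 1 s at 0.5 m/s in 10 ms steps:
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_twist(pose, 0.5, 0.0, 0.01)
print(pose)  # roughly (0.5, 0.0, 0.0)
```

In Gazebo, the differential-drive plugin does this bookkeeping for you and publishes the result as nav_msgs/Odometry.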

If your simulation launch file provides these things, you're all set. As you mentioned, Gazebo plugins can handle your sensors. Gazebo can also handle the odometry and controller. The transforms are usually handled by a robot_state_publisher node, which may be in your Gazebo launch file as well.
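If you need to add it yourself, a typical ROS 1 launch fragment looks like this (assuming your URDF is already loaded on the robot_description parameter and joint states are being published, e.g. by Gazebo):

```xml
<launch>
  <!-- Publishes the TF tree for every link in the URDF found on the
       robot_description parameter, using joint_states as input. -->
  <node pkg="robot_state_publisher" type="robot_state_publisher"
        name="robot_state_publisher" />
</launch>
```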

On a side note, if you're using just a camera, you'll need to do some extra work to get the data navigation needs, since the Gazebo camera sensor plugin publishes sensor_msgs/Image messages. From Sensor Information:

The navigation stack uses information from sensors to avoid obstacles in the world, it assumes that these sensors are publishing either sensor_msgs/LaserScan or sensor_msgs/PointCloud messages over ROS.
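If your camera is a depth camera, packages such as depthimage_to_laserscan exist for exactly this conversion. To show the geometry involved (not that package's API), here's a plain-Python sketch assuming a pinhole camera model with toy numbers: for pixel column u, the ray leaves the camera at angle theta = atan((u - cx) / fx), and the laser-style range along that ray is depth / cos(theta).

```python
import math

def depth_row_to_ranges(depth_row, fx, cx):
    """Convert one row of a depth image (depth in meters along the
    optical axis) into laser-style ranges, pinhole camera model."""
    ranges = []
    for u, depth in enumerate(depth_row):
        theta = math.atan((u - cx) / fx)   # bearing of this pixel's ray
        ranges.append(depth / math.cos(theta))  # distance along the ray
    return ranges

# A flat wall 2 m in front of a toy 5-pixel camera (fx=2, cx=2):
ranges = depth_row_to_ranges([2.0] * 5, fx=2.0, cx=2.0)
print([round(r, 3) for r in ranges])
```

The center pixel reports 2.0 m, while the edge pixels report longer ranges because their rays hit the wall obliquely. A regular (non-depth) camera has no range information at all, so for it you'd need a vision-based approach to produce LaserScan or PointCloud data.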