
1) Usually mapping packages only take laser information into account, and it is commonplace to ignore ultrasonic/infrared sensors at this stage. If you want the obstacles detected by your ultrasonic/infrared sensors to appear on the map, then you need to feed them into the mapping process. For instance, cartographer accepts laser scans, IMU data and point clouds as input, so you could convert your range messages to point clouds and pass them to cartographer, as sketched below.
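A minimal sketch of such a conversion, assuming a sensor_msgs/Range topic named /ultrasonic/range and an output topic /ultrasonic/points (both names are placeholders for your own setup). It republishes each valid reading as a one-point sensor_msgs/PointCloud2 in the sensor's own frame:

```python
#!/usr/bin/env python
# Sketch: republish sensor_msgs/Range readings as sensor_msgs/PointCloud2
# so a mapper that consumes point clouds can use them.
# Topic names are placeholders - adapt them to your robot.
import rospy
from sensor_msgs.msg import Range, PointCloud2
import sensor_msgs.point_cloud2 as pc2
from std_msgs.msg import Header


class RangeToCloud(object):
    def __init__(self):
        self.pub = rospy.Publisher("/ultrasonic/points", PointCloud2, queue_size=10)
        rospy.Subscriber("/ultrasonic/range", Range, self.callback)

    def callback(self, msg):
        # Drop readings outside the sensor's valid interval.
        if not (msg.min_range <= msg.range <= msg.max_range):
            return
        header = Header(stamp=msg.header.stamp, frame_id=msg.header.frame_id)
        # Represent the reading as a single point along the sensor's x axis.
        cloud = pc2.create_cloud_xyz32(header, [(msg.range, 0.0, 0.0)])
        self.pub.publish(cloud)


if __name__ == "__main__":
    rospy.init_node("range_to_cloud")
    RangeToCloud()
    rospy.spin()
```

The resulting topic can then be remapped to one of cartographer's point cloud inputs. Keep in mind that a single point per reading is a very coarse approximation of a sonar/IR cone, so treat this only as a starting point.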

2) The image that you mention is a bit old; move_base uses costmap_2d to generate the global and local costmaps. costmap_2d can accept many different types of data sources, and luckily there is a plugin for range sensors that you can use: range_sensor_layer. See the configuration sketch below.
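For illustration, a costmap_2d plugin list along these lines should do it (the sonar_layer name and the /ultrasonic/range topic are placeholders; check the range_sensor_layer wiki page for the full parameter list):

```yaml
# Sketch of a costmap configuration with a range sensor layer added.
plugins:
  - {name: obstacle_layer,  type: "costmap_2d::ObstacleLayer"}          # laser obstacles
  - {name: sonar_layer,     type: "range_sensor_layer::RangeSensorLayer"}
  - {name: inflation_layer, type: "costmap_2d::InflationLayer"}

sonar_layer:
  topics: ["/ultrasonic/range"]   # one entry per sensor_msgs/Range topic
  no_readings_timeout: 1.0        # warn if no readings arrive for this long (s)
  clear_on_max_reading: true      # treat max-range readings as free space
```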

I don't know of a detailed tutorial on using laser and ultrasonic/IR sensors together, but if you can get your system working with the laser, it is quite straightforward to add the range sensors on top.