Avoiding obstacles with IR and ultrasonic sensors in addition to laser data
I have a TurtleBot3 with an IMU, a LIDAR, and ultrasonic/infrared sensors. I want to build a map autonomously and then navigate on that map while avoiding obstacles that the LIDAR has trouble seeing, such as glass and mirrored metal chair legs.
1) First question. To start, I want to build the map autonomously. Should the ultrasonic sensors be used at this stage, i.e. for mapping? Should I feed sensor_msgs/Range to mapping packages such as gmapping or cartographer, or is that data only useful for move_base?
2) Second question. In the diagram on the right of http://wiki.ros.org/move_base?action=... I see the sensor topics listed as sensor_msgs/LaserScan and sensor_msgs/PointCloud, so I don't understand how I am supposed to give sensor_msgs/Range data to the move_base package.
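One workaround I was considering (not sure if it is the right approach) is a small node that republishes each sensor_msgs/Range reading as a one-point PointCloud2 in the sensor's own frame, so it could be listed as a normal PointCloud2 observation source in the costmap. This is just a rough, untested sketch; the topic names are placeholders from my setup:

```python
#!/usr/bin/env python
# Rough sketch: republish each sensor_msgs/Range reading as a one-point
# PointCloud2 in the sensor's own frame, so it can be listed as a
# PointCloud2 observation source in the move_base costmaps.
# Topic names (/ultrasonic, /ultrasonic_cloud) are just placeholders.
import rospy
from sensor_msgs.msg import Range, PointCloud2
import sensor_msgs.point_cloud2 as pc2

pub = None

def range_cb(msg):
    # Drop readings outside the sensor's valid interval.
    if msg.range < msg.min_range or msg.range > msg.max_range:
        return
    # By convention a Range measurement is along the sensor's +x axis,
    # so publish a single point at (range, 0, 0) in the sensor frame.
    cloud = pc2.create_cloud_xyz32(msg.header, [(msg.range, 0.0, 0.0)])
    pub.publish(cloud)

if __name__ == '__main__':
    rospy.init_node('range_to_pointcloud')
    pub = rospy.Publisher('/ultrasonic_cloud', PointCloud2, queue_size=10)
    rospy.Subscriber('/ultrasonic', Range, range_cb)
    rospy.spin()
```

But this ignores the field_of_view cone of the ultrasonic sensor and only marks a single point, so I'm not sure it is the proper way, and I would prefer a standard solution if one exists.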
Are there any detailed tutorials or examples on how to use data from laser and ultrasonic/IR sensors together for obstacle avoidance?