
Robot navigation using ultrasound and IR sensors

asked 2015-03-24 13:47:55 -0600

Naman

updated 2015-03-25 08:30:51 -0600

Hi all,

I have a mobile robot with multiple ultrasound and IR sensors on the front, sides and back. It also has rotational encoders to generate odometry data, and an IMU/GPS for localization. I would like the robot to navigate around the building with a given map using these sensors only (I don't have a laser sensor). My question is how to interpret the data from the ultrasound and IR sensors and use it for navigation. I would appreciate it if someone could point me in the right direction and tell me about existing ROS packages, relevant links and other resources that would be helpful for the above problem.
I know this is a very general question, and I am not looking for specific answers, just pointers to good sources from people who have worked on similar problems before.

Edit 1: One more thing: is it possible to improve localization using proximity sensors (something similar to amcl, which uses laser scans)? Maybe generate fake laser scan data from the proximity sensors and then use amcl to improve the localization of the robot, or maybe some other way. Does anyone have any insights on this?
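To make the fake-scan idea concrete, here is a rough sketch of how a handful of sonar/IR readings could be spread into a laser-scan-shaped array (all function and parameter names here are made up for illustration, not from an existing package). Each sonar covers a wide cone, so its single range value is copied into every scan bin that falls inside that cone; the resulting array could then be published as a `sensor_msgs/LaserScan` for amcl to consume, with the caveat that sonar beams are far coarser than a real laser:

```python
import math

def sonar_to_fake_scan(readings, angle_min=-math.pi / 2, angle_max=math.pi / 2,
                       num_beams=181, max_range=5.0):
    """Spread each wide sonar/IR cone across the fake scan bins it covers.

    readings: list of (mount_angle_rad, beam_width_rad, measured_range_m)
              tuples, one per proximity sensor, in the robot frame.
    Returns a list of num_beams ranges; bins not covered by any sensor
    are filled with max_range (i.e. "no obstacle seen").
    """
    inc = (angle_max - angle_min) / (num_beams - 1)
    scan = [max_range] * num_beams
    for mount_angle, beam_width, rng in readings:
        for i in range(num_beams):
            beam_angle = angle_min + i * inc
            if abs(beam_angle - mount_angle) <= beam_width / 2.0:
                # where cones overlap, keep the closest obstacle
                scan[i] = min(scan[i], min(rng, max_range))
    return scan

# one forward-facing sonar with a 30-degree cone seeing a wall at 1.2 m
fake_scan = sonar_to_fake_scan([(0.0, math.radians(30), 1.2)])
```

Note that amcl's beam model assumes narrow beams, so a fake scan built this way may need an inflated measurement noise setting to work at all.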

Thanks in advance.


1 Answer


answered 2015-03-24 14:45:48 -0600

aak2166

The problem you are describing is one of robot localization. See the amcl and robot_localization packages. The latter supports multiple sensor types, while I believe amcl relies on laser scan data. Localization usually happens concurrently with mapping in the SLAM problem (Simultaneous Localization and Mapping), and laser scan/GPS/IMU sensors are very useful there.

This problem will be easier if you can guarantee a known start position on your map and are able to collect odometry data from the robot. The sensors you describe are difficult to use for this problem. A laser scanner can be matched against the map reliably, and that match can then be compared with how far the robot has moved according to its odometry data and its GPS/IMU readings. Taken together, these give a much better idea of where the robot is.

In your case it almost seems easier to use dead reckoning for navigation and those sensors for obstacle detection and avoidance. Hope this helps.
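As a sketch of the dead-reckoning part, here is the standard differential-drive odometry update from wheel encoder distances (the function name and signature are illustrative, not from a ROS package; per-wheel distances would come from encoder ticks times metres-per-tick):

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckoning pose update for a differential-drive robot.

    d_left, d_right: distance each wheel travelled since the last update (m),
                     computed from encoder ticks.
    wheel_base:      distance between the two wheels (m).
    Returns the new (x, y, theta) with theta wrapped to (-pi, pi].
    """
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    # integrate along the heading at the midpoint of the turn
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta + math.pi) % (2 * math.pi) - math.pi
    return x, y, theta
```

In practice this pose would be published as `nav_msgs/Odometry` and the proximity sensors would only be consulted to stop or steer away when something enters a safety distance.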



I already have a map and I can set the initial position of the robot. Also, I can get odometry data from the wheel encoders. For localization, I have an IMU/GPS (I have edited the original question). The problem is how to navigate around the building using only proximity sensors (no laser sensor).

Naman  ( 2015-03-24 15:00:01 -0600 )

OK. One thing to explore for the overall architecture: use robot_localization to take in sensor data and publish updates to pose using tf, with your map set as the world frame. Then implement some motion planning to command the robot. Path planning is where my knowledge breaks down, however.

aak2166  ( 2015-03-24 15:35:46 -0600 )
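robot_localization does this fusion with a full EKF/UKF, but as a toy illustration of what "taking in sensor data and publishing pose updates" means, here is a minimal complementary filter blending an odometry yaw estimate with an IMU yaw reading (this is a simplified stand-in I am naming myself, not robot_localization's actual algorithm):

```python
import math

def fuse_yaw(odom_yaw, imu_yaw, alpha=0.98):
    """Toy complementary filter: weight the IMU heavily, let odometry
    contribute the rest. robot_localization uses a full Kalman filter;
    this only illustrates the idea of weighting two noisy sources.
    """
    # blend on the unit circle so angle wrap-around is handled correctly
    s = alpha * math.sin(imu_yaw) + (1 - alpha) * math.sin(odom_yaw)
    c = alpha * math.cos(imu_yaw) + (1 - alpha) * math.cos(odom_yaw)
    return math.atan2(s, c)
```

The fused estimate ends up close to the IMU value but is nudged by odometry, which is the qualitative behaviour you would then expose to the rest of the stack as a tf transform from map to base_link.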

Also, you would have to configure all of these tools to work for your robot, i.e. convert your map into a costmap_2d, etc.

aak2166  ( 2015-03-24 15:37:50 -0600 )
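Feeding proximity readings into a costmap boils down to projecting each range hit into the grid frame and marking that cell occupied. A minimal sketch (names and the 100-means-occupied convention follow costmap_2d/OccupancyGrid usage, but this function itself is hypothetical):

```python
import math

def mark_obstacle(grid, resolution, origin_x, origin_y,
                  robot_x, robot_y, robot_theta, sensor_angle, rng):
    """Mark the cell hit by one range reading in a 2-D occupancy grid.

    grid:        list of rows (row-major), values 0..100 as in OccupancyGrid.
    resolution:  metres per cell; origin_*: world coords of cell (0, 0).
    sensor_angle: mount angle of the sensor relative to the robot heading.
    Returns the (row, col) that was marked (possibly out of bounds).
    """
    # project the hit point into the world frame
    hit_x = robot_x + rng * math.cos(robot_theta + sensor_angle)
    hit_y = robot_y + rng * math.sin(robot_theta + sensor_angle)
    col = int((hit_x - origin_x) / resolution)
    row = int((hit_y - origin_y) / resolution)
    if 0 <= row < len(grid) and 0 <= col < len(grid[0]):
        grid[row][col] = 100  # occupied, costmap convention
    return row, col
```

A real costmap layer would also clear the cells along the ray up to the hit, and would widen the mark to cover the sonar cone rather than a single cell.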
