How to get absolute pose of the mobile robot using 2D Lidar and IMU?

asked 2023-03-08 22:43:47 -0500


Hi,

I would like to get the absolute pose of a mobile robot using a 2D Lidar scanning 360 degrees and an IMU. The robot moves around in a semi-dense environment such as a store, where there are walls, shelves, and dynamic objects such as humans. I would like to develop an algorithm to estimate the lateral position and heading angle of the robot. The algorithm will include the following steps:

  1. Recognize the nearby "drivable area" by detecting walls, shelves, etc. a. Note that there will be some moving objects, such as humans. b. In some areas there may not be enough walls or shelves.

  2. Estimate the robot's pose relative to the drivable area's direction. This should include the left and right widths and the heading angle.

  3. Update the robot's estimated pose by fusing the results. a. Note that this method can't estimate the longitudinal position. b. The link map will have information about the desired position in the aisle (visual SLAM, e.g. ORB_SLAM2, could be used here).
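In case it helps to make step 2 concrete, here is a minimal sketch of estimating the heading angle and left/right wall distances from 2D scan points, assuming a straight aisle with walls roughly parallel to the robot's forward axis (all function names and the least-squares line model are my own assumptions, not part of the question):

```python
import math

def fit_line(points):
    # Least-squares fit of y = a*x + b through (x, y) points.
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def aisle_pose(scan_points):
    """Estimate heading relative to the aisle axis and the left/right
    wall distances from 2D scan points in the robot frame (x forward,
    y to the left). Assumes one wall on each side; dynamic objects
    would need to be filtered out beforehand."""
    left = [p for p in scan_points if p[1] > 0]
    right = [p for p in scan_points if p[1] < 0]
    a_l, b_l = fit_line(left)
    a_r, b_r = fit_line(right)
    a = 0.5 * (a_l + a_r)                 # mean wall slope in robot frame
    heading = -math.atan(a)               # robot heading w.r.t. aisle axis
    d_left = abs(b_l) / math.sqrt(1 + a_l ** 2)    # perpendicular distances
    d_right = abs(b_r) / math.sqrt(1 + a_r ** 2)
    return heading, d_left, d_right
```

In a real scan you would first segment the points into wall candidates (e.g. with RANSAC or split-and-merge) and reject clusters that move between frames, which covers the "moving objects" caveat in step 1.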

The development environment is ROS on Ubuntu, and there is no GPU available. I know that the hector_mapping or robot_localization packages can provide pose estimation, but I still need to use my own algorithm. The code can be in Python or C++. Any help or hints on starting the ROS nodes and coding?
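For the fusion in step 3, one simple starting point before reaching for a full EKF is a complementary filter: integrate the IMU yaw rate between scans, then pull the heading toward the lidar-derived measurement. This is only a sketch under my own assumptions (the class name and the gain `alpha` are invented; in a ROS node, `predict` would run in the `/imu` callback and `correct` in the `/scan` callback):

```python
import math

class HeadingFuser:
    """Complementary filter for heading: dead-reckon with the IMU yaw
    rate, then correct with the lidar heading when a scan arrives.
    `alpha` (0..1) is a tuning assumption, not from the question."""

    def __init__(self, alpha=0.1):
        self.alpha = alpha
        self.heading = 0.0

    def predict(self, yaw_rate, dt):
        # Integrate gyro yaw rate over dt, wrapped to (-pi, pi].
        self.heading += yaw_rate * dt
        self.heading = math.atan2(math.sin(self.heading),
                                  math.cos(self.heading))

    def correct(self, lidar_heading):
        # Blend in the lidar measurement via the smallest angular error.
        err = math.atan2(math.sin(lidar_heading - self.heading),
                         math.cos(lidar_heading - self.heading))
        self.heading += self.alpha * err
```

The same predict/correct split is what robot_localization's EKF does internally, so prototyping with this filter first makes it easier to move to that package (or your own EKF) later; the lateral position from the wall distances can be fused the same way.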

Thanks
