
Creating the map with only Lidar

asked 2021-12-08 01:33:40 -0500

updated 2021-12-08 02:04:26 -0500

I am using ROS Noetic on Ubuntu 20.04. Lidar hardware: rplidar_a3m1

  • I installed the slam_toolbox package via sudo apt install ros-noetic-slam-toolbox and it is working fine
  • I installed the rplidar driver from the rplidar git repository

I want to create an indoor map using slam_toolbox with lidar ONLY. I do not have wheel odometry.

First I do:

  • roslaunch rplidar_ros rplidar_a3.launch This starts publishing on the /scan topic.
  • roslaunch slam_toolbox online_sync.launch This launches slam_toolbox with the default config.

Initially, I encountered the issue map did not grow after its initialization #438

Then, I provided a dynamic frame as per the tutorial Broadcasting a moving frame. My tf publisher looks like this.

But I see that my map is not generated correctly. It is moving in a circle (which I guess is due to the dynamic tf I provided). The circle gets smaller if I reduce the amplitude. I do not know the correct way of providing the dynamic tf between frames when I want to move the sensor by hand and create a map.
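For a hand-held sensor, the odom→base_link transform needs to carry an actual motion estimate (for example, from scan matching), not a synthetic trajectory: a hard-coded sinusoidal tf tells SLAM the sensor is travelling in a circle, which is exactly what the map then shows. As a minimal sketch of the alternative (plain Python, not ROS; in a real node the resulting pose would be broadcast with tf2_ros, and the delta values would come from a scan matcher rather than being hard-coded as here), per-scan motion deltas are composed into an accumulated odometry pose:

```python
import math

def compose(pose, delta):
    """Compose an incremental motion (dx, dy, dtheta), expressed in the
    previous sensor frame, onto the accumulated odom pose (x, y, theta)."""
    x, y, th = pose
    dx, dy, dth = delta
    return (x + dx * math.cos(th) - dy * math.sin(th),
            y + dx * math.sin(th) + dy * math.cos(th),
            th + dth)

# Accumulate per-scan motion deltas, as a scan matcher would estimate them.
pose = (0.0, 0.0, 0.0)
for delta in [(0.1, 0.0, 0.0), (0.1, 0.0, math.pi / 2), (0.1, 0.0, 0.0)]:
    pose = compose(pose, delta)
# pose is now (0.2, 0.1, pi/2): forward, turn left, forward again.
```

The key point is that each transform published on tf should reflect where the sensor actually moved since the last scan; any motion you invent (like a sine wave) is taken at face value by the mapper.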

(image: map)

And this is how my tf tree looks when running rosrun tf2_tools view_frames.py:

frames.pdf

Regards,


Comments

Try having a look at hector_slam. It has worked well for me in the past.

WarTurtle  ( 2021-12-08 02:12:03 -0500 )

@WarTurtle Thanks for commenting. Would you recommend hector_slam as a better solution than slam_toolbox? As far as I know, hector_slam does not provide loop-closure ability.

amjack  ( 2021-12-08 04:33:34 -0500 )

I'm not sure that what you want to do is possible. All the SLAM algorithms I've looked at want a halfway accurate estimated robot pose for each scan they record. This value is normally constructed using measurements from one or more sensors.

Mike Scheutzow  ( 2021-12-08 15:29:17 -0500 )

@Mike Scheutzow

I would like to generate the map of an indoor environment. I am not trying to localize the sensor, so I believe odometry is not required. Please correct me if I am missing something.

amjack  ( 2021-12-09 03:27:52 -0500 )

1 Answer


answered 2021-12-08 17:33:43 -0500

Geoff

You need some source of odometry information for the SLAM algorithm to work; laser scans alone are not sufficient.

Fortunately, it is possible (although not as accurate) to calculate an odometry estimate from the laser scans themselves. There are packages that do this, for example laser_scan_matcher.

Use one of these to calculate an odometry estimate and feed that into the TF tree to provide the pose of each laser scan.
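To make the idea concrete, here is a toy 1D stand-in (a hypothetical illustration, not the actual algorithm used by laser_scan_matcher) for what laser-based odometry does: search for the shift that best aligns two consecutive scans, and treat that shift as the motion since the last scan:

```python
def estimate_shift(prev_scan, curr_scan, max_shift=5):
    """Brute-force 1D correlation: find the integer shift that best aligns
    curr_scan with prev_scan (a stand-in for real 2D scan matching)."""
    best_shift, best_err = 0, float("inf")
    n = len(prev_scan)
    for s in range(-max_shift, max_shift + 1):
        err = count = 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:  # only score the overlapping region
                err += (prev_scan[i] - curr_scan[j]) ** 2
                count += 1
        err /= count
        if err < best_err:
            best_err, best_shift = err, s
    return best_shift

# A "wall" feature that moved 3 cells between consecutive scans.
prev_scan = [0] * 10 + [5, 5, 5] + [0] * 10
curr_scan = [0] * 13 + [5, 5, 5] + [0] * 7
# estimate_shift(prev_scan, curr_scan) -> 3
```

In 2D the search is over (dx, dy, dtheta) and is usually refined iteratively (e.g. with ICP), but the principle is the same: align consecutive scans, then integrate the resulting deltas as odometry and feed them into the TF tree.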


Comments

Thanks for your reply, but I suspect laser_scan_matcher is not a maintained repository. However, I tried hector_slam and was able to localize the lidar sensor.

amjack  ( 2021-12-10 05:10:45 -0500 )



Stats

Asked: 2021-12-08 01:33:40 -0500

Seen: 88 times

Last updated: Dec 08 '21