
octomap pointcloud accumulation problem

asked 2018-04-26 08:42:57 -0500 by RoboRos, updated 2018-05-05 09:33:19 -0500

Hello,

I am trying to do mapping using a hokuyo3d lidar. It works fine and produces voxels efficiently.

Problem: when I move the lidar (translation and/or rotation), octomap keeps populating the map in the same place. It accumulates everything in the same region, creating a mess. (I am using an IMU too.)

It seems that octomap is not picking up any transformations. Can someone please explain how to make octomap aware of translations and rotations?

The attached image portrays my problem:

[attached image of the accumulated map]

My launch file looks like:

<node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
    <param name="resolution" value="0.05" />

    <!-- fixed map frame (set to 'map' if SLAM or localization running!) -->
    <param name="frame_id" type="string" value="hokuyo3d_link" />

    <!-- maximum range to integrate (speedup!) -->
    <param name="sensor_model/max_range" value="5.0" />

    <!-- data source to integrate (PointCloud2) -->
    <remap from="cloud_in" to="/hokuyo3d/hokuyo_cloud2" />

</node>

I am using Hector SLAM together with octomap to get the project done, but octomap still seems to be missing whatever it needs to keep track of the trajectory. Here is my tf tree: [attached image of tf tree]


2 Answers


answered 2018-04-26 16:57:24 -0500 by ufr3c_tjc, updated 2018-05-07 16:00:36 -0500

When you move the lidar unit, ROS needs to know where it has been moved to. This is exactly what the transform tells the ROS system: the location and orientation of the link at each point in time. You cannot simply walk around a room with the lidar and create a map in ROS. You need something telling ROS where the lidar is at each point in time. This is the transform's job. I think you would benefit from doing some reading on transforms here.

You could look at a SLAM-based solution, but many of these require odometry from wheel encoders or the like. Hector SLAM doesn't require odometry, but it only works if you move the lidar in a 2D plane. I'm not aware of any 3D pointcloud-based SLAM solutions that don't require odometry, but they might exist. LOAM Velodyne looks promising, but it does require an IMU.
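Before reaching for another SLAM package, it is worth confirming at runtime that a transform from the fixed frame to the sensor frame is actually being published. A quick check using the standard tf tools (assuming the fixed frame is `map` and the sensor frame is `hokuyo3d_link`, as in the question's launch file):

```shell
# Print the map -> lidar transform, updated about once per second.
# If this errors with "could not find a connection between 'map' and
# 'hokuyo3d_link'", octomap_server has no pose information to work with.
rosrun tf tf_echo map hokuyo3d_link

# Alternatively, dump the whole tf tree to frames.pdf and inspect it:
rosrun tf view_frames
```

These commands only verify the tf tree; fixing a missing link still requires a localization node (or a static transform publisher for rigidly mounted frames) to publish it.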


Comments

Hi, thanks for your response. I am already using an IMU. According to the octomap documentation, one needs to have some localization running: https://groups.google.com/forum/#!msg... . I am using Hector SLAM but no luck yet. Can you please guide me?

RoboRos ( 2018-04-28 16:29:54 -0500 )

answered 2018-04-26 16:49:23 -0500 by robotchicken

From the octomap server ROS wiki,

cloud_in (sensor_msgs/PointCloud2) Incoming 3D point cloud for scan integration. You need to remap this topic to your sensor data and provide a tf transform between the sensor data and the static map frame.

This transform is usually obtained from odometry (IMU, wheel encoders, visual odometry, etc.).
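Concretely, with Hector SLAM (or any localization) publishing a transform from the map frame down to the sensor frame, the octomap_server `frame_id` should be the fixed map frame, not the sensor's own frame. A sketch of the corrected launch fragment from the question (the only change is `frame_id`; the topic and other parameters are taken from the original post, and the frame name `map` assumes Hector SLAM's default):

```xml
<node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
    <param name="resolution" value="0.05" />

    <!-- Fixed map frame: use the frame published by the SLAM/localization
         node, NOT the sensor frame, or all scans are integrated in place -->
    <param name="frame_id" type="string" value="map" />

    <!-- maximum range to integrate (speedup!) -->
    <param name="sensor_model/max_range" value="5.0" />

    <!-- data source to integrate (PointCloud2) -->
    <remap from="cloud_in" to="/hokuyo3d/hokuyo_cloud2" />
</node>
```

With `frame_id` set to the sensor link itself, every incoming cloud is inserted relative to the sensor, which is exactly the accumulation-in-place behaviour described in the question.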


Comments

Hi, thanks for your response. I am already using an IMU. According to the octomap documentation, one needs to have some localization running: https://groups.google.com/forum/#!msg... . I am using Hector SLAM but no luck yet.

RoboRos ( 2018-04-28 16:33:38 -0500 )


Stats

Asked: 2018-04-26 08:42:57 -0500

Seen: 771 times

Last updated: May 07 '18