# Integrate free space into octomap

Hello,

I am looking into how octomap integrates information from a PointCloud2 into its octree structure, specifically free space reported by a LIDAR. Take the velodyne_pointcloud sensor simulator as an example: is there a way to take into account the laser beams that reported no obstacle, i.e. to mark that space as free?

I give a more general explanation below, but after reading the octomap_server code: in OctomapServer.cpp, does the last else branch of the insertScan method insert the free space of rays that travel their full length without hitting anything? And am I correct in my interpretation that laser beams that did not reach any obstacle are reported as NaN?

I understand that, for each point in the point cloud, the space between the sensor origin and that point is marked as free.
Is there a way to also mark as free the space where no obstacle generated a point in the cloud?
Since there are few obstacles in the world I am using, the completely free part of the world stays marked as unexplored.
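The per-ray integration described above can be sketched on a toy 2D grid. This is a simplified illustration only, not octomap's actual 3D implementation (octomap traverses voxel keys along each beam and applies probabilistic hit/miss updates); the function name and grid representation here are hypothetical:

```python
import math

def mark_ray(grid, origin, endpoint, cell_size=1.0):
    """Toy 2D stand-in for per-beam map updates: every cell the ray
    origin->endpoint passes through is labeled "free", and the cell
    containing the endpoint is labeled "occupied". Not octomap code."""
    ox, oy = origin
    ex, ey = endpoint
    dist = math.hypot(ex - ox, ey - oy)
    # sample the ray at half-cell steps so no traversed cell is skipped
    steps = max(1, int(dist / (cell_size * 0.5)))
    for i in range(steps):
        t = i / steps
        cx = int((ox + t * (ex - ox)) // cell_size)
        cy = int((oy + t * (ey - oy)) // cell_size)
        grid[(cx, cy)] = "free"
    grid[(int(ex // cell_size), int(ey // cell_size))] = "occupied"
    return grid
```

The key point for the question above: a beam that returns no point at all never enters this loop, so nothing along it is carved out as free.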

It's easier to see graphically.
Here is the occupied space with the ground, a wall, and a minor obstacle:

The robot is at the center of the purple "circle". Note its position relative to the wall: it is several maximum-resolution voxels away. The corresponding free space (from the same perspective) is shown here in green:

Looking closely, the image shows that there is much more explored free space between the UAV and the wall than in any other direction.
This could be due to the sensor's capabilities or placement. However, the sensor is a Velodyne (360º sensing), and the robot had just spun in place (a yaw rotation) to rule out any sensor blind spots.
Here is a closer view of the sensor:

I'm not sure whether this is simply octomap being designed to work only with the points present in the point cloud, or whether there is some parameter I can tune to change the data-integration behavior.



In case anyone has the same issue:
The octomap_server inserts only free space (no occupied endpoint) when the distance between the origin (obtained via tf) and the point in the point cloud is larger than the maximum range.

I'm working with a simulated Hokuyo sensor (LaserScan). In this case, laser beams that do not detect an obstacle report an inf value, which translates into a NaN in the point cloud (not a very-far-away point).

My solution: fork the laser_geometry package and create a node that translates inf into a point at maximum range.
In case anybody needs it: https://github.com/margaridaCF/laser_...
The repo where I am using this is https://github.com/margaridaCF/Flying... There is no directly launchable launch file because I'm using Docker with PX4 (which requires some setup), but the instructions are in the Jupyter notebook in the _generate_plots folder; look at try 9.
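The translation the node performs can be sketched roughly as follows. This is a hypothetical pure-Python stand-in (the actual node works with laser_geometry and PointCloud2 messages inside ROS):

```python
import math

def scan_to_points(ranges, angle_min, angle_increment, range_max):
    """Convert a planar LaserScan-style range array to 2D points,
    replacing inf/NaN readings (no return) with a point at range_max
    so downstream mapping can still carve out free space along those
    beams. Hypothetical stand-in for the ROS conversion node."""
    points = []
    for i, r in enumerate(ranges):
        if math.isinf(r) or math.isnan(r):
            r = range_max  # beam saw nothing: pretend a hit at max range
        angle = angle_min + i * angle_increment
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```

With max_range in octomap_server set to the same value, these synthetic endpoints fall exactly at the truncation distance, so only the free ray is inserted and no phantom obstacles appear.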

The main things are that your launch file has:
- <node name="free_cloud" type="free_cloud_node" pkg="laser_free_to_cloud"/>
- The input topic for octomap_server remapped to free_cloud, with the maximum range declared there matching the definition in the sensor model:

```xml
<node pkg="octomap_server" type="octomap_server_node" name="octomap_builder" output="screen">
  <param name="frame_id" type="string" value="map" />
  <remap from="cloud_in" to="/free_cloud" />
  <param name="latch" value="false" />
  <param name="publish_free_space" value="true"/>
  <param name="sensor_model/max_range" value="10"/>
</node>
```


A bit late perhaps, but wouldn't #q209018 have helped here? Using laser_filters would remove the need for your fork, it would seem.

(2018-04-12 11:11:24 -0500)

It looks promising, but I didn't know the package before :) If I end up switching to it I'll post the alternative solution. It did feel strange that no one had needed this before, a million times over :)

(2018-06-29 03:47:26 -0500)