# Removing Single Points from a sensor_msgs::PointCloud2

Hi,

I am trying to build a dynamic mask for a robot system using a Kinect2. My idea is to use a naive Euclidean distance search to remove the robot from the world (i.e. leave a hole in the cloud where the robot is).

Unfortunately I don't know how to do that. Can you help me?

This is the part that needs help:

```cpp
for (sensor_msgs::PointCloud2ConstIterator<float> it(output, "x"); it != it.end(); ++it) {
    // TODO: do something with the values of x, y, z
    try {
        transformStamped = tfBuffer.lookupTransform("kinect2_rgb_optical_frame",
                                                    "RobotCenter", ros::Time(0),
                                                    ros::Duration(2));

        CamRobX = transformStamped.transform.translation.x;
        CamRobY = transformStamped.transform.translation.y;
        CamRobZ = transformStamped.transform.translation.z;
        CamPointX = it[0];
        CamPointY = it[1];
        CamPointZ = it[2];

        distance = sqrt(pow(CamPointX - CamRobX, 2) +
                        pow(CamPointY - CamRobY, 2) +
                        pow(CamPointZ - CamRobZ, 2));
    }
    catch (tf2::TransformException &ex) {
        ROS_WARN("%s", ex.what());
        ros::Duration(1.0).sleep();
        continue;
    }

    // distance is measured to a frame in the middle of the room;
    // everything within this radius around the frame has to disappear
    if (distance < 0.4) {
        // TODO: remove this point from the cloud
    }
}

pub.publish(output);
```


Do you absolutely want to solve/implement this yourself, or would a pkg/plugin that does this also be ok?

(2019-02-20 13:06:25 -0500)


Given you've written the majority of the code for this, it's only a tiny bit of work to get it working. Just to give a bit of background, there are two different types of point cloud that PCL can represent: dense structured point clouds and sparse point clouds.

In dense point clouds, the points are stored in a grid structure, an NxM matrix of 3D points, with invalid points represented as vectors of NaNs. These types of point clouds are generally produced by depth cameras and stereo vision algorithms.

In sparse point clouds, the points are stored as a flat list in no specific order; these are more commonly generated by LIDAR sensors.

Getting back to your question: the Kinect produces dense structured point clouds, so you will be able to 'remove' a point by simply setting its x, y and z values to NaN. This means the grid structure of the points is preserved.

Hope this helps.

Update

You can set the values of a point using the iterator you are using in your loop as shown below:

```cpp
it[0] = std::numeric_limits<float>::quiet_NaN();
it[1] = std::numeric_limits<float>::quiet_NaN();
it[2] = std::numeric_limits<float>::quiet_NaN();
```

The iterator functions like a pointer to the point, allowing you to get and set its values.


That clears things up, thank you! Unfortunately I don't know how and where to set the NaN value. Also, this is already a sensor_msgs::PointCloud2. Can I do the same here?

(2019-02-21 07:09:48 -0500)

Yes, this should work. I've updated my answer with an example for you now.

(2019-02-21 07:24:24 -0500)

I tried that, but when I try to compile (catkin_make) the following error appears. It seems like the iterator won't let me overwrite the values for some reason, although it should...

(2019-02-21 07:46:41 -0500)

```
error: assignment of read-only location ‘it.sensor_msgs::PointCloud2ConstIterator<float>::<anonymous>.sensor_msgs::impl::PointCloud2IteratorBase<float, const float, const unsigned char, const sensor_msgs::PointCloud2_<std::allocator<void> >, sensor_msgs::PointCloud2ConstIterator>::operator
```

(2019-02-21 07:47:53 -0500)

> the iterator won't let me override the values for some reason although it should...

no, it shouldn't. It's a ConstIterator.

(2019-02-21 07:49:44 -0500)

So, how can I circumvent this?

(2019-02-21 07:52:02 -0500)

Ah, if you're working with the message passed to a callback, it will be a const (immutable) type. You should be able to create a non-const copy of the message and work on that:

```cpp
sensor_msgs::PointCloud2 newCloud = *cloud_parameter;
```

(2019-02-21 08:01:16 -0500)

Thank you all for your help! Now I know what I did wrong. First off: yes, changing single coordinates to NaN works, but as somebody else pointed out, I used the wrong iterator. Thankfully there is another iterator which doesn't have the "const" tag, and with that one I am able to change points.

Here is the improved code:

```cpp
for (sensor_msgs::PointCloud2Iterator<float> it(output, "x"); it != it.end(); ++it) {
    try {
        // Note: this lookup does not depend on the current point,
        // so it could also be hoisted out of the loop for speed.
        transformStamped = tfBuffer.lookupTransform("kinect2_rgb_optical_frame",
                                                    "RobotCenter", ros::Time(0),
                                                    ros::Duration(2));

        CamRobX = transformStamped.transform.translation.x;
        CamRobY = transformStamped.transform.translation.y;
        CamRobZ = transformStamped.transform.translation.z;
        CamPointX = it[0];
        CamPointY = it[1];
        CamPointZ = it[2];

        distance = sqrt(pow(CamPointX - CamRobX, 2) +
                        pow(CamPointY - CamRobY, 2) +
                        pow(CamPointZ - CamRobZ, 2));
    }
    catch (tf2::TransformException &ex) {
        ROS_WARN("%s", ex.what());
        ros::Duration(1.0).sleep();
        continue;
    }

    if (distance < 0.4) {
        // mask this point by overwriting its coordinates with NaNs
        it[0] = std::numeric_limits<float>::quiet_NaN();
        it[1] = std::numeric_limits<float>::quiet_NaN();
        it[2] = std::numeric_limits<float>::quiet_NaN();
    }
}

pub.publish(output);
```
