

Updated answer: I think what I'm looking for is LaserProjection::transformLaserScanToPointCloud() from the laser_geometry package.
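
For reference, here is a minimal sketch of how that function can be used. The fixed frame "odom", the topic names "scan"/"cloud", and the node name are just placeholders for whatever your setup uses; the point is that the projector interpolates the sensor pose over the duration of the sweep, which is exactly the de-warping described in the old answer below.

```cpp
// Minimal sketch: convert each LaserScan into a motion-compensated
// PointCloud2 in a fixed frame, using laser_geometry + tf.
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <sensor_msgs/PointCloud2.h>
#include <tf/transform_listener.h>
#include <laser_geometry/laser_geometry.h>

laser_geometry::LaserProjection g_projector;
tf::TransformListener* g_tf = NULL;
ros::Publisher g_cloud_pub;

void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
{
  // tf must cover the whole sweep, i.e. up to the time of the last beam.
  ros::Time sweep_end = scan->header.stamp +
      ros::Duration(scan->ranges.size() * scan->time_increment);
  if (!g_tf->waitForTransform("odom", scan->header.frame_id,
                              sweep_end, ros::Duration(0.2)))
    return;

  // Interpolates the laser pose across the sweep and outputs a
  // de-warped cloud expressed in the fixed "odom" frame.
  sensor_msgs::PointCloud2 cloud;
  g_projector.transformLaserScanToPointCloud("odom", *scan, cloud, *g_tf);
  g_cloud_pub.publish(cloud);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "scan_dewarp_node");
  ros::NodeHandle nh;
  tf::TransformListener tf_listener;
  g_tf = &tf_listener;
  g_cloud_pub = nh.advertise<sensor_msgs::PointCloud2>("cloud", 1);
  ros::Subscriber sub = nh.subscribe("scan", 1, scanCallback);
  ros::spin();
  return 0;
}
```

Note that the compensation only happens if the target frame is a fixed frame such as odom or map; projecting into the laser's own (moving) frame leaves the distortion in place.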

Old answer: So I asked this question about 3.5 years ago. I think the answer is "No, ROS will not perform any kind of motion compensation or de-warping on lidar scans that were taken on a moving base". In my experience, gmapping and hector_slam get confused when the robot moves at anything faster than a snail's pace, especially when it is turning.

Just to illustrate what I'm talking about:

When my robot (red arrow) is stationary in the corner of a hallway, this is what the lidar scan looks like. Very much as expected.

[image: lidar scan recorded while the robot is stationary]

However, once the robot starts rotating in place, i.e. turning, the data looks like this.

[image: lidar scan recorded while the robot rotates in place]

Am I the only one having this issue?

Granted, continuously turning in place is an extreme case, chosen here to illustrate the problem and to make solutions easy to evaluate, but the fact of the matter is that algorithms like hector_slam and gmapping do not handle the distorted scan data well at all. Rotational motion is the biggest issue; translation (i.e. just driving straight) causes far less confusion in those algorithms.

How to de-warp?

The basic idea is to use the robot's motion, i.e. its pose as a function of time, to de-warp the warped "image": each beam in the scan is taken at a slightly different time, so each beam has to be transformed using the pose the robot had at that instant.
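
This is not the exact code behind the screenshots below, just a minimal hand-rolled sketch of that idea. The fixed frame "odom", the "scan" topic, and the node name are assumptions; any fixed frame backed by reasonable short-term odometry would do.

```cpp
// Hand-rolled de-warping sketch: stamp every beam with its own time and
// transform it into a fixed frame, so the robot's motion during the sweep
// is compensated beam by beam.
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <geometry_msgs/PointStamped.h>
#include <tf/transform_listener.h>
#include <cmath>
#include <vector>

tf::TransformListener* g_tf = NULL;

void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
{
  // Make sure tf covers the whole sweep before touching individual beams.
  ros::Time sweep_end = scan->header.stamp +
      ros::Duration(scan->ranges.size() * scan->time_increment);
  if (!g_tf->waitForTransform("odom", scan->header.frame_id,
                              sweep_end, ros::Duration(0.2)))
    return;

  std::vector<geometry_msgs::PointStamped> dewarped;
  for (size_t i = 0; i < scan->ranges.size(); ++i)
  {
    float r = scan->ranges[i];
    if (r < scan->range_min || r > scan->range_max)
      continue;  // skip invalid returns

    // Each beam gets its own timestamp within the sweep.
    geometry_msgs::PointStamped p_laser;
    p_laser.header.frame_id = scan->header.frame_id;
    p_laser.header.stamp = scan->header.stamp +
        ros::Duration(i * scan->time_increment);
    double angle = scan->angle_min + i * scan->angle_increment;
    p_laser.point.x = r * std::cos(angle);
    p_laser.point.y = r * std::sin(angle);
    p_laser.point.z = 0.0;

    // Transform with the pose the robot had at this beam's time.
    geometry_msgs::PointStamped p_fixed;
    try
    {
      g_tf->transformPoint("odom", p_laser, p_fixed);
      dewarped.push_back(p_fixed);
    }
    catch (const tf::TransformException& ex)
    {
      ROS_WARN_THROTTLE(1.0, "de-warp failed for beam %zu: %s", i, ex.what());
    }
  }
  // "dewarped" now holds the motion-compensated points in the odom frame.
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "manual_dewarp_sketch");
  ros::NodeHandle nh;
  tf::TransformListener tf_listener;
  g_tf = &tf_listener;
  ros::Subscriber sub = nh.subscribe("scan", 1, scanCallback);
  ros::spin();
  return 0;
}
```

In practice it is easier to let laser_geometry do this (see the updated answer above), since it also produces a proper PointCloud2 in one call.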

The output of my solution looks like this (white is the output, yellow is the input):

[image: de-warped scan (white) overlaid on the raw, warped scan (yellow)]