# transform LaserScan to /world_frame

Hi all,

I'm trying to transform a LaserScan (already converted to a pcl::PointCloud) from its own frame into the /world frame.

What I'm doing right now is accumulating the point cloud in the /world frame while my robot moves (the ego-motion estimate gives the robot's position and orientation in the world). I get the orientation from an IMU and the position from wheel odometry.

The transformation (and point cloud accumulation) between the /laser frame and the /world frame works correctly for the Cartesian coordinates. However, when the robot rotates (pitch and roll) due to any inclination of the ground, the point cloud is not represented in the /world frame: its representation rotates along with the robot, whereas I expected the points to end up level with the world.

My laser points at the ground, and when the robot moves over flat ground the reconstruction matches reality exactly. The problem appears when the robot starts turning (it has suspension).

(The axes you see in the image belong to the laser; the red grid is the world.)

my tf tree:

/world->/base_footprint-> ... -> /laser_frame

I hope I made myself clear; it's not easy to explain. I don't know what I'm doing wrong, so I'd appreciate any help. :D

p.s. Sorry for so much text.


Do you convert the cloud to the world frame before or after you accumulate it?

( 2012-04-13 03:39:49 -0500 )

before the accumulation

( 2012-04-13 05:04:54 -0500 )

So your scanner is producing LaserScan messages in the laser_frame TF frame. Did you try visualizing the LaserScan directly in RViz, before accumulation + transformation to a point cloud? If the raw scan looks right, you can rule out problems with the IMU.

( 2012-04-13 22:01:43 -0500 )

If I visualize the LaserScan directly in RViz, the representation is similar. I don't know what I'm doing wrong.

( 2012-04-16 01:04:27 -0500 )

That's strange. Are you sure you are only visualizing the last scan? What exactly is wrong? Because when the robot tilts, the laser scan of course also records a tilted line. Could you post another picture or video?

( 2012-04-16 02:24:27 -0500 )

For example, when the car turns right, since it has suspension, it rolls as shown in the picture. However, the road was flat, so I expected the point cloud accumulation to "draw" the road on the red grid (/world).

( 2012-04-16 03:51:26 -0500 )

This might be a bit off track but what do you mean by "accumulate" the point cloud?

( 2014-07-24 08:18:06 -0500 )


I would suspect that your IMU data is not modifying the transform tree correctly. You should check that it is actually having the intended effect on the transforms.

It is very important to understand how your IMU works. Some IMUs calculate angular rotation with respect to the Earth. For example, an IMU that is world-referenced would obtain its z-rotation using a magnetometer. It would then use the gravity vector to calculate all rotations with respect to the Earth. A relative-reference IMU would simply calculate the amount that it has rotated about each axis since the last time you reset its gyros. Thus, if you reset the gyros with the robot upside down, that would be considered 0,0,0 rpy.

For example, on my robot, we use the IMU data (relative-reference) to calculate the transform from /base_footprint to /base_link. We mark /base_footprint as the point on the ground directly below the robot's center of mass/rotation. Thus, /base_footprint is modified by the position of the robot with respect to the static /odom frame. We obtain our z-axis rotation for /odom by resetting the gyros (so the IMU is now outputting 0 degrees for rotation about the z-axis) and then adding the heading obtained by our digital magnetometers. We then obtain x and y positions from the GPS. This sets our /odom frame with respect to the world (Earth).

After setting /odom with respect to the /world frame, all future calculations are performed between /odom and /base_footprint. This is because both wheel odometry and our IMU are relative to the starting point of the robot (/odom). Thus, the z-axis rotation from the IMU will set the heading of the robot with respect to /odom.

/base_link represents the center of mass/rotation for the robot. This frame's z-distance from /base_footprint is static (the robot cannot move up and down). This frame's rotation about the z-axis is fixed to /base_footprint's rotation. Finally, this frame's pitch (rotation about the y-axis) and roll (rotation about the x-axis) are calculated by the IMU.

I know this is very specific to my robot, but the intent is to illustrate the thought process you can use to check your IMU calculations. I'd be happy to clarify any points if necessary. I know this was a lot of information to digest.


Ok. I have two questions.

( 2012-04-13 10:29:24 -0500 )

What are they?

( 2012-04-13 11:27:54 -0500 )


Sorry for the previous post.

Question 1: What do you mean by "modifying the transform tree correctly"?

Question 2: Maybe my problem is that I'm missing a frame, or my understanding of the /world frame is wrong.

OK, I'll now explain what I've done. From the wheel odometry I calculate the robot pose in the world (I don't define any transformation at this point). To fuse the wheel odometry with the IMU (an Xsens), I use robot_pose_ekf (this package is responsible for publishing the transformation between /base_footprint and /world).

Simplified tf tree:

/world ->/base_footprint

/base_footprint -> /laser

/base_footprint -> /imu

With the tf tree defined, I do:

```cpp
p_listener->lookupTransform("/world", "laser_frame_id", ros::Time(0), transform);
pcl_ros::transformPointCloud(pcl_pc_laser, pcl_transformed, transform);
pcl_acc += pcl_transformed;
```

I thought this would let me accumulate the point cloud correctly in the /world frame. I don't know if the problem is the transformation itself or a wrong pose estimate.


Just a few quick comments:

1. You shouldn't use ros::Time(0), but the time the laser scan was taken (i.e., laserscan.header.stamp).
2. You shouldn't hard-code "laser_frame_id"; use laserscan.header.frame_id instead.
3. Do you run lookupTransform for each scan? You should.

( 2012-04-16 02:24:53 -0500 )

1. Solved, you're right.

2. It's not hardcoded, that was just to show it here.

3. The code shown is always called when a LaserScan message is received.

( 2012-04-16 03:58:10 -0500 )
