Point cloud is transformed incorrectly
Hi all,
I'm not sure what information would be sufficient to answer this question, so please feel free to ask for more details.
The problem is as follows:
I have recorded a dataset with a Kinect mounted on a PR2, which provides the tf topic.
I then construct point clouds from the depth images and want to transform those point clouds into the world frame.
I know that the world frame is /odom_combined and the camera frame is /head_mount_kinect_rgb_link, so I look up the transform:
listener.lookupTransform("/head_mount_kinect_rgb_link", "/odom_combined", time, transform);
And apply it to the point cloud via:
pcl_ros::transformPointCloud(*filteredCloud,*filteredCloud,transform);
This works, but gives strange results. Since I recorded the dataset with the robot myself, I know exactly what the scene looks like and that the camera was only rotating with the robot base, so the floor should stay horizontal. In my case, however, whenever the robot turns, the floor starts tilting: it gradually comes to look like a wall and eventually like a ceiling. I know this may be a poor way to explain the situation, but for now it's the best I can do.
Does anyone have any idea what is happening there and what can be done to keep the floor horizontal?
Thanks in advance.
UPD: It seems like the wrong rotation is being applied to the point cloud: when the robot turns around the y axis, the point cloud is turned around the x (??) axis. That seems to be the problem. Could anyone suggest how to deal with that?
UPD2: I have added tf_frames.pdf to my Google Drive.