Hi, I've got pretty much the same problem as you. But I think you need to transform from your sensor frame to the map frame, not the other way around, because /map is what gets displayed in the end. I hope we get some help; I didn't understand how to apply the tutorials to my case either.

Greets

Edit:

Yesterday I got my mapping to work. Here are the steps I took:

I've created a transform publisher that broadcasts a tf describing the transform from my_moving_frame to my_fixed_frame. Since my transform changes over time (my robot is localized externally), I pass variables as arguments:

        import rospy
        import tf

        br = tf.TransformBroadcaster()
        br.sendTransform((x, y, z),  # translation from the external localization
                         tf.transformations.quaternion_from_euler(0, 0, k),  # yaw k
                         rospy.Time(0),
                         "my_moving_frame",  # child frame
                         "my_fixed_frame")   # parent frame
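
For reference, a pure-yaw quaternion like the one `quaternion_from_euler(0, 0, k)` produces can be computed by hand. Here is a minimal pure-Python sketch of that conversion (no ROS needed), assuming the (x, y, z, w) component ordering that tf uses:

```python
import math

def yaw_to_quaternion(k):
    """Quaternion (x, y, z, w) for a rotation of k radians about the z axis.

    With zero roll and pitch, only the z and w components are non-zero.
    """
    return (0.0, 0.0, math.sin(k / 2.0), math.cos(k / 2.0))

# Example: a 90-degree yaw
qx, qy, qz, qw = yaw_to_quaternion(math.pi / 2)
```

This is handy for sanity-checking what your broadcaster is actually sending.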

Then I wrote a node that publishes a PointCloud2 to the topic /narrow_stereo/points_filtered2, as configured in the octomap_mapping.launch file.
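
A PointCloud2 message stores its points as a flat byte array whose layout is described by the message's fields. As a rough illustration (plain Python struct, no ROS required), packing xyz points as three little-endian float32 values per point looks like this; the 12-byte point step matches the common bare x/y/z layout, but your cloud's fields may differ:

```python
import struct

# Each point: three little-endian float32 values (x, y, z) at
# offsets 0, 4, 8, so point_step = 12 bytes.
POINT_STEP = 12

def pack_points(points):
    """Pack (x, y, z) tuples into a PointCloud2-style data buffer."""
    return b"".join(struct.pack("<fff", x, y, z) for x, y, z in points)

def unpack_points(data):
    """Recover (x, y, z) tuples from a packed buffer."""
    return [struct.unpack_from("<fff", data, i)
            for i in range(0, len(data), POINT_STEP)]

data = pack_points([(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)])
```

In a real node you would put this buffer into the `data` field of a sensor_msgs/PointCloud2 along with matching `fields`, `point_step`, and a header whose `frame_id` is your sensor frame.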

In both the transform and the point-cloud publishers you need to pay attention to the timing. In my case I had to use rospy.Time(0) instead of rospy.Time.now() for the stamps.

Then you fire up your tf publisher, then your PointCloud2 publisher, then launch octomap_mapping.launch, and finally run rviz and display your map with the octomap plugins (via the displays menu at the top of the rviz window).

Hope this rough description gets you further.