Adrian, I got it to work by excluding the update step. I can see both distance and closest points on each object.

```
fcl::DistanceResult dist_result;
double d = fcl::distance(o1, o2, fcl::DistanceRequest(true), dist_result);
std::cout << "d: " << d << std::endl
          << "p1: " << dist_result.nearest_points[0] << std::endl
          << "p2: " << dist_result.nearest_points[1] << std::endl;
```

This will return something similar to:

```
d: 0.33571
p1: (1.21739 6.7476e-16 1.22103)
p2: (0.908283 6.76542e-16 1.352)
```

I am not 100% sure yet what frame the points are given in, but the values suggest the current world frame. Remember that each CollisionObject passed into this function carries both geometry and transform data, and my current assumption is that each object's transform is expressed in the world frame that MoveIt is using.

Update (Oct 29, 2013):

The points are given in the planning frame. This is normally the base frame of the robot, but it can be changed by adding a virtual frame in the robot's SRDF, which MoveIt uses when creating the planning scene.
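For reference, such a frame is declared as a virtual joint in the SRDF; a minimal sketch (the joint, frame, and link names below are hypothetical, not from the original post) might look like:

```xml
<!-- In the robot's .srdf: fixes the robot's base to a "world" frame,
     which then becomes the root of MoveIt's planning frame tree. -->
<virtual_joint name="world_to_base" type="fixed"
               parent_frame="world" child_link="base_link"/>
```

With such a joint in place, the nearest points from the distance query are reported relative to that parent frame rather than the robot's base link.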

ROS Answers is licensed under Creative Commons Attribution 3.0. Content on this site is licensed under a Creative Commons Attribution Share Alike 3.0 license.