
# Range of acceptable positions relative to a link

Hi

I have a Gazebo simulation of a robot arm with a parallel gripper (PhantomX Pincher Arm). Because grasping objects in Gazebo is sadly not really realistic or easy, I want to check whether the gripper is in the correct position, so that it would pick up the object if it closed its fingers.

I can retrieve the position & orientation of the object I want to pick up (pickMeUp) in my (Python) program through the /gazebo/link_states topic. This topic also gives me one of the two finger links of the gripper. I would now like to write a condition that checks whether the finger link is within an acceptable range relative to the link of the object to pick up (or vice versa, of course). So something like this:

if (pose_gripper_link.x < pose_pickMeUp_link.x + distance_in_x_direction_of_pickMeUp
        and ...):  # other conditions
    print("grasping would most likely be successful")


Here's a sketch of the situation: https://www.dropbox.com/s/ih20pt9h3z6...

How could I achieve this?

Note that I can only retrieve the Pose of pickMeUp via the /gazebo/link_states topic, as this link doesn't appear in my /tf tree. I can get the links of the gripper through tf though, and get the transformation between the /world coordinate system and the gripper-link coordinate system with the following code:

listener = tf.TransformListener()
try:
    # give the listener time to fill its buffer before the first lookup
    listener.waitForTransform("/gripper_finger_base_link", "/world", rospy.Time(), rospy.Duration(4.0))
    (trans, rot) = listener.lookupTransform("/gripper_finger_base_link", "/world", rospy.Time())
except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException) as e:
    print(e)


But I don't know how to apply this translation & rotation to the Pose of pickMeUp so that its Pose would be described in the /gripper_finger_base_link's coordinate system. I'm also not sure whether this is of any use to me, as /gazebo/link_states already gives me (one) gripper link relative to the /world coordinate system. I could also construct a unit vector (described in the /world coordinate system) in the direction of the x-axis of the pickMeUp link, in order to have something like the distance_in_x_direction_of_pickMeUp from above. This shouldn't be a big issue, but I don't want to reinvent the wheel if this functionality already exists in tf (or somewhere else).
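For reference, the math tf applies here is just a rigid-body transform: rotate the point by the quaternion, then add the translation. Below is a numpy-only sketch of applying the (trans, rot) pair returned by lookupTransform to a point; the same 4x4 matrix can also be obtained from tf.transformations.quaternion_matrix, so treat this as an illustration of the math rather than the only way to do it.

```python
import numpy as np

def transform_point(trans, rot, point):
    """Express `point` (given in the source frame) in the target frame,
    using (trans, rot) as returned by tf's lookupTransform:
    trans = (x, y, z), rot = quaternion (x, y, z, w)."""
    x, y, z, w = rot
    # rotation matrix of a unit quaternion (same convention as
    # tf.transformations.quaternion_matrix)
    R = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - z * w),     2 * (x * z + y * w)],
        [2 * (x * y + z * w),     1 - 2 * (x * x + z * z), 2 * (y * z - x * w)],
        [2 * (x * z - y * w),     2 * (y * z + x * w),     1 - 2 * (x * x + y * y)],
    ])
    return R.dot(np.asarray(point, dtype=float)) + np.asarray(trans, dtype=float)

# Example: a 90 degree rotation about z plus a translation of (1, 0, 0)
q_z90 = (0.0, 0.0, np.sin(np.pi / 4), np.cos(np.pi / 4))
p = transform_point((1.0, 0.0, 0.0), q_z90, (1.0, 0.0, 0.0))
# p is approximately (1, 1, 0)
```

With (trans, rot) looked up from /world into /gripper_finger_base_link as in the snippet above, feeding the pickMeUp position (which /gazebo/link_states reports in /world) into this function would give that position in the gripper finger's frame.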



Here is my solution. Because I had to restructure the gripper (another story), I couldn't even get one of the two finger links through the /gazebo/link_states topic, only another link belonging to the gripper (arm_wrist_flex_link). With this position and the tf-tree transformation from the arm_wrist_flex_link to the gripper_active_link, I could calculate the position of the right gripper finger link (gripper_active_link) in the /world coordinate system. I then calculated the unit vectors of the pickMeUp object (described in the /world coordinate system). With these vectors, I was able to define a cuboid in the /world coordinate system where the right gripper finger has to be in order to pick up the object. Finally, I check whether the position of the gripper is within this bounding box. If yes, the gripper is in the correct position.

This is how I ended up implementing it (shortened):

object_positions = rospy.wait_for_message('/gazebo/link_states', LinkStates, timeout=5) # array of poses (= position & orientation of some objects/links)
indices = [i for i, s in enumerate(object_positions.name) if 'pickMeUp' in s]
indexOfPickMeUp = indices[0]
pickup_pose = object_positions.pose[indexOfPickMeUp]
pickup_position = pickup_pose.position

# We can only get the arm_wrist_flex_link via the /gazebo/link_states topic, so we get the pose of this link (which doesn't move relative to the finger links) and use this arm_wrist_flex_link pose to calculate the position of the two finger links
indices = [i for i, s in enumerate(object_positions.name) if 'arm_wrist_flex_link' in s]
indexOfGripper = indices[0]
arm_wrist_pose = object_positions.pose[indexOfGripper]
# Getting the translation from arm_wrist_flex_link to gripper_active_link (we don't care about the rotation, as we are only interested in the position of gripper_active_link, not its orientation). As mentioned above, these two coordinate systems don't move relative to each other.
listener = tf.TransformListener()
try:
    listener.waitForTransform('/arm_wrist_flex_link', '/gripper_active_link', rospy.Time(), rospy.Duration(4.0))
    (trans, rot) = listener.lookupTransform('/arm_wrist_flex_link', '/gripper_active_link', rospy.Time())
except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException) as e:
    print(e)
# calculating the unit vectors (described in the /world coordinate system) of the arm_wrist_flex_link
quaternion = arm_wrist_pose.orientation
explicit_quat = [quaternion.x, quaternion.y, quaternion.z, quaternion.w]
rot_mat = tf.transformations.quaternion_matrix(explicit_quat)  # 4x4 homogeneous rotation matrix
ex = np.dot(rot_mat, np.matrix([[1], [0], [0], [1]]))
ex = ex[:3] / ex[3]
ey = np.dot(rot_mat, np.matrix([[0], [1], [0], [1]]))
ey = ey[:3] / ey[3]
ez = np.dot(rot_mat, np.matrix([[0], [0], [1], [1]]))
ez = ez[:3] / ez[3]
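As a side note, the columns of a rotation matrix are exactly the rotated frame's unit axes expressed in the parent frame, so the three matrix-vector products above can be replaced by direct column reads. A numpy-only sketch with a stand-in matrix (in the real code, the 4x4 matrix comes from tf as above):

```python
import numpy as np

# Stand-in 4x4 homogeneous rotation matrix (90 degrees about z),
# purely for illustration.
c, s = 0.0, 1.0
rot_mat = np.array([
    [c,  -s,  0.0, 0.0],
    [s,   c,  0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

# The frame's unit axes, expressed in the world frame, are the first
# three columns of the rotation block.
ex, ey, ez = rot_mat[:3, 0], rot_mat[:3, 1], rot_mat[:3, 2]
# ex == (0, 1, 0): this frame's x-axis points along world y
```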

# calculating the position of gripper_active_link (gripper_right_position, described in the world coordinate system)
gripper_right_position = np.matrix([[arm_wrist_pose.position.x],[arm_wrist_pose.position.y],[arm_wrist_pose.position.z]])
gripper_right_position = gripper_right_position + trans[0]*ex + trans[1]*ey + trans[2]*ez
print(gripper_right_position) # e.g. [[0.09] [0.013] [0.058]]

# calculating the unit vectors (described in the /world coordinate system) of the pickMeUp::link
quaternion = pickup_pose.orientation
explicit_quat = [quaternion.x, quaternion.y, quaternion.z, quaternion.w]
rot_mat = tf.transformations.quaternion_matrix(explicit_quat)
ex = np.dot(rot_mat, np.matrix([[1], [0], [0], [1]]))
ex = ex[:3] / ex[3]
ey = np.dot(rot_mat, np.matrix([[0], [1], [0], [1]]))
ey = ey[:3] / ey[3]
ez = np.dot(rot_mat, np.matrix([[0], [0], [1], [1]]))
ez = ez[:3] / ez[3]
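The final cuboid check described above is not part of the shortened listing. A minimal sketch of one way to do it (the helper name and the tolerances are made up for illustration): project the vector from the object to the gripper onto the object's unit axes and compare each component against per-axis limits.

```python
import numpy as np

def in_grasp_box(gripper_pos, object_pos, ex, ey, ez, limits):
    """Check whether gripper_pos lies inside a cuboid centred on object_pos
    and aligned with the object's unit axes ex/ey/ez.
    limits = ((xmin, xmax), (ymin, ymax), (zmin, zmax)) along those axes."""
    d = np.asarray(gripper_pos, dtype=float) - np.asarray(object_pos, dtype=float)
    for axis, (lo, hi) in zip((ex, ey, ez), limits):
        coord = float(np.dot(d, np.ravel(axis)))  # component of d along this axis
        if not (lo <= coord <= hi):
            return False
    return True

# Illustrative tolerances (not taken from the original post)
limits = ((-0.02, 0.02), (-0.01, 0.01), (0.0, 0.05))
ex, ey, ez = np.eye(3)  # object axes aligned with world, for this example
print(in_grasp_box((0.01, 0.0, 0.02), (0.0, 0.0, 0.0), ex, ey, ez, limits))  # True
```

With the values computed in the listing, gripper_right_position would take the role of gripper_pos, pickup_position that of object_pos, and the pickMeUp unit vectors that of ex/ey/ez.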