
Range of acceptable positions relative to a link

asked 2018-05-09 10:12:11 -0500

RedJohn

updated 2018-05-09 10:37:21 -0500

Hi

I have a Gazebo simulation of a robot arm with a parallel gripper (PhantomX Pincher Arm). Because grasping objects in Gazebo is unfortunately neither realistic nor easy, I want to check whether the gripper is in a position from which closing the fingers would pick up the object.

In my (Python) program I can retrieve the position & orientation of the object I want to pick up (pickMeUp) through the /gazebo/link_states topic. This topic also gives me one of the two finger links of the gripper. I would now like to write a condition that checks whether the finger link is in an acceptable range relative to the link of the object to pick up (or vice versa, of course). So something like this:

if(pose_gripper_link.x < pose_pickMeUp_link.x + distance_in_x-direction_of_pickMeUp and ... (other conditions)):
  print("grasping would most likely be successful")
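For the simplest case, such a check can be phrased as a per-axis tolerance test. Here is a minimal sketch (not from the original post) assuming both positions are already expressed in the same frame and the tolerance box is axis-aligned; all values are illustrative:

```python
import numpy as np

# Hypothetical positions, both expressed in the same (world) frame.
gripper = np.array([0.09, 0.013, 0.058])       # gripper finger position
target = np.array([0.10, 0.010, 0.050])        # pickMeUp position
half_extents = np.array([0.02, 0.015, 0.03])   # allowed deviation per axis

# The grasp is plausible if the finger lies within the tolerance box
# centred on the object, checked axis by axis.
ok = np.all(np.abs(gripper - target) <= half_extents)
print("grasping would most likely be successful" if ok else "reposition the gripper")
```

If the object is rotated relative to the world frame, the difference vector first has to be projected onto the object's own axes, which is what the accepted approach below does.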

Here's a sketch of the situation: https://www.dropbox.com/s/ih20pt9h3z6...

How could I achieve this?


Note that I can only retrieve the pose of pickMeUp via the /gazebo/link_states topic, as this link doesn't appear in my /tf tree. I can get the links of the gripper through tf though, and get the transformation between the /world coordinate system and the gripper-link coordinate system with the following code:

listener = tf.TransformListener()
try:
    listener.waitForTransform("/gripper_finger_base_link", "/world", rospy.Time(), rospy.Duration(4.0))
    (trans, rot) = listener.lookupTransform("/gripper_finger_base_link", "/world", rospy.Time())
except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException) as e:
    print(e)

But I don't know how to apply this translation & rotation to the pose of pickMeUp, so that its pose would be described in the /gripper_finger_base_link coordinate system. I'm also not sure whether this is of any use to me, as /gazebo/link_states already gives me (one) gripper link relative to the /world coordinate system. I could also construct a unit vector (described in the /world coordinate system) along the x-axis of the pickMeUp link, in order to have something like the distance_in_x-direction_of_pickMeUp from above. This shouldn't be a big issue, but I don't want to reinvent the wheel if this functionality already exists in tf (or somewhere else).
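Applying the result of lookupTransform to a point by hand amounts to one rotation plus one translation. Below is a sketch in plain numpy (the quaternion values, translation, and point are made-up stand-ins for the listener's output and the pickMeUp position; the helper mirrors what tf.transformations.quaternion_matrix computes). Note that tf's frame-order convention in lookupTransform is easy to get backwards, so it's worth verifying the direction with a known point:

```python
import numpy as np

def quat_to_matrix(q):
    """3x3 rotation matrix from a quaternion (x, y, z, w); same
    convention as the upper-left block of tf.transformations.quaternion_matrix."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])

# Hypothetical stand-ins for lookupTransform's (trans, rot) output:
trans = np.array([0.1, 0.0, 0.3])
rot = [0.0, 0.0, 0.0, 1.0]  # identity quaternion (x, y, z, w)

# pickMeUp position in /world (e.g. taken from /gazebo/link_states).
pickup_world = np.array([0.2, 0.05, 0.0])

# Express the point in the target frame: p_target = R * p_world + t.
pickup_in_gripper = quat_to_matrix(rot).dot(pickup_world) + trans
print(pickup_in_gripper)  # [0.3, 0.05, 0.3] for these illustrative values
```

With a real tf setup, the same result comes from building a 4x4 matrix via tf.transformations.quaternion_matrix and tf.transformations.translation_matrix and multiplying it onto the homogeneous point.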


1 Answer


answered 2018-07-11 07:54:02 -0500

RedJohn

updated 2018-07-11 07:59:53 -0500

Here's my solution. Because I had to restructure the gripper (another story), I couldn't even get one of the two finger links through the /gazebo/link_states topic, only another link belonging to the gripper (arm_wrist_flex_link). With this position and the tf-tree transformation from arm_wrist_flex_link to gripper_active_link, I could calculate the position of the right gripper finger link (gripper_active_link) in the /world coordinate system. I then calculated the unit vectors of the pickMeUp object (described in the /world coordinate system). With these vectors I was able to define a cuboid in the /world coordinate system in which the right gripper finger has to be in order to pick up the object. Finally, I check whether the position of the gripper is within this bounding box. If it is, the gripper is in the correct position.

This is how I ended up implementing it (shortened):

object_positions = rospy.wait_for_message('/gazebo/link_states', LinkStates, timeout=5) # array of poses (= position & orientation of some objects/links)
indices = [i for i, s in enumerate(object_positions.name) if 'pickMeUp' in s]
indexOfPickMeUp = indices[0]
pickup_pose = object_positions.pose[indexOfPickMeUp]
pickup_position = pickup_pose.position

# We can only get the arm_wrist_flex_link via the /gazebo/link_states topic,
# so we take the pose of this link (which doesn't move relative to the finger
# links) and use it to calculate the position of the two finger links.
indices = [i for i, s in enumerate(object_positions.name) if 'arm_wrist_flex_link' in s]
indexOfGripper = indices[0]
arm_wrist_pose = object_positions.pose[indexOfGripper]
# Getting the translation from arm_wrist_flex_link to gripper_active_link.
# We don't care about the rotation, as we are only interested in the position
# of gripper_active_link (not its orientation). As mentioned above, these two
# coordinate systems don't move relative to each other.
listener = tf.TransformListener()
try:
    listener.waitForTransform("/arm_wrist_flex_link", "/gripper_active_link", rospy.Time(), rospy.Duration(4.0))
    (trans, rot) = listener.lookupTransform("/arm_wrist_flex_link", "/gripper_active_link", rospy.Time())
except (tf.LookupException, tf.ConnectivityException, tf.ExtrapolationException) as e:
    print(e)  # trans is used below, so don't silently swallow lookup failures
# calculating the unit vectors (described in the /world coordinate system) of the arm_wrist_flex_link
quaternion = arm_wrist_pose.orientation
explicit_quat = [quaternion.x, quaternion.y, quaternion.z, quaternion.w]
# Note: tf.transformations.quaternion_matrix(explicit_quat) would give the same
# rotation matrix directly, without the intermediate Euler angles.
roll, pitch, yaw = tf.transformations.euler_from_quaternion(explicit_quat)
rot_mat = tf.transformations.euler_matrix(roll, pitch, yaw)
ex = np.dot(rot_mat, np.matrix([[1], [0], [0], [1]]))
ex = ex[:3] / ex[3]
ey = np.dot(rot_mat, np.matrix([[0], [1], [0], [1]]))
ey = ey[:3] / ey[3]
ez = np.dot(rot_mat, np.matrix([[0], [0], [1], [1]]))
ez = ez[:3] / ez[3]

# calculating the position of gripper_active_link (gripper_right_position, described in the world coordinate system)
gripper_right_position = np.matrix([[arm_wrist_pose.position.x],[arm_wrist_pose.position.y],[arm_wrist_pose.position.z]])
gripper_right_position = gripper_right_position + trans[0]*ex + trans[1]*ey + trans[2]*ez
print(gripper_right_position) # e.g. [[0.09] [0.013] [0.058]]

# calculating the unit vectors (described in the /world coordinate system) of the pickMeUp::link
quaternion = pickup_pose.orientation
explicit_quat = [quaternion.x, quaternion.y, quaternion.z, quaternion.w]
roll, pitch, yaw = tf.transformations.euler_from_quaternion(explicit_quat)
rot_mat = tf.transformations.euler_matrix(roll, pitch, yaw)
ex = np.dot(rot_mat, np.matrix([[1], [0], [0], [1]]))
ex = ex[:3] / ex[3]
ey ...
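The truncated part repeats the unit-vector computation for pickMeUp and then performs the cuboid test described at the top of the answer. A minimal sketch of that final check (a reconstruction, not the original code; the helper name, extents, and positions are illustrative) could look like this:

```python
import numpy as np

def point_in_oriented_cuboid(p, center, axes, half_extents):
    """Check whether point p lies inside a cuboid given by its center,
    its three unit axis vectors (columns of `axes`), and half extents."""
    local = axes.T.dot(p - center)  # project the offset into the cuboid's frame
    return bool(np.all(np.abs(local) <= half_extents))

# Illustrative values: cuboid centred on pickMeUp, with axes built from the
# object's unit vectors (ex, ey, ez as computed above) and the grasp-region size.
center = np.array([0.10, 0.01, 0.05])
axes = np.eye(3)                              # columns = ex, ey, ez
half_extents = np.array([0.02, 0.015, 0.03])

gripper_right_position = np.array([0.09, 0.013, 0.058])
if point_in_oriented_cuboid(gripper_right_position, center, axes, half_extents):
    print("gripper is in the correct position")
```

Projecting onto the object's own axes is what makes the check work even when pickMeUp is rotated relative to /world.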
