How to make the end effector look at an object?

So I have spent a lot of time on this and can't seem to solve it:

I have a UR10 robot that I want to move towards an object. I use TF/TF2 to get the position/orientation of said object and hand it to MoveIt via /compute_ik and joint-space goals. Moving to the point given by TF isn't a problem. The problem is that I have no idea how to make the end effector LOOK at the object. I also want the robot to approach the point rather than move into it (since it's supposed to be a real object on a table). I don't know what kind of math to apply here. I'm also a complete beginner with ROS and haven't done math in a while. Sorry for the dumb question.

What I tried:

rotation = self.move_group.get_current_rpy()
quat_ee = tf.transformations.quaternion_from_euler(rotation[0], rotation[1], rotation[2])
quat_object = tf.transformations.quaternion_from_euler(self.rotation_object[0], self.rotation_object[1], self.rotation_object[2])

self.new_quat = tf.transformations.quaternion_multiply(quat_ee, quat_object)


Then I pass new_quat to the orientation of the pose goal:

self.pose_goal.pose.orientation = geometry_msgs.msg.Quaternion(*self.new_quat)


To be honest, I have absolutely no clue whether that even makes sense.

I use Python on ROS Melodic.

I would love to jump ship to C++, but uni is kinda tough right now. If anyone has a clue, I would really appreciate it if you could help me in Python!


To move the robot to a point in front of the object you can do object.position.x - offset and object.position.y - offset; the orientation stays the same. Not sure if this answers your question, otherwise please clarify it.

( 2021-04-23 09:08:32 -0600 )

The problem with that is that the orientation is taken from before the robot moves, so the final orientation looks the same as initially planned, just offset in x and y, and thus no longer points at the object. fvd's answer did help me a lot though. But many thanks!

( 2021-04-23 15:43:57 -0600 )


It looks like you're on the right track, and what you wrote made sense. You need to apply vector math and rotations. Rotations can be a bit tricky to sort through, but they are not rocket science either. Just remember that the order (and side) of multiplication counts and work your way through it. It helps to visualize orientations with tools like this or this. Remember that you can and should visualize TF frames in Rviz.
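For instance, a minimal sketch (the helper below re-implements what tf.transformations.quaternion_multiply does, in the same [x, y, z, w] convention, so it runs without ROS) makes the order-dependence concrete:

```python
import numpy as np

def quat_multiply(q1, q0):
    """Hamilton product q1 * q0 in [x, y, z, w] order, i.e. the same
    convention as tf.transformations.quaternion_multiply."""
    x0, y0, z0, w0 = q0
    x1, y1, z1, w1 = q1
    return np.array([
        w1 * x0 + x1 * w0 + y1 * z0 - z1 * y0,
        w1 * y0 - x1 * z0 + y1 * w0 + z1 * x0,
        w1 * z0 + x1 * y0 - y1 * x0 + z1 * w0,
        w1 * w0 - x1 * x0 - y1 * y0 - z1 * z0,
    ])

# Two 90-degree rotations: one about z, one about x.
s, c = np.sin(np.pi / 4), np.cos(np.pi / 4)
qz = np.array([0.0, 0.0, s, c])
qx = np.array([s, 0.0, 0.0, c])

# "z first, then x" is NOT the same rotation as "x first, then z".
print(quat_multiply(qx, qz))
print(quat_multiply(qz, qx))
```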

Remember also that the goal_pose defines the position and orientation that your endEffectorLink will be at. I find it helpful to define a frame at the tip of the gripper and use that.

You already use tf.transformations and tf_conversions which are probably the most appropriate tools for rotations in ROS and Python. You can define an orientation in a local frame (e.g. table) and assign it to the goal pose to have a "known good" orientation for your end effector.
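As an illustration of that "known good orientation in a local frame" idea, here is a hedged sketch: the frames, the 30-degree table yaw, and the straight-down grasp orientation are all made-up example values, and in ROS the table pose would come from TF rather than being hard-coded.

```python
import numpy as np

def Rz(a):
    """Rotation matrix about the z-axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Assumed example: the table frame is rotated 30 degrees about z
# relative to the robot base (in ROS you would read this from TF).
R_base_table = Rz(np.deg2rad(30))

# "Known good" grasp orientation, defined ONCE in the table frame:
# tool z-axis pointing straight down at the table (180 deg about x).
R_table_ee = np.array([[1.0,  0.0,  0.0],
                       [0.0, -1.0,  0.0],
                       [0.0,  0.0, -1.0]])

# Express that orientation in the base frame for the goal pose.
R_base_ee = R_base_table @ R_table_ee

# In ROS you would embed this in a 4x4 homogeneous matrix (np.eye(4))
# and convert it with tf.transformations.quaternion_from_matrix before
# assigning it to pose_goal.pose.orientation.
```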

Remember that you can also chain together elementary rotations to obtain the one you want. For example, you can first find the rotation in the ground plane, and then the angle at which you might look down at the object. Or if you don't want to do the math, you can just define the orientation manually as explained above. That's what I usually do.
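Following that idea, here is a minimal pure-Python sketch of "yaw in the ground plane, then pitch down, then stand off". The positions, the 15 cm standoff, and the assumption that the tool's x-axis is the "looking" direction are all made up for illustration; in ROS the resulting yaw/pitch pair can be turned into a quaternion with tf.transformations.quaternion_from_euler(0, pitch, yaw).

```python
import math

# Assumed example numbers (not from the question): everything in the
# robot base frame, z pointing up.
obj = (0.6, 0.3, 0.2)        # object position on the table
viewpoint = (0.0, 0.0, 0.5)  # point the approach ray starts from
standoff = 0.15              # stop 15 cm short of the object

dx = obj[0] - viewpoint[0]
dy = obj[1] - viewpoint[1]
dz = obj[2] - viewpoint[2]
dist = math.sqrt(dx * dx + dy * dy + dz * dz)

# Goal position: back off from the object along the approach direction,
# so the gripper stops in front of the object instead of moving into it.
goal = tuple(obj[i] - standoff * (obj[i] - viewpoint[i]) / dist
             for i in range(3))

# Orientation, as two chained elementary rotations:
#   1) yaw about z to face the object in the ground plane,
#   2) pitch about the rotated y-axis to tilt down toward it.
# Together they point the tool x-axis along (dx, dy, dz).
yaw = math.atan2(dy, dx)
pitch = math.atan2(-dz, math.hypot(dx, dy))

# In ROS: tf.transformations.quaternion_from_euler(0, pitch, yaw)
# (default 'sxyz' axes) yields exactly this Rz(yaw) * Ry(pitch) rotation.
```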

And remember that the roll-pitch-yaw convention applies in the non-rotated coordinate system (the default order in the tf libraries you are using).
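To see what that means in practice, here is a small NumPy check with arbitrary example angles: the default 'sxyz' convention applies roll, pitch, and yaw about the fixed (non-rotated) world axes, which composes as Rz @ Ry @ Rx, not as body-axis rotations in x-y-z order.

```python
import numpy as np

def Rx(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

roll, pitch, yaw = 0.3, -0.5, 1.2  # arbitrary example angles

# tf's default 'sxyz': roll about the FIXED world x, then pitch about
# the fixed world y, then yaw about the fixed world z = Rz @ Ry @ Rx.
R_fixed = Rz(yaw) @ Ry(pitch) @ Rx(roll)

# Rotating about the body axes in x-y-z order composes the other way
# round and is a different rotation for generic angles.
R_body = Rx(roll) @ Ry(pitch) @ Rz(yaw)

print(np.allclose(R_fixed, R_body))  # -> False
```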

Maybe this and this example code can be of help (not that it's best practice, but it's a point of reference).

Good luck!


Thanks for your answer! Knowing that I'm on the right track is already a lot of help, and the linked code helps a lot too! I will keep trying!

( 2021-04-23 15:41:53 -0600 )

Okay, sorry for double commenting/spamming, but would you put that extra frame at the tip in the URDF, or is it possible with a static transform alone? EDIT: I just added it to my URDF, set it as the end effector via the MoveIt Setup Assistant, and it works! Thanks a lot! Of course it's not perfect yet, but I guess I can get a bit further on my own now :)

( 2021-04-23 16:35:40 -0600 )
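For reference, a fixed joint like the following is roughly what the URDF route amounts to (link name, joint name, and the 15 cm offset are placeholders; on a stock UR10 the flange link is wrist_3_link). A static transform publisher would make the frame visible in TF, but MoveIt plans against the robot model from the URDF/SRDF, which is why adding the link there and selecting it in the Setup Assistant is the approach that works:

```xml
<!-- Sketch with placeholder names/offsets: a massless "tcp" frame fixed
     to the UR10 flange so MoveIt can use the gripper tip as its tip link. -->
<link name="tcp"/>
<joint name="tcp_joint" type="fixed">
  <parent link="wrist_3_link"/>
  <child link="tcp"/>
  <!-- translation from the flange to the gripper tip; adjust to your gripper -->
  <origin xyz="0 0 0.15" rpy="0 0 0"/>
</joint>
```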