
how to make end effector look at object?

asked 2021-04-22 14:46:00 -0500 by hopestartswithu

updated 2022-05-28 17:07:06 -0500 by lucasw

So I wasted a lot of time on this and can't seem to solve it:

I have a UR10 robot that I want to move towards an object. I use TF/TF2 to get the position and orientation of the object and pass it to MoveIt via /compute_ik and joint-space goals. Moving to the point given by TF isn't a problem. The problem is that I have no idea how to make the end effector LOOK at the object. I also want the robot to approach the point rather than move into it (since it is supposed to be a real object on a table), and I don't know what kind of math to do here. I'm a complete beginner with ROS and haven't done math in a while, so sorry for the dumb question.

What I tried:

    import tf.transformations

    # current end-effector RPY -> quaternion
    rotation = self.move_group.get_current_rpy()
    quat_ee = tf.transformations.quaternion_from_euler(rotation[0], rotation[1], rotation[2])
    # object RPY (from TF) -> quaternion
    quat_object = tf.transformations.quaternion_from_euler(
        self.rotation_object[0], self.rotation_object[1], self.rotation_object[2])
    # compose the two rotations (order matters)
    self.new_quat = tf.transformations.quaternion_multiply(quat_ee, quat_object)

Then I pass new_quat to the orientation of the pose goal:

    self.pose_goal.pose.orientation = geometry_msgs.msg.Quaternion(*self.new_quat)

To be honest, I have absolutely no clue whether that even makes sense.

I use Python on ROS Melodic.

I would love to jump ship to C++, but university is keeping me busy right now. If anyone has an idea, I would really appreciate help in Python!


Comments

To move the robot to a point in front of the object you can use object.position.x - offset and object.position.y - offset; the orientation stays the same. Not sure if this answers your question; otherwise please clarify it.

crnewton  ( 2021-04-23 09:08:32 -0500 )

The problem with that is that the orientation is from before the robot moves to the object, so the final orientation would look the same as initially planned but offset by x and y, and therefore no longer be looking at the object. fvd's answer did help me a lot! But many thanks anyway.

hopestartswithu  ( 2021-04-23 15:43:57 -0500 )

1 Answer


answered 2021-04-23 09:48:00 -0500 by fvd

It looks like you're on the right track, and what you wrote made sense. You need to apply vector math and rotations. Rotations can be a bit tricky to sort through, but they are not rocket science either. Just remember that the order (and side) of multiplication counts and work your way through it. It helps to visualize orientations with tools like this or this. Remember that you can and should visualize TF frames in Rviz.

Remember also that the goal_pose defines the position and orientation that your endEffectorLink will be at. I find it helpful to define a frame at the tip of the gripper and use that.
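
If you only need such a tip frame for computing poses or for visualization, a minimal sketch with a tf2 static broadcaster is below (the parent frame name tool0 and the 10 cm offset are assumptions for illustration). Note that for MoveIt to plan with it as the end effector link, it has to exist in the URDF as a fixed link, as the comments below describe:

    #!/usr/bin/env python
    import rospy
    import tf2_ros
    import geometry_msgs.msg

    rospy.init_node("gripper_tip_broadcaster")

    broadcaster = tf2_ros.StaticTransformBroadcaster()
    t = geometry_msgs.msg.TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "tool0"        # UR flange frame (assumed name)
    t.child_frame_id = "gripper_tip"   # new frame at the fingertip
    t.transform.translation.z = 0.10   # e.g. 10 cm out along the flange z axis
    t.transform.rotation.w = 1.0       # identity rotation
    broadcaster.sendTransform(t)

    rospy.spin()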

You already use tf.transformations and tf_conversions, which are probably the most appropriate tools for rotations in ROS and Python. You can define an orientation in a local frame (e.g. the table) and assign it to the goal pose to have a "known good" orientation for your end effector.
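
As a sketch of that idea (the frame name "table" and the straight-down roll of pi are assumptions for illustration and depend on how your tool frame is oriented):

    import geometry_msgs.msg
    import tf.transformations

    # goal pose in an assumed local "table" frame: 20 cm above the object,
    # tool pointing straight down; tune the RPY until it looks right in Rviz
    goal = geometry_msgs.msg.PoseStamped()
    goal.header.frame_id = "table"
    goal.pose.position.z = 0.20
    quat = tf.transformations.quaternion_from_euler(3.14159, 0.0, 0.0)
    goal.pose.orientation = geometry_msgs.msg.Quaternion(*quat)

    # moveit_commander's set_pose_target accepts a PoseStamped and uses its
    # frame_id as the reference frame, and /compute_ik also takes a PoseStamped
    # self.move_group.set_pose_target(goal)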

Remember that you can also chain together elementary rotations to obtain the one you want. For example, you can first find the rotation in the ground plane, and then the angle at which you might look down at the object. Or if you don't want to do the math, you can just define the orientation manually as explained above. That's what I usually do.
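
To make that concrete, here is a rough sketch of the "rotation in the ground plane, then tilt" idea, combined with stopping a little short of the object as asked in the question. It assumes both positions are expressed in the same fixed frame (e.g. the planning frame) and that the tool frame "looks" along its local x axis; if yours looks along z, multiply an extra fixed rotation onto the result. The function name and the standoff value are just placeholders:

    import math
    import numpy as np
    import tf.transformations

    def look_at(ee_pos, obj_pos, standoff=0.15):
        """Return (position, quaternion): a pose `standoff` metres short of
        obj_pos whose local x axis points at the object. ee_pos and obj_pos
        are (x, y, z) in the same fixed frame."""
        d = np.asarray(obj_pos, dtype=float) - np.asarray(ee_pos, dtype=float)
        # 1) rotation in the ground plane, towards the object
        yaw = math.atan2(d[1], d[0])
        # 2) tilt down (or up) towards the object
        pitch = math.atan2(-d[2], math.hypot(d[0], d[1]))
        # fixed-axis roll-pitch-yaw, the tf.transformations default ('sxyz')
        quat = tf.transformations.quaternion_from_euler(0.0, pitch, yaw)
        # stop `standoff` metres before the object, along the line of sight
        position = np.asarray(obj_pos, dtype=float) - standoff * d / np.linalg.norm(d)
        return position, quat

You would then fill pose_goal.pose.position and pose_goal.pose.orientation from the returned values and send the pose to /compute_ik as before.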

And remember that the roll-pitch-yaw convention applies in the non-rotated coordinate system (the default order in the tf libraries you are using).
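
Concretely, the default in tf.transformations is the fixed-axis (static) 'sxyz' convention: roll about the fixed x axis, then pitch about the fixed y axis, then yaw about the fixed z axis:

    import tf.transformations

    # these two calls are identical: 'sxyz' (static x-y-z) is the default
    q1 = tf.transformations.quaternion_from_euler(0.1, 0.2, 0.3)
    q2 = tf.transformations.quaternion_from_euler(0.1, 0.2, 0.3, axes='sxyz')

    # an intrinsic (rotating-axis) convention such as 'rzyx' composes the
    # rotations in the body frame instead and generally gives a different result
    q3 = tf.transformations.quaternion_from_euler(0.1, 0.2, 0.3, axes='rzyx')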

Maybe this and this example code can be of help (not that it's best practice, but it's a point of reference).

Good luck!


Comments

Thanks for your answer! Knowing that I am on the right track is already a lot of help. Plus, the linked code helps a lot too! I will keep trying!

hopestartswithu  ( 2021-04-23 15:41:53 -0500 )

Okay, sorry for double commenting/spamming, but would you put that extra frame at the tip in the URDF, or is it possible to do it with a static transform alone? EDIT: I just added it to my URDF, set it as the end effector via the MoveIt Setup Assistant, and it works! Thanks a lot! Of course it's not perfect right now, but I guess I can get a bit further on my own now :)

hopestartswithu  ( 2021-04-23 16:35:40 -0500 )
