Obtaining joint angles from joint TFs?

asked 2015-02-13 15:34:10 -0600 by SaiHV

Hi, I am currently using openni_tracker to tele-operate a robot arm with an RGBD sensor. openni_tracker publishes a TF frame for every skeleton joint with respect to the camera frame, so I can obtain each joint's position and orientation. What confuses me is how to convert this into the relative orientation of each joint, which I can then pass to the robot controller. For each joint I am only interested in two angles: the up-down motion angle and the sideways angle, and I feel that TF is giving me far more data than is really relevant. Any suggestions would be very helpful.

Thanks, Sai


1 Answer

answered 2023-02-07 19:44:55 -0600 by tfoote

The information provided by TF is the starting point for what you need. Generally, you will want to compute the target point for the arm or manipulator from the coordinate frames, via the appropriate transforms.
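For example, here is a minimal Python sketch (ROS 1, tf) that looks up a tracked hand pose in the camera frame to use as the target point. The frame names 'openni_depth_frame' and 'left_hand_1' are placeholders; the actual names depend on your openni_tracker configuration and the tracked user ID.

    #!/usr/bin/env python
    # Minimal sketch: look up the pose of a tracked skeleton joint in the
    # camera frame via tf. Frame names are placeholders for whatever
    # openni_tracker publishes in your setup.
    import rospy
    import tf

    rospy.init_node('skeleton_target_listener')
    listener = tf.TransformListener()
    rate = rospy.Rate(10.0)

    while not rospy.is_shutdown():
        try:
            # trans is (x, y, z), rot is a quaternion (x, y, z, w),
            # both expressed in the camera frame.
            (trans, rot) = listener.lookupTransform(
                'openni_depth_frame', 'left_hand_1', rospy.Time(0))
            rospy.loginfo('hand position in camera frame: %s', str(trans))
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            rate.sleep()
            continue
        rate.sleep()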

Then you'll want to take that target point and provide it to an inverse kinematics (IK) solver, which will compute the joint positions of the robot's arm that achieve your goal. The IK solver uses your robot definition to generate those joint positions.
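As a rough sketch of that step, assuming a MoveIt setup exists for your robot (the planning group name 'arm' and the frame 'base_link' below are placeholders for your configuration):

    #!/usr/bin/env python
    # Sketch: ask MoveIt's IK service for joint positions that reach a
    # Cartesian target. Group name and frame are placeholders.
    import rospy
    from geometry_msgs.msg import PoseStamped
    from moveit_msgs.srv import GetPositionIK, GetPositionIKRequest

    rospy.init_node('ik_sketch')
    rospy.wait_for_service('/compute_ik')
    compute_ik = rospy.ServiceProxy('/compute_ik', GetPositionIK)

    target = PoseStamped()
    target.header.frame_id = 'base_link'   # placeholder robot base frame
    target.pose.position.x = 0.4
    target.pose.position.y = 0.1
    target.pose.position.z = 0.3
    target.pose.orientation.w = 1.0

    req = GetPositionIKRequest()
    req.ik_request.group_name = 'arm'      # placeholder planning group
    req.ik_request.pose_stamped = target

    resp = compute_ik(req)
    if resp.error_code.val == resp.error_code.SUCCESS:
        rospy.loginfo('joint names: %s', str(resp.solution.joint_state.name))
        rospy.loginfo('joint positions: %s', str(resp.solution.joint_state.position))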

The more canonical way to do this is to also hand the target to a motion planner, which computes the path from the arm's current configuration (A) to the goal configuration (B).
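With MoveIt that typically means using a move group, which runs IK and planning together. A short sketch, again assuming an 'arm' planning group:

    #!/usr/bin/env python
    # Sketch: plan and execute a motion to the target pose with
    # moveit_commander. 'arm' is a placeholder planning group name.
    import sys
    import rospy
    import moveit_commander

    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node('teleop_motion_planner')
    group = moveit_commander.MoveGroupCommander('arm')

    # Target as [x, y, z, qx, qy, qz, qw] in the planning frame.
    group.set_pose_target([0.4, 0.1, 0.3, 0.0, 0.0, 0.0, 1.0])
    success = group.go(wait=True)   # plan from the current state and execute
    group.stop()
    group.clear_pose_targets()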

Stats

Asked: 2015-02-13 15:34:10 -0600

Seen: 261 times

Last updated: Feb 07 '23