Obtaining joint angles from joint TFs?
Hi, I am currently using openni_tracker to tele-operate a robot arm with an RGBD sensor. openni_tracker publishes a TF for every joint with respect to the camera frame, so I can obtain each joint's position and orientation. What confuses me is how to convert this into the relative orientation of each joint, which I can then pass to the robot controller. For each joint I am only interested in two angles: the up-down angle and the sideways angle, so I feel that TF is giving me far more data than is actually relevant. Any suggestions would be very helpful.
Thanks, Sai
Asked by SaiHV on 2015-02-13 16:34:10 UTC
Answers
The information provided by TF is the starting point for what you need. Generally, you will want to compute the target point for the arm or manipulator from the coordinate frames, via the appropriate transforms.
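As a side note on the original question of reducing a full TF orientation to two angles: one common approach is to compute the rotation of each child joint frame relative to its parent frame and extract just the pitch (up-down) and yaw (sideways) components, discarding the twist. A minimal self-contained sketch, assuming quaternions in (w, x, y, z) order taken from the camera-frame TFs (in a running ROS system you would get these from a TF listener):

```python
import numpy as np

def quat_mul(q1, q2):
    # Hamilton product of two quaternions given as (w, x, y, z)
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    # Conjugate = inverse for a unit quaternion
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def relative_pitch_yaw(q_parent, q_child):
    """Rotation of the child joint frame expressed in the parent frame,
    reduced to two angles: pitch (up-down) and yaw (sideways).
    Roll (twist about the limb axis) is intentionally discarded."""
    q_rel = quat_mul(quat_conj(q_parent), q_child)
    w, x, y, z = q_rel
    # Standard ZYX Euler-angle extraction from a unit quaternion
    pitch = np.arcsin(np.clip(2.0 * (w*y - z*x), -1.0, 1.0))
    yaw = np.arctan2(2.0 * (w*z + x*y), 1.0 - 2.0 * (y*y + z*z))
    return pitch, yaw
```

The axis conventions here are an assumption; which two Euler components map to "up-down" and "sideways" depends on how the tracker orients its joint frames, so you may need to permute the axes for your setup.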
Then you will want to take that target point and provide it to an inverse kinematics (IK) solver, which will compute the joint positions that achieve your goal. The IK solver uses your robot definition (e.g. the URDF) to generate those joint positions.
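To make the IK step concrete: in practice you would use a real solver (e.g. KDL or MoveIt against your URDF), but the idea can be sketched with the textbook analytic solution for a planar two-link arm. The link lengths and the elbow-down choice here are assumptions for illustration only:

```python
import numpy as np

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Analytic IK for a planar 2-link arm: given a target point (x, y)
    and link lengths l1, l2, return (shoulder, elbow) angles in radians
    for the elbow-down solution."""
    r2 = x*x + y*y
    # Law of cosines gives the elbow angle
    cos_elbow = (r2 - l1*l1 - l2*l2) / (2.0 * l1 * l2)
    if abs(cos_elbow) > 1.0:
        raise ValueError("target out of reach")
    elbow = np.arccos(cos_elbow)
    # Shoulder angle = direction to target minus the wedge angle of link 2
    shoulder = np.arctan2(y, x) - np.arctan2(l2 * np.sin(elbow),
                                             l1 + l2 * np.cos(elbow))
    return shoulder, elbow
```

A real arm has more joints and works in 3D, so the solver is numeric rather than analytic, but the contract is the same: Cartesian target in, joint positions out.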
The more canonical approach is to also hand the target to a motion planner, which computes the path from A to B.
Answered by tfoote on 2023-02-07 20:44:55 UTC