ROS/MoveIt: adding rotation to xyz coordinates
Hello people,
I am currently programming a 6 DOF robotic arm using ROS and MoveIt for the navigation. My goal is to send x,y,z coordinates as well as a fixed orientation of my gripper in order to successfully grab certain objects.
Following the tutorial, I can send x,y,z coordinates like this just fine:
geometry_msgs::Pose target_pose1;
target_pose1.orientation.w = 1;
target_pose1.position.x = 0;
target_pose1.position.y = 0;
target_pose1.position.z = 0.55;
move_group.setPoseTarget(target_pose1);
Using this method, my end effector stays in its starting orientation.
I can also send an end-effector orientation like this:
move_group.setRPYTarget(1,1,0);
However, now the orientation changes to the given values, but the x,y,z coordinates seem rather random.
Using both commands in a row, the second one overwrites the first one, leading to an unsatisfactory result.
Right now I am stuck messing around with quaternion transformations using the tf package, not knowing whether this is the right way at all. It seems like this package is unable to combine both position and orientation as well.
Does somebody know a way to send x,y,z coordinates AND an rpy orientation (C++)?
Thank you in advance,
Felix
Setting a pose target is the correct way to do it. However, you have to make sure of two things. First, the pose must be reachable by the end-effector. Second, the quaternion must be valid, i.e. normalized and accurate to a fair number of decimal places (6+). Ensure both of these are satisfied, then report back.