Orientation Constraints with no Position Constraint

asked 2021-01-19 22:10:18 -0500

DrewHamilton

updated 2021-01-23 08:27:25 -0500

Hi,

I am working on a homing routine for my robot arm in which I aim to keep the last few links vertical while one joint rotates to a target position. The idea is to plan a routine that maintains a near-vertical orientation for the entire arm while each joint homes itself, checking for collisions against imported planning-scene geometry.

I have attempted to use moveit_msgs::OrientationConstraint to set an orientation. I'm not wild about quaternions, so I've been converting from Euler angles or creating a quaternion from an axis and an angle. I am fairly certain my inputs are correct, but I am getting errors telling me something along the lines of:

Orientation constraint violated for link 'link6'. Quaternion desired: 0.707108 -0.000004 0.000004 0.707105, quaternion actual: 0.056829 0.275770 0.939794 0.193669, error: x=1.037871, y=0.215301, z=0.347481, tolerance: x=1.000000, y=1.000000, z=1.000000

Just to be clear, my intention is to set an orientation constraint, but no position constraint. It would be lovely if I could do this for links other than the end effector. I'm wondering if my approach is wrong, or maybe I am not correctly understanding how these orientation constraints work.

My code looks something like this:

move_group.setStartStateToCurrentState();

moveit::core::RobotStatePtr current_state = move_group.getCurrentState();

std::vector<double> joint_group_positions;
current_state->copyJointGroupPositions(joint_model_group, joint_group_positions);

// Set the joint at index 2 to a target rotation of 1 radian, to bring the arm out of its vertical position.
joint_group_positions[2] = 1.0;

// Set base joint target to its limit.
joint_group_positions[0] = joint_limit;

move_group.setJointValueTarget(joint_group_positions);   // Send joint targets

// To keep things simple, I just copy over the current pose, so I don't have to worry about my
// failure to understand quaternions just yet.
geometry_msgs::Pose start_pose = move_group.getCurrentPose().pose;

moveit_msgs::OrientationConstraint ocm;
ocm.link_name = "link6";
ocm.orientation = start_pose.orientation;

// Huge tolerance (1 radian, about 57 degrees) just to see if I get a result
ocm.absolute_x_axis_tolerance = 1.0;
ocm.absolute_y_axis_tolerance = 1.0;
ocm.absolute_z_axis_tolerance = 1.0;

ocm.header.frame_id = "base_link";

// Weighting factor for this constraint, denoting importance relative to
// other constraints (closer to zero means less important).
ocm.weight = 1.0;

moveit_msgs::Constraints vertical_constraint;
vertical_constraint.orientation_constraints.push_back(ocm);
move_group.setPathConstraints(vertical_constraint);

moveit::planning_interface::MoveGroupInterface::Plan my_plan;

... etc ...

As usual, my intention is to answer my own question explicitly here to help along future ROS travellers. After plugging in random values here and there willy-nilly, I did at one point manage to get the planned path to show up in RViz, but it was an absolutely bizarre 2000-point trajectory. Perhaps someone can lend me some advice.

Thanks in advance.


Comments

Just a quick note: I don't believe orientation constraints can be used in this way.

And personally I've always considered MoveIt to be a bit of a higher-level motion planning library. Homing to me is a low-level activity, perhaps even before "normal control" of the arm (whatever that means for your specific hw) is possible.

If you have the interface to your hw, it may be more appropriate to issue simple position setpoints or even JointTrajectory messages which rotate only a single joint.

If using this for instance to calibrate or zero encoders, velocity setpoints might be more suitable.

gvdhoorn  ( 2021-01-20 01:42:35 -0500 )

It's true that I probably confuse things by bringing homing into the question at all. There is quite a bit more going on with my approach to homing: I will be keeping track of the state of the robot before it is powered off, and I will be using very slow speeds and a different control loop on the MCU to check for limit switch activation. I'm just playing around with the idea of homing with some awareness of the planning scene and initial pose. That said, the heart of my question is less about homing and more about orientation constraints, which I intend to use later down the line for higher-level planning as well.

When you say orientation constraints can't be used in this way, what do you mean? Is there a way to plan joint motion, while constraining the orientation of links?

DrewHamilton  ( 2021-01-20 09:08:47 -0500 )