
IK works in planning visualizer but often fails on real robot

asked 2011-08-16 11:18:59 -0500 by Pi Robot, updated 2011-08-16 11:20:32 -0500

Hello arm navigation gurus,

I am running the latest and greatest Electric arm_navigation Debian packages on an Ubuntu 10.04 machine. Using the planning wizard is a breeze and I have defined one kinematic group for my robot's 6-dof right arm (see here for URDF). Next I tried things out in the planning visualizer and the IK engine has no trouble finding solutions for random movements of the right hand (no gripper).

Turning to the real robot, I am using the dynamixel_motor stack to drive the arm and I have successfully tested the basic topics and services that control the servos. The joint states are published on the /joint_states topic at 20 Hz and moving the arm passively is reflected smoothly in RViz.

I then wrote a simple move_arm test script in Python that mimics the C++ tutorial found here. I have included the script below. The target pose in the script was taken from the FK while the arm was passively held in front of the robot and slightly bent. I then moved the arm into various joint positions and ran my test script manually to see if an IK solution could be found.

In some cases, a solution is found and the arm moves over a smooth trajectory to place the hand correctly. However, there seems to be only a small bubble around the desired pose where IK solutions can be found. This is in contrast to the planning visualizer, where solutions were not a problem. I'm guessing there is something missing from my test script related to initializing the environment scene, but I'm not sure how to do that with Python.
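In case it's relevant, here is my best guess at how the planning scene might be initialized from Python - untested, with the service name taken from the Electric arm_navigation tutorials:

import rospy
from arm_navigation_msgs.srv import SetPlanningSceneDiff, SetPlanningSceneDiffRequest

SET_PLANNING_SCENE_DIFF_NAME = '/environment_server/set_planning_scene_diff'

rospy.init_node('set_planning_scene_test')
rospy.wait_for_service(SET_PLANNING_SCENE_DIFF_NAME)
set_planning_scene = rospy.ServiceProxy(SET_PLANNING_SCENE_DIFF_NAME, SetPlanningSceneDiff)
# An empty diff just asks the environment server to publish the current planning scene
set_planning_scene(SetPlanningSceneDiffRequest())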

Any thoughts?

--patrick

Here now is my test script:

#! /usr/bin/python
import roslib
roslib.load_manifest('pi_dynamixel_controller')

import rospy
import actionlib
from actionlib_msgs.msg import *
from pr2_controllers_msgs.msg import *
from trajectory_msgs.msg import *
from dynamixel_controllers.srv import TorqueEnable
from arm_navigation_msgs.msg import *
from control_msgs.msg import FollowJointTrajectoryAction

def poseConstraintToPositionOrientationConstraints(pose_constraint):
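    """Convert a SimplePoseConstraint into separate position and orientation
    constraints; the position constraint region is a box whose side lengths
    are twice the requested position tolerances."""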
    position_constraint = PositionConstraint()
    orientation_constraint = OrientationConstraint()
    position_constraint.header = pose_constraint.header
    position_constraint.link_name = pose_constraint.link_name
    position_constraint.position = pose_constraint.pose.position
    position_constraint.constraint_region_shape.type = Shape.BOX
    position_constraint.constraint_region_shape.dimensions.append(2*pose_constraint.absolute_position_tolerance.x)
    position_constraint.constraint_region_shape.dimensions.append(2*pose_constraint.absolute_position_tolerance.y)
    position_constraint.constraint_region_shape.dimensions.append(2*pose_constraint.absolute_position_tolerance.z)

    position_constraint.constraint_region_orientation.x = 0.0
    position_constraint.constraint_region_orientation.y = 0.0
    position_constraint.constraint_region_orientation.z = 0.0
    position_constraint.constraint_region_orientation.w = 1.0

    position_constraint.weight = 1.0

    orientation_constraint.header = pose_constraint.header
    orientation_constraint.link_name = pose_constraint.link_name
    orientation_constraint.orientation = pose_constraint.pose.orientation
    orientation_constraint.type = pose_constraint.orientation_constraint_type

    orientation_constraint.absolute_roll_tolerance = pose_constraint.absolute_roll_tolerance
    orientation_constraint.absolute_pitch_tolerance = pose_constraint.absolute_pitch_tolerance
    orientation_constraint.absolute_yaw_tolerance = pose_constraint.absolute_yaw_tolerance
    orientation_constraint.weight = 1.0

    return (position_constraint, orientation_constraint)


def addGoalConstraintToMoveArmGoal(pose_constraint, move_arm_goal):
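    """Append the position and orientation constraints derived from
    pose_constraint to the goal_constraints of a MoveArmGoal."""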
    position_constraint, orientation_constraint = poseConstraintToPositionOrientationConstraints(pose_constraint)
    move_arm_goal.motion_plan_request.goal_constraints.position_constraints.append(position_constraint)
    move_arm_goal.motion_plan_request.goal_constraints.orientation_constraints.append(orientation_constraint)


rospy.init_node('place_hand', anonymous=True)
side = rospy.get_param("~side", "right")

move_arm = actionlib.SimpleActionClient("move_" + side + "_arm", MoveArmAction)

dynamixel_namespace = '/dynamixel_controller'
dynamixels = rospy.get_param(dynamixel_namespace + '/dynamixels', dict())

servo_torque_enable = list()

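# Create a torque_enable service proxy for each servo listed on the parameter server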
for name in sorted(dynamixels):
    torque_enable_service = dynamixel_namespace + '/' + name + '_controller/torque_enable'
    rospy.wait_for_service(torque_enable_service)  
    servo_torque_enable.append(rospy.ServiceProxy(torque_enable_service, TorqueEnable))

move_arm ...
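The remainder of the script just fills in a MoveArmGoal and sends it, following the C++ tutorial; roughly like this (the frame, link name, and target numbers below are placeholders, not my actual values):

move_arm.wait_for_server()

goal = MoveArmGoal()
goal.motion_plan_request.group_name = side + '_arm'
goal.motion_plan_request.num_planning_attempts = 1
goal.motion_plan_request.planner_id = ''
goal.planner_service_name = 'ompl_planning/plan_kinematic_path'
goal.motion_plan_request.allowed_planning_time = rospy.Duration(5.0)

desired_pose = SimplePoseConstraint()
desired_pose.header.frame_id = 'base_link'       # placeholder frame
desired_pose.link_name = 'right_wrist_link'      # placeholder tip link
desired_pose.pose.position.x = 0.2               # placeholder target taken from FK
desired_pose.pose.position.y = -0.2
desired_pose.pose.position.z = 0.3
desired_pose.pose.orientation.w = 1.0
desired_pose.absolute_position_tolerance.x = 0.02
desired_pose.absolute_position_tolerance.y = 0.02
desired_pose.absolute_position_tolerance.z = 0.02
desired_pose.absolute_roll_tolerance = 0.04
desired_pose.absolute_pitch_tolerance = 0.04
desired_pose.absolute_yaw_tolerance = 0.04

addGoalConstraintToMoveArmGoal(desired_pose, goal)

move_arm.send_goal(goal)
finished_within_time = move_arm.wait_for_result(rospy.Duration(30.0))
if not finished_within_time:
    move_arm.cancel_goal()
    rospy.loginfo('Timed out trying to achieve the goal')
elif move_arm.get_state() == GoalStatus.SUCCEEDED:
    rospy.loginfo('Goal achieved')
else:
    rospy.loginfo('move_arm failed with state %d' % move_arm.get_state())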

2 Answers


answered 2011-08-24 05:57:29 -0500 by egiljones

Patrick - sorry, didn't see your comment until now.

It's a little confusing to figure out how to set parameters for the KDL plugin - you have to set them both on the node and in the namespace for the group; this is so a single node can host several different kinematics solvers. Setting the following parameters for your arm makes IK take about 1.4 seconds when there's no solution on my desktop. I'm not positive whether it'll work better for you in random poses, but it's worth a try:

  <node pkg="arm_kinematics_constraint_aware" type="arm_kinematics_constraint_aware" name="pi_robot_right_arm_kinematics">
    <param name="group" type="string" value="right_arm" />
    <param name="right_arm/root_name" type="string" value="lower_torso_link" />
    <param name="right_arm/tip_name" type="string" value="right_wrist_link" />
    <param name="right_arm/max_solver_iterations" type="int" value="10000" />
    <param name="right_arm/max_search_iterations" type="int" value="10" />
    <param name="kinematics_solver" type="string" value="arm_kinematics_constraint_aware/KDLArmKinematicsPlugin" />
  </node>
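A quick way to confirm the parameters were actually picked up (names taken from the launch snippet above) is to read them back from the parameter server, e.g. from Python:

import rospy
rospy.init_node('check_ik_params', anonymous=True)
# Should print 10 if the node above loaded its parameters
print rospy.get_param('/pi_robot_right_arm_kinematics/right_arm/max_search_iterations')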

Comments

Thanks Gil--I hope to give it a try this weekend and will post back my results.
Pi Robot (2011-08-24 13:26:57 -0500)
I got to this sooner than I expected. The parameters definitely make a difference to the time it takes the IK to fail, but they don't seem to expand the solution space very much. Perhaps this is just a limitation of the numerical solver. I'm finding the OpenRAVE analytic solver to be very good, so I'll likely pursue that avenue for a while.
Pi Robot (2011-08-25 08:36:32 -0500)

answered 2011-08-17 04:36:33 -0500 by egiljones

Patrick,

This is a shortcoming of the current plugin used in arm_kinematics_constraint_aware. Because it is a numerical optimization solver, the KDL IK is very sensitive to the seed state at which the search begins. It will generally find a good solution quickly when the seed state is close to the goal state - this is why the mode you use in the planning_components_visualizer works so well: we constantly update the seed state so that each new pose is only slightly different from the old pose.

When you are invoking move_arm, the seed will be set by default to the current state of the robot's arm - as you drag the arm further and further away from the desired pose, the seed state gets worse and worse, and the numerical optimization may get stuck in a local minimum.

There are a couple of parameters you can tune in the IK plugin that will make it try harder to find solutions. The first is 'max_solver_iterations' - it's set to 500 by default, but you may get better performance by increasing it to 1000 or more. The second is 'max_search_iterations' - currently set to 3 by default. If KDL fails to converge within the requested number of solver iterations starting from the supplied seed state, it will randomly reseed the joints between their limits and try again, repeating this process up to the requested number of search iterations.
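Roughly, that search strategy behaves like the following Python sketch (illustrative only - not the actual arm_kinematics_constraint_aware implementation; solve_once stands in for a single KDL solve attempt):

import random

def search_ik(solve_once, seed_state, joint_limits,
              max_search_iterations=3, max_solver_iterations=500):
    # solve_once(seed, max_iterations) returns a joint solution or None
    seed = list(seed_state)
    for attempt in range(max_search_iterations):
        solution = solve_once(seed, max_solver_iterations)
        if solution is not None:
            return solution
        # No convergence from this seed: reseed every joint uniformly at
        # random within its limits and try again
        seed = [random.uniform(lo, hi) for (lo, hi) in joint_limits]
    return None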

The side-effect of increasing these parameters is that when KDL doesn't find a solution it can take a long time to return. On my machine the current time to return with no solution given the default parameters is about 10 milliseconds, which doesn't seem like a long time, but the PR2 analytic solver will return a failure in less than 10 microseconds.

The last option is to explore OpenRAVE's ikfast, which should provide analytical solutions quickly but is not currently integrated into the wizard pipeline. I'll be looking into integrating it but it probably won't make it into a release until F-turtle.


Comments

Thanks Gil--that clears it all up. I'll play with the parameters you describe. I have also successfully run OpenRAVE's ikfast with orrosplanning and it does indeed seem to find solutions in a larger configuration space. Taken together, this should keep me busy until F-turtle. :-)
Pi Robot (2011-08-17 04:51:11 -0500)
I thought I would be able to figure out where to change the max_solver_iterations and max_search_iterations parameters, but no matter where I place them in the autogenerated arm navigation launch or config files, they have no effect, even when I set them really large (like 100000). By "no effect" I mean that the arm planning terminal reports "No IK solution found" almost instantly, even with very large parameter values. So where exactly do I set these parameters? Thanks!
Pi Robot (2011-08-20 12:12:45 -0500)

Is this a problem that every ROS user of the arm_navigation utilities will encounter? Can the IK solver problem be solved completely, so that arm navigation is stable?

Albert K (2013-05-26 17:40:09 -0500)
