
The way to go about this is with the MoveIt Task Constructor (MTC). In this tool, you use the concept of stages to build your end-to-end solution. You need to add the following stages to make your arm follow a straight line while maintaining orientation constraints:

  1. Current State Stage
  2. Move Relative Stage

In addition, you would need to figure out which axis in the MoveRelative stage corresponds to your line axis. The official tutorials provide a detailed setup on how to use stages for Pick and Place.

Edit:

@130s I have previously faced the issue of computeCartesianPath achieving only a small fraction of the requested path. Hence, we used MTC to tackle this, as we couldn't work out what was limiting the range of motion.
@angcorcue2 The following can still be achieved using MTC by combining the stages mentioned above.

I have attached a small ROS1 demo below, showing the structure of the stages required for your use case. These stages can easily be ported to ROS2 syntax by following the tutorial.

Prerequisites


  • We begin the pipeline with the CurrentState stage, wrapped in a PredicateFilter to ensure the object is not already attached to the arm:

    // Applicability Stage
    moveit::task_constructor::Stage* current_state_ptr = nullptr;
    {
      auto current_state =
        std::make_unique<moveit::task_constructor::stages::CurrentState>(
          "current state");
    
      auto applicability_filter =
          std::make_unique<moveit::task_constructor::stages::PredicateFilter>(
            "applicability test", std::move(current_state));
      applicability_filter->setPredicate([object](
        const moveit::task_constructor::SolutionBase& s, std::string& comment) {
        if (s.start()->scene()->getCurrentState().hasAttachedBody(object)) {
          comment = "object with id '" + object +
            "' is already attached and cannot be picked";
          return false;
        }
        return true;
      });
      current_state_ptr = applicability_filter.get();
      t.add(std::move(applicability_filter));
    }
    
  • We then configure a Connect stage, which links the current state to a SerialContainer consisting of additional substages.

    // Move to Pick Connect Stage - non-cartesian
    {
      auto stage = std::make_unique<moveit::task_constructor::stages::Connect>(
        "move to pick",
          moveit::task_constructor::stages::Connect::GroupPlannerVector{
            { arm_group_name_, sampling_planner } });
      stage->setTimeout(60.0);
      stage->properties().configureInitFrom(
        moveit::task_constructor::Stage::PARENT);
      t.add(std::move(stage));
    }
    
  • Our SerialContainer consists of a MoveRelative stage, which lets us reach the first pose in a straight line, and a GenerateCustomPose stage, which generates a grasp for said pose.

    // Pick object - serial container connecting the pipeline
    moveit::task_constructor::Stage* attach_object_stage = nullptr;
    {
      auto grasp =
        std::make_unique<moveit::task_constructor::SerialContainer>(
          "pick object");
      t.properties().exposeTo(grasp->properties(),
        { "eef", "hand", "group", "ik_frame" });
      grasp->properties().configureInitFrom(
        moveit::task_constructor::Stage::PARENT,
          { "eef", "hand", "group", "ik_frame" });
    
      // Approach object - substage
      {
        auto stage =
          std::make_unique<moveit::task_constructor::stages::MoveRelative>(
              "Approach object", cartesian_planner);
        stage->properties().set("marker_ns", "approach_object");
        stage->properties().set("link", hand_frame_);
        stage->properties().configureInitFrom(
            moveit::task_constructor::Stage::PARENT, { "group" });
        stage->setMinMaxDistance(
            approach_object_min_dist_, approach_object_max_dist_);
    
        // Set hand forward direction
        geometry_msgs::Vector3Stamped vec;
        vec.header.frame_id = hand_frame_;
        vec.vector.z = 1.0;
        stage->setDirection(vec);
        grasp->insert(std::move(stage));
      }
    
      // Generate grasp pose - substage
      {
        std::vector<geometry_msgs::PoseStamped> CustomPoses;
        geometry_msgs::PoseStamped p;
        p.header.frame_id = planning_frame_;
        p.pose = object_pose_;
        CustomPoses.push_back(p);
    
        auto stage =
          std::make_unique<
            moveit::task_constructor::stages::GenerateCustomPose>(
            "generate grasp pose");
        stage->setCustomPoses(CustomPoses);
        stage->properties().configureInitFrom(
            moveit::task_constructor::Stage::PARENT);
        stage->setMonitoredStage(current_state_ptr);
    
        // Compute IK - substage
        auto wrapper =
          std::make_unique<moveit::task_constructor::stages::ComputeIK>(
            "grasp pose IK", std::move(stage));
        wrapper->setMaxIKSolutions(1);
        wrapper->setMinSolutionDistance(1.0);
        wrapper->setIKFrame(grasp_frame_transform_, hand_frame_);
    
        wrapper->properties().configureInitFrom(
            moveit::task_constructor::Stage::PARENT, { "eef", "group" });
        wrapper->properties().configureInitFrom(
            moveit::task_constructor::Stage::INTERFACE, { "target_pose" });
        grasp->insert(std::move(wrapper));
      }
      t.add(std::move(grasp));
    }
    

    This results in the following motion:

Final Pose Parametric Adjustment


  • To achieve the second pose, we need to calculate the unit vector in question.
    For instance, let our first point be at (0.35, -0.15, 0.635), and our second point at (0.586, 0.096, 0.880):

    unit_vector = (dx/mag, dy/mag, dz/mag)
    
    where:
       - dx = (x2 - x1) 
       - dy = (y2 - y1) 
       - dz = (z2 - z1)
       - mag = sqrt(dx^2 + dy^2 + dz^2)
    

    Then by substituting these values into the above equation, we get:

    unit_vector = (0.562, 0.587, 0.584)  # (ux, uy, uz)
    
  • Next, we need to calculate the distance between these two points:

    distance = sqrt(dx^2 + dy^2 + dz^2) = 0.420
    
  • Finally, we substitute these values in our stage:

    // Move 3D - Stage
    {
      auto stage =
        std::make_unique<moveit::task_constructor::stages::MoveRelative>(
            "move 3d", cartesian_planner);
      stage->properties().set("marker_ns", "move_3d");
      stage->properties().set("link", hand_frame_);
      stage->properties().configureInitFrom(
          moveit::task_constructor::Stage::PARENT, { "group" });
      // setting minimum distance slightly lower to ensure max distance is reached
      stage->setMinMaxDistance(0.4, 0.42);
    
      // Set hand forward direction
      geometry_msgs::Vector3Stamped vec;
      vec.header.frame_id = hand_frame_;
      vec.vector.x = 0.562;
      vec.vector.y = 0.587;
      vec.vector.z = 0.584;
      stage->setDirection(vec);
      attach_object_stage = stage.get();
      t.add(std::move(stage));
    }
    

    This results in the following motion:

Things to Note


Cartesian motions across three axes simultaneously give a very low range of motion. If you do not strictly need to follow a straight line, it is recommended to use OMPL constrained planning instead.
Integration of orientation constraints with OMPL works very reliably in MTC stages.