Robotics StackExchange | Archived questions

Generate trajectory with moveIt based on sensor inputs

Hey there,

I would like to ask whether it is possible to use MoveIt to generate a trajectory control law.

I would like to write some C++ code that uses the MoveIt library and some sensor input topics (images or point clouds) to control the end effector of this KUKA arm according to those sensor inputs.
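To make this more concrete, here is a minimal sketch of the kind of node I have in mind: a placeholder subscriber stands in for my perception code, and MoveIt's MoveGroupInterface does the planning and execution. The topic /detected_object_pose and the planning group name "manipulator" are placeholders of mine, not something from my current setup.

#include <ros/ros.h>
#include <geometry_msgs/PoseStamped.h>
#include <moveit/move_group_interface/move_group_interface.h>

// Placeholder for the result of my own image / point cloud processing.
geometry_msgs::PoseStamped latest_target;
bool have_target = false;

void targetCallback(const geometry_msgs::PoseStamped::ConstPtr& msg)
{
  latest_target = *msg;
  have_target = true;
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "sensor_driven_motion");
  ros::NodeHandle nh;
  ros::AsyncSpinner spinner(1);  // MoveGroupInterface needs a spinning callback queue
  spinner.start();

  // "manipulator" is a typical planning group name; adjust to your MoveIt config.
  moveit::planning_interface::MoveGroupInterface group("manipulator");

  // "/detected_object_pose" is a made-up topic standing in for the perception output.
  ros::Subscriber sub = nh.subscribe("/detected_object_pose", 1, targetCallback);

  ros::Rate rate(1.0);
  while (ros::ok())
  {
    if (have_target)
    {
      group.setPoseTarget(latest_target);
      group.move();  // plan + execute; sent to the controller via FollowJointTrajectory
      have_target = false;
    }
    rate.sleep();
  }
  return 0;
}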

What I have already achieved is connecting RViz (MoveIt) and Gazebo; you can also have a look at this via this link. I am able to plan and execute a trajectory in MoveIt, and it is then executed in Gazebo as well.

If somebody knows of examples of how to control a trajectory with the MoveIt library based on sensor input, I would be glad to hear about them.

Meanwhile, I read the following here:

move_group talks to the robot through ROS topics and actions. It communicates with the robot to get current state information (positions of the joints, etc.), to get point clouds and other sensor data from the robot sensors and to talk to the controllers on the robot.

Controller Interface

move_group talks to the controllers on the robot using the FollowJointTrajectoryAction interface. This is a ROS action interface. A server on the robot needs to service this action - this server is not provided by move_group itself. move_group will only instantiate a client to talk to this controller action server on your robot.
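To check that I understand this interface, here is a minimal sketch (my own, not from the documentation) that sends a single goal to such a FollowJointTrajectory action server using actionlib. The server name /arm_controller/follow_joint_trajectory matches my setup below; the joint names joint_a1 ... joint_a6 are an assumption based on the kuka_experimental KR5 description, so check your own URDF.

#include <ros/ros.h>
#include <actionlib/client/simple_action_client.h>
#include <control_msgs/FollowJointTrajectoryAction.h>
#include <trajectory_msgs/JointTrajectoryPoint.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "follow_joint_trajectory_client");

  // Connect to the action server exposed by the joint_trajectory_controller.
  actionlib::SimpleActionClient<control_msgs::FollowJointTrajectoryAction>
      client("/arm_controller/follow_joint_trajectory", true);
  client.waitForServer();

  control_msgs::FollowJointTrajectoryGoal goal;
  goal.trajectory.joint_names = {"joint_a1", "joint_a2", "joint_a3",
                                 "joint_a4", "joint_a5", "joint_a6"};

  trajectory_msgs::JointTrajectoryPoint point;
  point.positions = {0.0, -0.5, 0.5, 0.0, 0.5, 0.0};  // example joint angles [rad]
  point.time_from_start = ros::Duration(2.0);          // reach them within 2 s
  goal.trajectory.points.push_back(point);

  client.sendGoal(goal);
  client.waitForResult(ros::Duration(5.0));
  return 0;
}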

When I start my current KUKA arm via:

roslaunch kuka_kr5_gazebo rviz_connected_with_gz_using_moveit.launch

I currently use this controller (http://wiki.ros.org/joint_trajectory_controller) and an action server to connect RViz with Gazebo:

/arm_controller/command
/arm_controller/follow_joint_trajectory/cancel
/arm_controller/follow_joint_trajectory/feedback
/arm_controller/follow_joint_trajectory/goal
/arm_controller/follow_joint_trajectory/result
/arm_controller/follow_joint_trajectory/status

Should I also use this action server to execute a continuous trajectory according to my sensor input? (A minimal sketch of what I mean by "continuous" is below, after the topic info.)

rostopic info /arm_controller/follow_joint_trajectory/goal
Type: control_msgs/FollowJointTrajectoryActionGoal

Publishers: 
 * /move_group (http://markus:44193/)

Subscribers: 
 * /gazebo (http://markus:46419/)
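To clarify what I mean by a continuous trajectory from sensor input, here is a minimal sketch that streams single-point trajectory_msgs/JointTrajectory messages to /arm_controller/command, the command topic of the joint_trajectory_controller. The joint names joint_a1 ... joint_a6 and the computeJointTarget() helper are placeholders of mine; check your own URDF and perception code.

#include <ros/ros.h>
#include <trajectory_msgs/JointTrajectory.h>
#include <trajectory_msgs/JointTrajectoryPoint.h>
#include <vector>

// Placeholder: in reality this would come from processing the image / point cloud.
std::vector<double> computeJointTarget()
{
  return {0.0, -0.5, 0.5, 0.0, 0.5, 0.0};
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "stream_to_arm_controller");
  ros::NodeHandle nh;
  ros::Publisher pub =
      nh.advertise<trajectory_msgs::JointTrajectory>("/arm_controller/command", 1);

  ros::Rate rate(10.0);  // send an updated target 10 times per second
  while (ros::ok())
  {
    trajectory_msgs::JointTrajectory traj;
    traj.joint_names = {"joint_a1", "joint_a2", "joint_a3",
                        "joint_a4", "joint_a5", "joint_a6"};

    trajectory_msgs::JointTrajectoryPoint point;
    point.positions = computeJointTarget();
    point.time_from_start = ros::Duration(0.5);  // reach the target within 0.5 s
    traj.points.push_back(point);

    pub.publish(traj);  // each new message replaces the currently running trajectory
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}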

I also have several more topics running (just for additional information):

/joint_states
/move_group/cancel
/move_group/display_contacts
/move_group/display_planned_path
/move_group/feedback
/move_group/goal
/move_group/monitored_planning_scene
/move_group/ompl/parameter_descriptions
/move_group/ompl/parameter_updates
/move_group/plan_execution/parameter_descriptions
/move_group/plan_execution/parameter_updates
/move_group/planning_scene_monitor/parameter_descriptions
/move_group/planning_scene_monitor/parameter_updates
/move_group/result
/move_group/sense_for_plan/parameter_descriptions
/move_group/sense_for_plan/parameter_updates
/move_group/status
/move_group/trajectory_execution/parameter_descriptions
/move_group/trajectory_execution/parameter_updates

I just found some sources that do things similar to what I want to do:

Asked by Markus on 2018-05-20 01:48:50 UTC

Comments

What exactly are you trying to achieve? Visual servoing, dynamic pick-and-place, real-time collision avoidance? There are many ways a robot can be controlled using sensor input.

Asked by PeteBlackerThe3rd on 2018-05-21 08:07:34 UTC

I would like to write a service to which I can send a position for the end effector.
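A minimal sketch of such a service; the package name kuka_kr5_moveit, the MoveToPose.srv definition and the planning group name "manipulator" are my own placeholders, not something that exists in my setup:

// Hypothetical service definition, kuka_kr5_moveit/srv/MoveToPose.srv:
//   geometry_msgs/PoseStamped target
//   ---
//   bool success
#include <ros/ros.h>
#include <boost/bind.hpp>
#include <boost/function.hpp>
#include <moveit/move_group_interface/move_group_interface.h>
#include <kuka_kr5_moveit/MoveToPose.h>  // generated from the hypothetical srv above

bool moveToPose(kuka_kr5_moveit::MoveToPose::Request& req,
                kuka_kr5_moveit::MoveToPose::Response& res,
                moveit::planning_interface::MoveGroupInterface* group)
{
  group->setPoseTarget(req.target);
  // res.success reports whether planning + execution succeeded.
  res.success = (group->move() == moveit::planning_interface::MoveItErrorCode::SUCCESS);
  return true;
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "move_to_pose_server");
  ros::NodeHandle nh;
  ros::AsyncSpinner spinner(1);  // MoveGroupInterface needs a spinning callback queue
  spinner.start();

  moveit::planning_interface::MoveGroupInterface group("manipulator");  // adjust group name

  boost::function<bool(kuka_kr5_moveit::MoveToPose::Request&,
                       kuka_kr5_moveit::MoveToPose::Response&)> callback =
      boost::bind(&moveToPose, _1, _2, &group);
  ros::ServiceServer srv = nh.advertiseService("move_to_pose", callback);

  ros::waitForShutdown();
  return 0;
}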

Asked by Markus on 2018-05-21 11:53:52 UTC

Answers