Let me suggest 2 approaches.

1) Get the object pose with the visp_tracker. Use the look_at_pose package to calculate the pose of the robot arm that would look at the object (and keep the camera upright). Send the arm to that pose with MoveIt!, then move forward towards the object a bit more (i.e. +x in the end-effector frame). This approach will be a little bit slow and choppy while you wait for MoveIt! plans and the intermediate moves. You might also get large swings of the arm, aka reconfigurations.

2) Get the object pose with visp_tracker. Use the jog_arm package to send a small step towards the object. This would essentially be a 6-dimensional proportional controller: x_cmd = k*(object_x - robot_x), likewise for the other dimensions. jog_arm was developed on a UR5 so I'm sure it would work on your robot. Upside to this approach: smooth, fast motion and no reconfigurations. Downsides to this approach: you need to figure out the controller and the camera probably won't remain upright.
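A minimal sketch of that proportional step, assuming poses are plain 6-element lists (x, y, z, roll, pitch, yaw). The function and variable names here are illustrative only, not part of the visp_tracker or jog_arm APIs; in practice you would fill `object_pose` from the tracker and publish the resulting step as a jog command.

```python
K = 0.5          # proportional gain (tune for your robot)
MAX_STEP = 0.01  # clamp each step to 1 cm / 0.01 rad so jogs stay small

def jog_step(object_pose, robot_pose, k=K, max_step=MAX_STEP):
    """Return a clamped 6-D proportional step: k * (target - current)."""
    cmd = []
    for target, current in zip(object_pose, robot_pose):
        step = k * (target - current)
        # Saturate so the incremental commands stay smooth.
        step = max(-max_step, min(max_step, step))
        cmd.append(step)
    return cmd

# Example: object 0.2 m ahead in x, 5 cm up in z.
print(jog_step([0.2, 0.0, 0.05, 0.0, 0.0, 0.0],
               [0.0, 0.0, 0.00, 0.0, 0.0, 0.0]))
```

Sending one clamped step per control cycle is what keeps the motion smooth; the gain and clamp are the two knobs you would have to tune.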

You could get creative and mix approach #1 and #2. Like, use look_at_pose to get a good arm pose, then use jog_arm to move towards it. That might be the best way to go.
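A hypothetical skeleton of that mix, with the three building blocks passed in as callables. None of these names come from the real packages: `get_object_pose` stands in for visp_tracker output, `compute_look_at_pose` for the look_at_pose service, and `send_jog_step` for a jog_arm command, all treated as black boxes.

```python
def servo_to_object(get_object_pose, compute_look_at_pose, send_jog_step,
                    current_pose, k=0.5, tol=0.005, max_iters=1000):
    """Jog toward a look-at target pose until every axis error is below tol."""
    for _ in range(max_iters):
        # Approach #1 supplies the target: a pose that looks at the object.
        target = compute_look_at_pose(get_object_pose())
        error = [t - c for t, c in zip(target, current_pose)]
        if max(abs(e) for e in error) < tol:
            return current_pose  # close enough; stop jogging
        # Approach #2 supplies the motion: one small proportional step.
        step = [k * e for e in error]
        current_pose = send_jog_step(current_pose, step)
    return current_pose
```

The loop re-queries the object pose every cycle, so it keeps tracking if the object moves, while the proportional step keeps the motion smooth.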

Also, the UR5 tends to run into joint limits and singularities more often than we'd all like. C'est la vie.

So, the tools you need are out there, but implementing it all is non-trivial, as @PeteBlackerThe3rd said.