UR5 visual servoing

asked 2018-01-20 12:19:25 -0500

Abdu

updated 2018-04-25 08:54:00 -0500

Hi

1- I can control the UR5 arm in RViz and Gazebo, and everything works fine.

2- I use vision_visp to detect the object, and I can see the object's pose with rostopic echo /visp_auto_tracker/object_position.

The issue is that I don't know how to make the end-effector of the UR5 track the detected object. I think I have to create a node that subscribes to the object's pose that is already published; this node would then send the control commands to the real UR5 arm.

Can anyone help with that? How do I start?
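
For reference, here is a minimal sketch of the subscriber node I have in mind (the PoseStamped message type is an assumption; check it with rostopic type /visp_auto_tracker/object_position):

```python
#!/usr/bin/env python
# Minimal sketch of the node described above: subscribe to the pose
# published by visp_auto_tracker and hand it to a (not yet written)
# controller. The PoseStamped type is an assumption -- verify it with
# `rostopic type /visp_auto_tracker/object_position`.
import rospy
from geometry_msgs.msg import PoseStamped

def object_pose_cb(msg):
    # TODO: turn the object pose into a command for the real UR5 here
    rospy.loginfo("object at x=%.3f y=%.3f z=%.3f",
                  msg.pose.position.x, msg.pose.position.y, msg.pose.position.z)

if __name__ == "__main__":
    rospy.init_node("ur5_visual_servo")
    rospy.Subscriber("/visp_auto_tracker/object_position", PoseStamped, object_pose_cb)
    rospy.spin()
```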


Comments

Hey Abdu, did you ever get this working? I need my UR5 to adjust its path from external force input, which is probably similar to impedance control. I also need the robot to learn and generalize to similar situations, but I can't figure out how to get neural-network output published as data that MoveIt! can use.

porkPy  ( 2019-02-07 16:38:55 -0500 )

Hello Abdu, how did you attach the camera to the end-effector? I am new to ROS and have been struggling with this for a few weeks. Can you give me some hints? Thank you very much!

reallllljy  ( 2019-05-04 22:26:28 -0500 )

@reallllljy I am not sure what you mean, but I simply placed it at the end-effector of the manipulator.

Abdu  ( 2019-05-05 19:26:07 -0500 )

Thanks for your reply, Abdu. Actually, I don't know how to place a camera at the end-effector. I guess it must be very easy for you, but it is still a big problem for me. Can you give me some hints? Thanks again.

reallllljy  ( 2019-05-05 21:11:58 -0500 )

@reallllljy Watch this video: you can see that they attach a camera at the end of the arm. You can use a normal webcam with a USB cable and attach it at the end.

Abdu  ( 2019-05-06 09:03:21 -0500 )

Thanks a lot, I will figure it out.

reallllljy  ( 2019-05-10 04:25:06 -0500 )

2 Answers


answered 2018-04-26 09:37:11 -0500

AndyZe

updated 2018-04-26 09:53:44 -0500

Let me suggest 2 approaches.

1) Get the object pose with the visp_tracker. Use the look_at_pose package to calculate the pose of the robot arm that would look at the object (and keep the camera upright). Send the arm to that pose with MoveIt!, then move forward towards the object a bit more (i.e. +x in the end-effector frame); a minimal MoveIt! sketch follows below. This approach will be a little slow and choppy while you wait for MoveIt! plans and the intermediate moves. You might also get large swings of the arm, aka reconfigurations.

2) Get the object pose with visp_tracker. Use the jog_arm package to send a small step towards the object. This would essentially be a 6-dimensional proportional controller: x_cmd = k*(object_x - robot_x), and likewise for the other dimensions (see the controller sketch below). jog_arm was developed on a UR5, so I'm sure it would work on your robot. Upside to this approach: smooth, fast motion and no reconfigurations. Downsides: you need to figure out the controller yourself, and the camera probably won't remain upright.
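
A minimal sketch of approach #1's MoveIt! step, assuming the standard UR5 MoveIt! config where the planning group is called "manipulator" (check your SRDF); the look_at_pose response is left as a placeholder:

```python
#!/usr/bin/env python
# Sketch of approach #1: send the arm to the "look at the object" pose
# with MoveIt!. "manipulator" is the usual UR5 planning-group name.
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("look_at_object_demo")
group = moveit_commander.MoveGroupCommander("manipulator")

def go_to(pose_stamped):
    """Plan to a geometry_msgs/PoseStamped and execute."""
    group.set_pose_target(pose_stamped)
    ok = group.go(wait=True)   # blocks until the move finishes
    group.stop()
    group.clear_pose_targets()
    return ok

# look_pose = <response of the look_at_pose service>
# go_to(look_pose)
# For the final approach, transform a small +x offset from the end-effector
# frame into the planning frame (e.g. with tf2) and call go_to() again.
```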
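And a sketch of approach #2's controller: a 6-dimensional proportional law with a dead zone, publishing a TwistStamped for jog_arm. The topic name is an assumption taken from the jog_arm defaults of the time; check your jog_arm config. Computing the pose error (e.g. via tf) is left as a placeholder:

```python
#!/usr/bin/env python
# Sketch of approach #2: proportional control with a dead zone.
import rospy
from geometry_msgs.msg import TwistStamped

K = 0.5          # proportional gain; tune on the real arm
DEADBAND = 0.01  # errors below this are treated as "close enough"

def error_to_twist(err):
    """err: [dx, dy, dz, drx, dry, drz], object pose minus end-effector pose."""
    cmd = TwistStamped()
    cmd.header.stamp = rospy.Time.now()
    v = [K * e if abs(e) > DEADBAND else 0.0 for e in err]
    (cmd.twist.linear.x, cmd.twist.linear.y, cmd.twist.linear.z,
     cmd.twist.angular.x, cmd.twist.angular.y, cmd.twist.angular.z) = v
    return cmd

if __name__ == "__main__":
    rospy.init_node("visual_servo_p_controller")
    pub = rospy.Publisher("jog_arm_server/delta_jog_cmds", TwistStamped, queue_size=1)
    rate = rospy.Rate(100)   # jog_arm expects a steady stream of commands
    while not rospy.is_shutdown():
        err = [0.0] * 6      # placeholder: object_pose - ee_pose
        pub.publish(error_to_twist(err))
        rate.sleep()
```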

You could get creative and mix approach #1 and #2. Like, use look_at_pose to get a good arm pose, then use jog_arm to move towards it. That might be the best way to go.

Also, the UR5 tends to run into joint limits and singularities more often than we'd all like. C'est la vie.

So, the tools you need are out there, but implementing it all is non-trivial, as @PeteBlackerThe3rd said.


Comments

Thanks. How do I send the end-effector to the pose that I get from look_at_pose? Is there any example?

Abdu  ( 2018-04-27 09:04:22 -0500 )

Hi Andy, the jog_arm example (Python) link is not valid any more. Can you please link it again? Thanks.

stevensu1838  ( 2019-05-27 21:16:33 -0500 )

answered 2018-04-25 15:19:29 -0500

PeteBlackerThe3rd

updated 2018-04-27 05:21:05 -0500

This really is a non-trivial task you're trying to undertake. The challenge is reactive velocity control of the arm, as opposed to deterministic goal-based control. This is still an active research topic as far as I'm aware.

MoveIt has the ability to perform basic velocity control, but this will result in extreme behaviour near singularities, which will throw your UR5 into safe mode. I have personal experience of doing exactly this.

Can you answer a couple of questions:

Are you using a wrist mounted camera or a statically mounted one?

Do you need the arm to follow the marker in real time?

Edit:

To implement real-time tracking you need to use velocity control of the arm. This is not too complicated to implement if you ignore the problem of singularities; I'll describe the theory below.

You'll need to create a control law that determines the desired angular and linear velocities of the end-effector based upon the relative position of the marker. The velocity should be higher the further the end-effector is from its desired location, and you'll want a dead zone around the perfect position to stop it oscillating there.

Once you've got this desired velocity (a twist message), you'll need to get MoveIt to calculate the Jacobian matrix of the manipulator in its current configuration. This matrix can be used to translate the velocity of the end-effector into a vector of joint velocities.

Now you simply pass these joint velocities to the UR driver to control the arm; a sketch of this pipeline follows below.

One limitation of this method is that it will happily run straight into a singularity; solving that is a whole new problem. Hope this helps.
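
A hedged sketch of that pipeline, assuming a moveit_commander version new enough to expose get_jacobian_matrix(), and the old ur_driver's /ur_driver/joint_speed interface mentioned in the comments below (a JointTrajectory whose first point's velocities are used). Verify both against your installed versions:

```python
#!/usr/bin/env python
# Sketch: end-effector twist -> joint velocities via the Jacobian.
import sys
import rospy
import numpy as np
import moveit_commander
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("twist_to_joint_speed")
group = moveit_commander.MoveGroupCommander("manipulator")
pub = rospy.Publisher("/ur_driver/joint_speed", JointTrajectory, queue_size=1)

def twist_to_joint_velocities(twist):
    """twist: 6-vector [vx, vy, vz, wx, wy, wz] of the end effector."""
    J = np.array(group.get_jacobian_matrix(group.get_current_joint_values()))
    # The pseudo-inverse maps the Cartesian twist to joint velocities;
    # near a singularity this blows up, so guard it in real use.
    return np.linalg.pinv(J).dot(np.array(twist))

def publish_joint_speed(qdot):
    msg = JointTrajectory()
    pt = JointTrajectoryPoint()
    pt.velocities = list(qdot)
    msg.points = [pt]
    pub.publish(msg)

rate = rospy.Rate(125)       # the UR controller cycle is 125 Hz
while not rospy.is_shutdown():
    twist = [0.0] * 6        # placeholder: output of the control law above
    publish_joint_speed(twist_to_joint_velocities(twist))
    rate.sleep()
```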


Comments

Thanks for sharing your answer. I want to do basic visual servoing / object tracking with a real UR5 6-DOF arm. I have a camera mounted at the end-effector and I need the arm to follow the marker in real time.

Abdu  ( 2018-04-25 16:12:34 -0500 )

What I have done so far: I publish the marker's pose as a ROS topic and I created another node to subscribe to it. I need to apply one of the visual servoing examples in order to calculate the 6 joint velocities. After that, I should publish the velocities on the topic /ur_driver/joint_speed.

Abdu  ( 2018-04-25 16:17:20 -0500 )

I am stuck on feeding the marker's pose into the right example. I am trying with this PBVS example, but I don't know yet how to make it work with my own input.

Abdu  ( 2018-04-25 16:22:27 -0500 )

See the edit, which explains what PBVS does in theory. It may be easier to implement the same thing from scratch, because you're using a different controller.

PeteBlackerThe3rd  ( 2018-04-27 05:23:03 -0500 )

Thanks for your explanation, I understand what you said. Do you have any example of visual servoing for a 6-DOF manipulator? My current goal is to find an example and then adapt it to my robot, the UR5.

Abdu  ( 2018-04-27 08:24:52 -0500 )

I don't know of any examples, I'm afraid; you might have to do this the hard way. I'd recommend building and testing the twist velocity generator first, then working on the EE-velocity-to-joint-velocity conversion. Good luck.

PeteBlackerThe3rd  ( 2018-04-27 13:59:05 -0500 )

Hello @PeteBlackerThe3rd, do you know how to use MoveIt to calculate the Jacobian matrix for converting from EE velocity to joint velocity?

aarontan  ( 2019-01-25 14:46:47 -0500 )

Hi Abdu, were you successful in implementing visual servoing with the UR5 arm? I need some help, as I am currently trying to implement visual servoing with an Aubo i5 robotic arm. How did you compute the joint velocities from the end-effector velocities to control the robot?

Seby Varghese  ( 2019-12-17 07:10:54 -0500 )
