[MoveIt] With respect to the camera, move from one static position to another dynamic position

asked 2021-06-30 04:43:34 -0500 by Ranjit Kathiriya

updated 2022-04-30 13:20:14 -0500 by lucasw

Hello there,

I can identify the following positions from the camera:

  1. Gripper position
  2. BackLeft
  3. Backright
  4. FrontLeft

Currently, I am getting these positions from the camera, and I want to move the gripper to a particular teat position every time. For example, I will have two point-cloud points: 1. the gripper and 2. any teat location (let's say back right). Given the 3D coordinates of both points, how can I move my robot arm so that the gripper's cloud point touches the back-right teat's cloud point?

To sum up: I get the gripper coordinate and a teat coordinate from the point cloud; how can I move the robot arm so that the gripper reaches the teat position?



In-depth explanation of the issue:

Currently, I am using hand-eye calibration: I identify the teat and, based on the calibration, move my robotic arm with respect to the world (robot base) frame. This works fine for now.

I want to get rid of hand-eye calibration. Instead, the camera should detect both the gripper and the teat (which I can do). How can I move the gripper to the teat position if the camera detects both? The reason is that animals are unpredictable and can kick the camera, and farmers cannot afford to redo hand-eye calibration every time the camera moves.
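
For reference, below is a rough sketch (untested, ROS 1 / Python) of what I have in mind: detect both points in the camera frame, express their offset in the robot's planning frame via TF, and move the end effector by that offset with MoveIt. The frame name camera_color_optical_frame, the group name manipulator and the numeric coordinates are placeholders for my setup; note that transforming the offset into the planning frame still assumes the camera pose relative to the robot is available on TF, i.e. some form of extrinsic calibration.

    #!/usr/bin/env python
    # Rough sketch: move the gripper by the offset between two 3D points
    # detected in the camera frame. Frame/group names and the example
    # coordinates are placeholders.
    import sys
    import rospy
    import moveit_commander
    import tf2_ros
    import tf2_geometry_msgs  # registers PointStamped support for tf2
    from geometry_msgs.msg import PointStamped

    rospy.init_node("gripper_to_teat_sketch")
    moveit_commander.roscpp_initialize(sys.argv)
    group = moveit_commander.MoveGroupCommander("manipulator")  # assumed group name

    tf_buffer = tf2_ros.Buffer()
    tf2_ros.TransformListener(tf_buffer)
    rospy.sleep(1.0)  # let the TF buffer fill

    def point_in(frame, xyz):
        p = PointStamped()
        p.header.frame_id = frame
        p.header.stamp = rospy.Time(0)
        p.point.x, p.point.y, p.point.z = xyz
        return p

    camera_frame = "camera_color_optical_frame"  # assumed camera frame
    planning_frame = group.get_planning_frame()  # usually the robot base frame

    # Detected 3D points in the camera frame (placeholders for the detections).
    gripper_cam = point_in(camera_frame, (0.10, 0.05, 0.60))
    teat_cam = point_in(camera_frame, (0.02, -0.10, 0.55))

    # Express both points in the planning frame. This still needs a
    # camera->base transform on TF, i.e. some extrinsic calibration.
    gripper_base = tf_buffer.transform(gripper_cam, planning_frame, rospy.Duration(1.0))
    teat_base = tf_buffer.transform(teat_cam, planning_frame, rospy.Duration(1.0))

    # Offset that should bring the gripper point onto the teat point.
    dx = teat_base.point.x - gripper_base.point.x
    dy = teat_base.point.y - gripper_base.point.y
    dz = teat_base.point.z - gripper_base.point.z

    # Apply the offset to the current end-effector pose and move.
    target = group.get_current_pose().pose
    target.position.x += dx
    target.position.y += dy
    target.position.z += dz
    group.set_pose_target(target)
    group.go(wait=True)
    group.stop()
    group.clear_pose_targets()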


Comments

I'm not sure I understand your question. I haven't seen a question mark anywhere.

Also, these pictures crack me up every time, so make sure to keep including them. Thanks.

fvd (2021-07-03 02:46:45 -0500)

Thanks @fvd for the feedback on my question; I have updated it in detail. I think you will be able to understand the question now. Thanks again for your time.

Ranjit Kathiriya (2021-07-05 02:54:50 -0500)
@Ranjit: why did you remove the images?

gvdhoorn (2021-07-12 03:49:39 -0500)

I thought it was not of much use. If you want me to put it back, I can do that.

Ranjit Kathiriya (2021-07-12 03:51:15 -0500)
Your references to:

  2. BackLeft
  3. Backright
  4. FrontLeft

don't really make sense any more right now, nor the references to "teat" or what you are trying to do in general.

I would indeed suggest reverting the removal.

gvdhoorn (2021-07-12 03:55:45 -0500)

Thanks for the suggestion. Yes, you are right and I have done it.

Ranjit Kathiriya (2021-07-12 03:57:57 -0500)

1 Answer


answered 2021-07-12 02:50:30 -0500 by Ranjit Kathiriya

Assuming the camera is mounted on the robot arm somewhere, hand-eye calibration is unavoidable. It will give you the pose of the camera w.r.t. the robot base, bTc. Next, you get the pose of the teat w.r.t. the camera, cTt. And you know the current pose of the end effector (robot end point) w.r.t. the robot base, bTe. From these, you can calculate the pose of the teat w.r.t. the end effector as eTt = inv(bTe) x bTc x cTt. Then you move your end effector by that amount, which should bring it to the teat.
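
For completeness, here is a minimal numeric sketch of that chain using homogeneous 4x4 matrices and tf.transformations. The example poses below are placeholders: in practice bTc comes from the hand-eye calibration, cTt from the camera detection, and bTe from TF or MoveIt's get_current_pose().

    # Sketch: pose of the teat w.r.t. the end effector, eTt = inv(bTe) * bTc * cTt.
    # The example poses are placeholders.
    from tf.transformations import (quaternion_matrix, translation_matrix,
                                    concatenate_matrices, inverse_matrix,
                                    translation_from_matrix)

    def pose_to_matrix(xyz, quat_xyzw):
        """Homogeneous 4x4 transform from a translation and a quaternion."""
        return concatenate_matrices(translation_matrix(xyz),
                                    quaternion_matrix(quat_xyzw))

    bTc = pose_to_matrix((0.5, 0.0, 1.0), (0, 0, 0, 1))   # camera w.r.t. base (hand-eye calibration)
    cTt = pose_to_matrix((0.0, -0.1, 0.6), (0, 0, 0, 1))  # teat w.r.t. camera (detection)
    bTe = pose_to_matrix((0.4, 0.1, 0.3), (0, 0, 0, 1))   # end effector w.r.t. base

    eTt = concatenate_matrices(inverse_matrix(bTe), bTc, cTt)

    # Translation the end effector has to cover (in its own frame) to reach the teat.
    print("offset in end-effector frame:", translation_from_matrix(eTt))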

Thanks @samarth for providing the solution.

