How to do a simple pick motion with a robot arm and a 3D camera?

asked 2023-04-08 09:13:04 -0500

TapleFlib

updated 2023-04-08 11:39:22 -0500

  1. The project I'm currently working on : controlling a Techman robot arm to grab a hanging apple (the robot needs to know the x, y, z of the apple and move to that location)

  2. My progress : I added the flange, tool, and camera models to the URDF, along with their respective TFs. I can control the real robot arm through MoveIt by dragging the blue ball in the MoveIt RViz interface.

  3. My plan and problems :

     - I want to control the robot arm with Python or C++ code.
     - I want to get the x, y, z of the apple from my 3D camera, either with YOLOv5 or other image-processing software; I only need the camera pixel values plus depth. (Can I write this detection function and the robot-arm control in a single Python or C++ file?)
     - I want to convert the camera pixel values x, y, z into robot coordinates and send them to the arm. (This is where I'm clueless. Many people say tf or tf2 can do the conversion, but how? The URDF TF only knows the center of the camera model, not the exact position of the lens.)

  4. Version and platform : Ubuntu 20.04 , ROS Noetic , camera : Intel RealSense D455

  5. Techman Robot code link : https://github.com/TechmanRobotInc/tm...

I know it's a lot to ask, but I really need help here; I'm stuck.

Thanks in advance !


Comments

The camera must be publishing data with respect to a camera-related frame of reference, such as camera_link. You don't need to worry about the lens reference frame, since the camera link given in the URDF is well measured. You just have to transform the data from the camera frame to a robot frame such as base_link, and eventually the end-effector frame.

Anubhav Singh  ( 2023-04-08 14:52:33 -0500 )
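To make the comment above concrete: re-expressing a point in another frame is, under the hood, just a rotation plus a translation (this is what tf2's do_transform_point applies). A minimal numpy sketch; the rotation and translation values here are made up for illustration, and on a real robot you would get them from tf2_ros.Buffer.lookup_transform instead of hard-coding them:

```python
import numpy as np

def transform_point(point_cam, rotation, translation):
    """Re-express a point from the camera frame in the base frame:
    p_base = R @ p_cam + t."""
    return rotation @ np.asarray(point_cam) + np.asarray(translation)

# Hypothetical extrinsics: camera optical axis along the base frame's
# x axis, mounted 0.5 m above the base (illustrative, not measured).
R = np.array([[0.0,  0.0, 1.0],
              [-1.0, 0.0, 0.0],
              [0.0, -1.0, 0.0]])
t = np.array([0.0, 0.0, 0.5])

apple_in_cam = [0.1, -0.05, 0.8]   # x, y, depth seen by the camera (m)
apple_in_base = transform_point(apple_in_cam, R, t)
print(apple_in_base)                # point now in base_link coordinates
```

In a real node you would instead look up the camera_link → base_link transform from the TF tree, so any change to the URDF mounting is picked up automatically.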

Also, as far as getting the object's x, y, z is concerned: use a computer-vision algorithm to detect the centroid (x, y) of the detected apple, then use that (x, y) to look up the z value, since for a given (x, y) your 3D camera gives you a unique depth (z).

Anubhav Singh  ( 2023-04-08 15:01:36 -0500 )
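A minimal sketch of the centroid-plus-depth idea above, with a toy mask and depth image invented for illustration (in practice the mask would come from your detector and the depth image from the RealSense depth topic):

```python
import numpy as np

def centroid_and_depth(mask, depth_image):
    """Centroid pixel (u, v) of a binary mask, plus the depth at that pixel."""
    vs, us = np.nonzero(mask)           # row (v) and column (u) indices
    u, v = int(us.mean()), int(vs.mean())
    return u, v, float(depth_image[v, u])

# Toy 5x5 example: the "apple" occupies a 2x2 patch
mask = np.zeros((5, 5), dtype=bool)
mask[1:3, 2:4] = True
depth = np.full((5, 5), 0.8)            # pretend everything is 0.8 m away

print(centroid_and_depth(mask, depth))
```

A more robust variant would take the median depth over the whole mask rather than the single centroid pixel, since depth images can have holes at object edges.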

@Anubhav Singh , thanks for the reply!

  1. Can you give me a simple example of "the camera must be publishing data with respect to a camera-related frame of reference like camera_link"? I'm quite new to ROS and don't know how to implement it. Is it done by writing a Python publisher? How do I write the code? Can you give me a simple example, or do you know any tutorial about this?

  2. As for "you don't need to worry about the lens reference frame since the given camera link in the URDF is well measured": the camera_link is not from the company. I added it myself, and its TF is at the center of the camera. Is it still well measured?

  3. Lastly, for "you just have to transform data from the camera frame to a robot frame like base_link and eventually the end-effector frame": should I write code for this, or ...?

TapleFlib  ( 2023-04-08 23:59:29 -0500 )
  1. For the first part, go through this link.
  2. When you open the 3D camera in ROS, check the camera topics with rostopic list; you can find all the relevant camera frames there.
  3. You can transform data from one frame to another using tf2, but you first have to decide where you want to mount the camera. If it is mounted on the robot and moves with it, add the proper measurements to the URDF itself (e.g., the distance between eef_link and camera_link). Otherwise you can fix it somewhere else, so that the camera does not move with the robot; if there is no relative motion between the camera and the robot base, simply measure the offset between camera_link and the robot's base_link and put those values in the launch file. Check this for help.
  4. I don't get the question, but you may be talking about this.
Anubhav Singh  ( 2023-04-09 05:02:28 -0500 )
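For the fixed-camera case in point 3, the launch-file entry referred to is typically a tf2_ros static_transform_publisher. A sketch with made-up offsets (the args are x y z yaw pitch roll in meters/radians, followed by parent and child frame; measure your own camera pose instead of using these numbers):

```xml
<launch>
  <!-- Illustrative offsets only: replace with your measured values -->
  <node pkg="tf2_ros" type="static_transform_publisher"
        name="camera_to_base"
        args="0.1 0.0 0.5 0 0 0 base_link camera_link" />
</launch>
```

With this running, tf2 can resolve camera_link → base_link lookups without any extra code.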

For the last part, you can use YOLO to get the centroid of the detected object, and from that (x, y) get z from the camera data. Create a geometry_msgs/Pose message (with the quaternion set to 0, 0, 0, 1) and send this pose to the relevant MoveIt function, e.g. move_group.computeCartesianPath(), to complete your task.

Anubhav Singh  ( 2023-04-09 05:13:40 -0500 )
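A minimal sketch of building such a pose. The apple position is a made-up example, and the geometry_msgs/MoveIt lines are shown as comments so the arithmetic runs outside ROS (the Python name for the C++ computeCartesianPath() is MoveGroupCommander.compute_cartesian_path):

```python
# Target position of the apple in the robot base frame (illustrative values)
apple_x, apple_y, apple_z = 0.30, -0.10, 0.45

# Identity quaternion: keep the end effector's default orientation
qx, qy, qz, qw = 0.0, 0.0, 0.0, 1.0
assert abs(qx*qx + qy*qy + qz*qz + qw*qw - 1.0) < 1e-9  # must be unit length

pose = {
    "position": (apple_x, apple_y, apple_z),
    "orientation": (qx, qy, qz, qw),
}
print(pose)

# Inside a ROS node this would become:
#   p = geometry_msgs.msg.Pose()
#   p.position.x, p.position.y, p.position.z = pose["position"]
#   (p.orientation.x, p.orientation.y,
#    p.orientation.z, p.orientation.w) = pose["orientation"]
#   plan, fraction = move_group.compute_cartesian_path([p], 0.01, 0.0)
```

Note that (0, 0, 0, 1) leaves the gripper in the base frame's orientation; for a hanging apple you may eventually want a quaternion that points the gripper at the fruit.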

1 & 2. Yes, I have downloaded all the stuff, I can run it in RViz, and I tried the examples. But can I run the topics and get the values in Python code?

  3. Yes, I've decided where to mount the camera: it's mounted on the robot arm, I've already described it in the URDF file, the TF is at the center back of the camera (not the lens), and the joint between the camera and the flange is fixed.

  4. Sorry if it wasn't clear; this is not a question. I just wanted to tell you that I already have the x, y pixel coordinates (for example 300, 200, which I got from other image-processing software).

TapleFlib  ( 2023-04-09 08:35:55 -0500 )
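To turn a pixel like (300, 200) plus the RealSense depth into a 3-D point in the camera frame, the standard pinhole back-projection applies. The intrinsics fx, fy, cx, cy below are made up for illustration; on a D455 you would read them from the camera_info topic, or let librealsense's rs2_deproject_pixel_to_point do this for you:

```python
def pixel_to_point(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth (m) to camera-frame x, y, z."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return x, y, depth

# Illustrative intrinsics (not real D455 calibration values)
fx = fy = 600.0
cx, cy = 320.0, 240.0

print(pixel_to_point(300, 200, 0.8, fx, fy, cx, cy))
```

The resulting point is in the camera's optical frame; it still needs the tf2 transform into base_link before being sent to MoveIt.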

Yes, something like this; although in your case, since you are getting only x, y, z values, your quaternion would be 0, 0, 0, 1.

Anubhav Singh  ( 2023-04-09 08:41:00 -0500 )

Today I wrote the following Python code and succeeded in controlling the robot arm (imports included) :

import sys
import rospy
import moveit_commander
import geometry_msgs.msg

# Initialize MoveIt and the ROS node
moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node('move_group_python_interface_tutorial', anonymous=True)

# Planning group for the Techman arm
arm_group = moveit_commander.MoveGroupCommander('tmr_arm')
current_pose = arm_group.get_current_pose().pose
print("current pose : \n" + str(current_pose))

# Target pose for the end effector, expressed in the robot base frame
base_link = "base"
end_effector_pose = geometry_msgs.msg.PoseStamped()
end_effector_pose.header.frame_id = base_link
end_effector_pose.pose.position.x = 1.2463066739522848e-05
end_effector_pose.pose.position.y = -0.24444999511970975
end_effector_pose.pose.position.z = 0.891700040374037
end_effector_pose.pose.orientation.x = 0.7071065460943615
end_effector_pose.pose.orientation.y = 3.364734868792732e-05
end_effector_pose.pose.orientation.z = -3.0288478676105042e-05
end_effector_pose.pose.orientation.w = 0.707107010188150

....

TapleFlib  ( 2023-04-09 08:44:34 -0500 )