How to recognize a "fake" human hand in Gazebo based on point cloud data?
ROS distro: Melodic. Linux distro: Ubuntu 18.04
I am running a PR2 simulation and I need to recognize a human hand in the environment. The most straightforward way seems to be to use the Kinect on the PR2 to get point cloud data, extract the point clouds of hands, and then train a classifier on them using machine learning.
Right now I can save the sensor_msgs/PointCloud2 messages to a .pcd file (the standard file format of the Point Cloud Library, PCL) by following the official tutorial: link text
But how do I extract the region of interest that contains the "human hand" points from such .pcd files?
Or is there a better approach to this virtual hand recognition problem in Gazebo?
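One idea (a minimal sketch, not a full solution): since this is a simulation, the hand model's pose is known to Gazebo (e.g. via the model states), so the hand points could be labeled by cropping the cloud with an axis-aligned bounding box around that pose. The center and box size below are hypothetical values, and the cloud is a toy stand-in for points parsed from a .pcd file:

```python
import numpy as np

def crop_box(points, center, half_extents):
    """Return the subset of points inside an axis-aligned box.

    points: (N, 3) array of XYZ coordinates (e.g. read from a .pcd file).
    center, half_extents: length-3 sequences; the box spans
    center - half_extents .. center + half_extents on each axis.
    """
    points = np.asarray(points)
    lo = np.asarray(center) - np.asarray(half_extents)
    hi = np.asarray(center) + np.asarray(half_extents)
    # Keep only points whose x, y, and z all fall inside the box
    mask = np.all((points >= lo) & (points <= hi), axis=1)
    return points[mask]

# Toy cloud: two points near a hypothetical hand pose, one background point
cloud = np.array([[0.52,  0.01, 0.75],
                  [0.48, -0.02, 0.78],
                  [2.00,  1.00, 0.50]])
hand_points = crop_box(cloud, center=[0.5, 0.0, 0.76], half_extents=[0.1, 0.1, 0.1])
print(len(hand_points))  # -> 2
```

In real use, the same cropping is available in PCL as the CropBox filter, and the cropped points could serve as automatically labeled training data for the classifier.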
Here is a picture describing the problem: https://ibb.co/rwDtjVC