
Revision history

initial version

Maybe you can get the user's coordinates (relative to the sensor) by using the transform from openni_depth_frame to one of the skeleton frames published by openni_tracker, such as head_1, left_hand_1, and so on. From those coordinates, the X value gives you the distance from the user to the robot, and the Y value tells you whether the user is to the left or the right of the robot. You can then make the robot turn left or right, or speed up or slow down, based on that information (see the sketch below).
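
Here is a minimal, untested sketch of that idea as a simple proportional follower in Python using tf. The frame names /openni_depth_frame and /torso_1, the cmd_vel topic, the target distance, and the gains are all assumptions that you will likely need to adapt to your own setup:

    #!/usr/bin/env python
    # Sketch only: follow a tracked user with the tf frames published by
    # openni_tracker. Frame/topic names and gains are assumptions.
    import rospy
    import tf
    from geometry_msgs.msg import Twist

    def main():
        rospy.init_node('follow_user')
        listener = tf.TransformListener()
        cmd_pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
        rate = rospy.Rate(10)

        target_distance = 1.5  # desired distance to the user, in meters

        while not rospy.is_shutdown():
            try:
                # Translation of the user's torso relative to the sensor.
                (trans, rot) = listener.lookupTransform(
                    '/openni_depth_frame', '/torso_1', rospy.Time(0))
            except (tf.LookupException, tf.ConnectivityException,
                    tf.ExtrapolationException):
                rate.sleep()
                continue

            x, y = trans[0], trans[1]  # x: distance ahead, y: left/right offset

            cmd = Twist()
            # Drive faster the farther the user is beyond the target distance.
            cmd.linear.x = 0.5 * (x - target_distance)
            # Turn toward the user: positive y means the user is to the left.
            cmd.angular.z = 1.0 * y
            cmd_pub.publish(cmd)
            rate.sleep()

    if __name__ == '__main__':
        main()

The 0.5 and 1.0 gains are just proportional constants for the distance and heading errors; tune them (and add velocity limits) for your robot, and check which user ID suffix (torso_1, torso_2, ...) your tracker is actually publishing.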