
How to access and use skeleton joint coordinates using OpenNI, ASUS Xtion, and ROS?

asked 2016-04-22 03:43:07 -0500

gokay

To develop a person-follower robot, I am using an ASUS Xtion and OpenNI. To obtain both the RGB image and the skeleton joints, I am using a skeleton tracker script ( https://github.com/Chaos84/skeleton_t... ). The tracker publishes the joints on "/tf", but I cannot use those joint coordinates in my script; I don't know how to access them. How can I access them and use them to make the robot move according to those coordinates? Thanks.


1 Answer


answered 2016-04-22 07:30:58 -0500

updated 2016-04-22 07:35:54 -0500

The information you are interested in, contained in the /tf data, will be transformations from some frame on the sensor to some frame on the user (e.g. right_hand). What you need to do is write a node that creates a tf listener and periodically looks up this transform data. Here are the tutorials on tf listeners in Python and C++. Once you have that transformation information, it's up to you to decide what to do with it. For example, you could publish the relevant geometric information on a new topic that a robot control node subscribes to, or you could control the robot directly in the node with the tf listener.
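As a minimal sketch of that second option, here is a node with a tf listener that looks up the user's torso transform and turns it into a velocity command. The frame names ("openni_depth_frame", "torso_1"), the "cmd_vel" topic, and the gains are assumptions for illustration; check the frames your tracker actually broadcasts with `rosrun tf view_frames` or `rosrun tf tf_echo` before using them.

```python
#!/usr/bin/env python
import math

def follow_cmd(x, y, target_dist=1.0, k_lin=0.5, k_ang=1.0):
    """Proportional control law (gains are illustrative assumptions):
    drive forward/backward to hold target_dist to the person, and
    rotate toward the person's lateral offset.
    Returns (linear_x, angular_z)."""
    linear_x = k_lin * (x - target_dist)
    angular_z = k_ang * math.atan2(y, x)
    return linear_x, angular_z

def follower_node():
    # ROS-specific part: a tf listener that periodically looks up the
    # sensor-to-torso transform and publishes a Twist.
    import rospy
    import tf
    from geometry_msgs.msg import Twist

    rospy.init_node('person_follower')
    listener = tf.TransformListener()
    cmd_pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10.0)
    while not rospy.is_shutdown():
        try:
            # "torso_1" assumes the tracker appends a user id to the
            # joint name; verify with `rosrun tf tf_echo`.
            (trans, rot) = listener.lookupTransform(
                'openni_depth_frame', 'torso_1', rospy.Time(0))
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            rate.sleep()
            continue
        cmd = Twist()
        cmd.linear.x, cmd.angular.z = follow_cmd(trans[0], trans[1])
        cmd_pub.publish(cmd)
        rate.sleep()

if __name__ == '__main__':
    follower_node()
```

Passing `rospy.Time(0)` asks the listener for the latest available transform rather than the transform at the current time, which avoids extrapolation errors while the tracker is still publishing.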

EDIT

The above links aren't working (it seems like a bug in Askbot or the Askbot configuration); they work in my preview window. Here are the links:

Python: http://wiki.ros.org/tf/Tutorials/Writ...

C++: http://wiki.ros.org/tf/Tutorials/Writ...

EDIT 2

Fixed links using answer from @joq here.
