
How to access and use skeleton joint coordinates using OpenNI, ASUS Xtion, and ROS?

asked 2016-04-22 03:43:07 -0600

gokay

To develop a person-follower robot, I am using an ASUS Xtion and OpenNI. To obtain both the RGB image and the skeleton joints, I am using a skeleton tracker script. The tracker publishes the joints on "/tf", but I cannot use those joint coordinates in my script; I don't know how to access them. How can I access and use them in my script to make the robot move according to those coordinates? Thanks.


1 Answer


answered 2016-04-22 07:30:58 -0600

updated 2016-04-22 07:35:54 -0600

The information you are interested in, contained in the /tf data, will be transformations from some frame on the Xtion to some frame on the user (e.g. right_hand). What you need to do is write a node that creates a tf listener and periodically looks up this transform. There are tutorials on writing tf listeners in both Python and C++ on the ROS wiki. Once you have that transformation, it's up to you to decide what to do with it. For example, you could publish the relevant geometric information on a new topic that a robot control node subscribes to, or you could control the robot directly in the node with the tf listener.
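As a rough sketch of the second approach, here is a minimal rospy node that looks up the tracker's transform and turns it into velocity commands. The frame names (`/camera_depth_frame`, `/right_hand_1`) and the gains are assumptions, not something from your setup; run `rosrun tf tf_monitor` (or `rostopic echo /tf`) to see the frame names your tracker actually publishes, and note that the axis convention (x forward, y left) depends on which camera frame you look up from.

```python
#!/usr/bin/env python
# Hypothetical follower node: looks up the camera -> hand transform on /tf
# and publishes Twist commands. Frame names and gains are assumptions;
# adjust them to match the frames your skeleton tracker actually publishes.
import math


def compute_cmd(x, y, target_dist=1.0, k_lin=0.5, k_ang=1.0):
    """Map the person's position (x forward, y left, metres, in the camera
    frame) to (linear, angular) velocities for a simple proportional follower."""
    dist = math.sqrt(x * x + y * y)
    linear = k_lin * (dist - target_dist)  # keep a fixed following distance
    angular = k_ang * math.atan2(y, x)     # turn toward the person
    return linear, angular


def main():
    import rospy
    import tf
    from geometry_msgs.msg import Twist

    rospy.init_node('person_follower')
    listener = tf.TransformListener()
    cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10.0)
    while not rospy.is_shutdown():
        try:
            # rospy.Time(0) asks for the latest available transform.
            (trans, rot) = listener.lookupTransform(
                '/camera_depth_frame', '/right_hand_1', rospy.Time(0))
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            rate.sleep()
            continue
        cmd = Twist()
        cmd.linear.x, cmd.angular.z = compute_cmd(trans[0], trans[1])
        cmd_pub.publish(cmd)
        rate.sleep()


if __name__ == '__main__':
    main()
```

Keeping the distance-to-velocity mapping in a plain function like `compute_cmd` also makes it easy to test the follower logic without a robot or ROS running.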


The above links weren't working (seems like a bug in Askbot or the Askbot config); they worked in my preview window. Fixed the links using the answer from @joq.




Seen: 2,688 times

Last updated: Apr 22 '16