To develop a person-follower robot, I am using an ASUS Xtion and OpenNI. To obtain both the RGB image and the skeleton joints, I am using a skeleton tracker script (https://github.com/Chaos84/skeleton_tracker). The tracker publishes the joints on "/tf", but I cannot use those joint coordinates in my script because I don't know how to access them. How can I access and use them in my script to make the robot move according to those coordinates? Thanks.
2 Answers
1
To get joint coordinates and angles from the /tf topic, you need to write a tf listener, which is explained in this link.
You can also look at one of my ROS packages where I wrote a tf listener using OpenNI and an ASUS Xtion. Here is the link.
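As a rough illustration of that approach, here is a minimal Python sketch of a tf listener that looks up a skeleton joint and turns it into a velocity command. The frame names ("camera_depth_frame", "torso_1"), the "cmd_vel" topic, and the controller gains are assumptions for the example; check what your tracker actually publishes (e.g. with `rosrun tf tf_monitor` or `rostopic echo /tf`) and substitute the real names.

```python
#!/usr/bin/env python
import rospy
import tf
from geometry_msgs.msg import Twist

# Assumed frame names -- replace with the frames your skeleton tracker publishes.
FIXED_FRAME = 'camera_depth_frame'
TORSO_FRAME = 'torso_1'

def follower():
    rospy.init_node('person_follower')
    listener = tf.TransformListener()
    cmd_pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)  # assumed topic name
    rate = rospy.Rate(10.0)

    while not rospy.is_shutdown():
        try:
            # Position (x, y, z) of the torso joint expressed in the fixed frame.
            (trans, rot) = listener.lookupTransform(FIXED_FRAME, TORSO_FRAME,
                                                    rospy.Time(0))
        except (tf.LookupException, tf.ConnectivityException,
                tf.ExtrapolationException):
            rate.sleep()
            continue

        # Toy proportional controller: assumes the fixed frame follows ROS
        # conventions (x forward, y left). Keep ~1.5 m distance and turn
        # toward the person. Gains are illustrative only.
        cmd = Twist()
        cmd.linear.x = 0.5 * (trans[0] - 1.5)
        cmd.angular.z = 1.0 * trans[1]
        cmd_pub.publish(cmd)
        rate.sleep()

if __name__ == '__main__':
    follower()
```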

robowolf
Thank you for your answer. In fact, I had looked at that before I posted, but the frame names were different from the ones in the documentation. After I found the right names, I was able to accomplish it using the method in the link you gave. Thanks! – Gokay Apr 28 '16 at 09:24
0
You can use another skeleton detector/tracker, BodySkeletonTracker:
https://github.com/derzu/BodySkeletonTracker
Here is how it works:
You can get the joint points by getting an object of the SkeletonPoints class.

Derzu