
I would like to track people's paths with an Asus Xtion (later with several of them) looking down from the ceiling.

OpenNI's sample program UserTracker (which uses the User Generator node) would be perfect if it recognized bodies from a top view, but it doesn't. Since I don't need exact skeleton tracking, just to track basically any moving object in the scene, I guess there is an easier way than rewriting the User Generator's recognition, which looks tough.
Maybe the Scene Analyzer node would be suitable for this, but I don't know how to use it.

So does anybody have an idea where I should start, and which OpenNI classes I should use and how?
I have searched a lot about this and dug into OpenNI as much as I can, but I'm fairly new to it and it is quite deep.

Thanks, Ts.


3 Answers


OpenNI: To learn how to use OpenNI, take a look at the OpenNI Cookbook; if you don't want to buy the book, you can still use the source code from the book here.

OpenNI + OpenCV: If you want to use OpenNI and OpenCV together, you can follow this documentation. If you want a sample, you can use this source code.
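As a rough illustration of how the two fit together (a minimal sketch assuming OpenNI 1.x with a DepthGenerator and OpenCV 2.x; error handling and the XML config file are omitted), you can wrap the raw depth buffer in a cv::Mat without copying it:

```cpp
#include <XnCppWrapper.h>          // OpenNI 1.x C++ wrapper
#include <opencv2/opencv.hpp>

int main()
{
    xn::Context context;
    context.Init();                               // or InitFromXmlFile(...)

    xn::DepthGenerator depth;
    depth.Create(context);                        // create the depth node
    context.StartGeneratingAll();

    xn::DepthMetaData depthMD;
    for (;;)
    {
        context.WaitOneUpdateAll(depth);          // block until a new depth frame
        depth.GetMetaData(depthMD);

        // Wrap the 16-bit depth buffer (millimetres) in an OpenCV header
        // without copying; clone it if you need it past this iteration.
        cv::Mat depth16(depthMD.YRes(), depthMD.XRes(), CV_16UC1,
                        (void*)depthMD.Data());

        cv::Mat depth8;
        depth16.convertTo(depth8, CV_8UC1, 255.0 / 4096.0);  // just for display
        cv::imshow("depth", depth8);
        if (cv::waitKey(1) == 27) break;          // Esc to quit
    }

    context.Release();
    return 0;
}
```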

Labeling + Tracking: For labeling and tracking the objects (in your case, bodies) there are many implementations available online. For example, for labeling take a look at this thread, and for tracking you can use this example.

For both labeling and tracking together, you can use this one.
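If it helps, here is a rough sketch of the labeling + tracking idea in plain OpenCV, assuming you already have a binary foreground mask (e.g. from thresholding the depth image). The helper name, the minimum blob area, and the nearest-centroid matching with an 80-pixel jump limit are all assumptions for illustration, not what the linked code does:

```cpp
#include <opencv2/opencv.hpp>
#include <cmath>
#include <vector>

// One tracked blob: an ID and its last known centroid.
struct Track { int id; cv::Point2f centroid; };

// Label blobs in a binary mask and match them to existing tracks
// by nearest centroid (hypothetical helper, for illustration only).
void labelAndTrack(const cv::Mat& mask, std::vector<Track>& tracks, int& nextId)
{
    std::vector<std::vector<cv::Point> > contours;
    cv::Mat tmp = mask.clone();                       // findContours modifies its input
    cv::findContours(tmp, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);

    std::vector<Track> updated;
    for (size_t i = 0; i < contours.size(); ++i)
    {
        if (cv::contourArea(contours[i]) < 500.0) continue;   // ignore small noise blobs

        cv::Moments m = cv::moments(contours[i]);
        cv::Point2f c(m.m10 / m.m00, m.m01 / m.m00);          // blob centroid

        // Greedy nearest-centroid association with the previous frame.
        int best = -1;
        double bestDist = 80.0;                               // max jump in pixels (assumed)
        for (size_t t = 0; t < tracks.size(); ++t)
        {
            float dx = c.x - tracks[t].centroid.x;
            float dy = c.y - tracks[t].centroid.y;
            double d = std::sqrt(dx * dx + dy * dy);
            if (d < bestDist) { bestDist = d; best = (int)t; }
        }

        Track tr;
        tr.id = (best >= 0) ? tracks[best].id : nextId++;     // reuse ID or start a new one
        tr.centroid = c;
        updated.push_back(tr);
    }
    tracks.swap(updated);     // tracks not matched in this frame are simply dropped
}
```

Calling this once per depth frame keeps a list of blob IDs with their centroids, which is essentially the path information you are after; a real implementation would handle occlusions and merged blobs more carefully.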

– Ghassem Tofighi

OpenNI's UserTracker is not designed to detect bodies from a top view. It detects movement and then, based on a threshold on the depth values of the "moving blob" and some internal algorithms, classifies the moving object as a body, but not from a top view.

You can develop a similar algorithm yourself if you are sure that the objects moving beneath the camera are all human bodies. Set a threshold on the depth image, label the different blobs as different people, and track them. As an example, the problem could be solved as described here.
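As a very rough sketch of that thresholding step, assuming a 16-bit depth image in millimetres as in the OpenNI/OpenCV snippet above (the floor distance and the 1.0–2.3 m height band are made-up numbers you would calibrate for your own room):

```cpp
#include <opencv2/opencv.hpp>

// Build a binary mask of "things sticking up from the floor" in a
// top-down depth image. depth16 is CV_16UC1 in millimetres.
cv::Mat personMask(const cv::Mat& depth16)
{
    // Assumed geometry: camera roughly 2.8 m above the floor, so heads and
    // shoulders of standing people appear about 1.0-2.3 m from the camera.
    cv::Mat mask;
    cv::inRange(depth16, cv::Scalar(1000), cv::Scalar(2300), mask);

    // Clean up sensor noise and small speckles with a morphological opening.
    cv::Mat kernel = cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(7, 7));
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN, kernel);
    return mask;     // feed this into the labeling/tracking step
}
```

Pixels with no depth reading (value 0) fall outside the band and are excluded automatically; the resulting mask can go straight into a labeling routine like the one sketched in the other answer.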

– Ghassem Tofighi
  • Thank you for the article; actually I already had this idea and have set the threshold on the depth map, so I have individual blobs representing the people. Maybe from what I wrote it seemed like this was the harder part, but the problem is that I can't implement the next steps in OpenNI: I have the blobs, but I can't recognize and track them. It would be easy if I could tell the UserGenerator that they are users, but I don't know how to do that either. Thank you for your help! – Tsirke Oct 06 '15 at 08:15

You can also take a look at the OpenPTrack project. OpenPTrack is an open-source project launched in 2013 to create a scalable, multi-camera solution for person tracking. It enables many people to be tracked over large areas in real time and is designed for applications in education, arts, and culture, as a start.

– Ghassem Tofighi
  • Thank you, it looks awesome, but I cannot use a sensor that captures RGB images due to privacy rights (that's why I'm using an Xtion Pro), and RGB seems to be necessary for this project. Anyway, I have already implemented the tracking: I get the individual blobs with coordinates as they move, and it works great. The next step is to stitch multiple sensors' depth maps together to cover the whole area, and to implement some smaller functions such as counting crossings of a specified line. Your links have helped me a lot so far, so thank you again! – Tsirke Oct 29 '15 at 08:57