Particle Filter for multiple target tracking
Hi! I'm currently working on a human-aware navigation problem for which I need a good people tracker fed by several sensors, both onboard and offboard. I already have a pretty good detector that gives me a bounding box, from which I can recover the person's position in the world frame.
From this point on I need to implement a tracker that takes this position as input at every new detector frame. I have already used the Kalman filter from OpenCV. After trying it, I thought it might be more interesting to use a particle filter instead, because it could make the data association across different sensors easier if each sensor contributes its own particles. Am I right? As for the particle filter, almost all the implementations I checked use color features rather than positions, meaning the tracking is done on the image itself.
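To make it concrete, the kind of position-based particle filter I have in mind looks roughly like this (a single-target bootstrap filter with made-up parameters; detections from each sensor would just be applied as additional weighting steps):

    import numpy as np

    rng = np.random.default_rng(0)
    N = 500
    particles = rng.normal([0.0, 0.0], 1.0, size=(N, 2))  # hypotheses for [x, y]
    weights = np.full(N, 1.0 / N)

    def predict(particles, motion_std=0.1):
        # Random-walk motion model: diffuse the particles between frames.
        return particles + rng.normal(0.0, motion_std, size=particles.shape)

    def update(weights, particles, z, meas_std=0.3):
        # Re-weight each particle by the likelihood of the measured position z.
        d2 = np.sum((particles - z) ** 2, axis=1)
        w = weights * np.exp(-0.5 * d2 / meas_std**2)
        return w / w.sum()

    def resample(particles, weights):
        # Multinomial resampling to avoid weight degeneracy.
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], np.full(len(particles), 1.0 / len(particles))

    for z in [np.array([1.0, 2.0]), np.array([1.1, 2.1])]:  # detections (any sensor)
        particles = predict(particles)
        weights = update(weights, particles, z)
        particles, weights = resample(particles, weights)
        print(weights @ particles)  # weighted mean = position estimate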
Can anyone share opinions on Kalman vs. particle filters for this problem? And does anyone know of an implementation that fits what I'm looking for?
Thanks in advance.
Asked by xlightx on 2016-07-26 11:56:00 UTC
Answers
Not really helping with the choice, but I implemented something like that: https://github.com/strands-project/strands_perception_people which is described here: http://eprints.lincoln.ac.uk/17545/1/dondrup.pdf and you can find a video here: https://www.youtube.com/watch?v=zdnvhQU1YNo (the video comes with subtitles in case you don't like our robot's voice ;) )
What it does is take different detectors, fuse them using Nearest Neighbour Joint Probabilistic Data Association, and track the targets with a Kalman filter using a constant-velocity model. Detectors can be added just by listing them in a config file. They need to publish a PoseArray; if they don't, there is a tool that can turn almost any kind of topic into a PoseArray, as long as it publishes a message with a header and x, y, z in it.
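For illustration, a minimal relay of that kind could look like the following sketch; the topic names and the PointStamped input type are assumptions, not the actual strands tool:

    #!/usr/bin/env python
    # Illustrative sketch only: republish single detections as the
    # geometry_msgs/PoseArray the tracker expects. Topic names and the
    # PointStamped input are assumptions, not the real strands tool.
    import rospy
    from geometry_msgs.msg import PoseArray, Pose, PointStamped

    def callback(msg):
        out = PoseArray()
        out.header = msg.header      # keep the frame_id and timestamp
        pose = Pose()
        pose.position = msg.point    # copy x, y, z
        pose.orientation.w = 1.0     # identity orientation
        out.poses.append(pose)
        pub.publish(out)

    rospy.init_node('detection_to_posearray')
    pub = rospy.Publisher('/people_detections/pose_array', PoseArray, queue_size=10)
    rospy.Subscriber('/my_detector/position', PointStamped, callback)
    rospy.spin()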
The repo also contains two detectors. If you are just interested in the tracker, have a look at: https://github.com/strands-project/strands_perception_people/tree/indigo-devel/bayes_people_tracker
Also, this is available as Debian packages from our project's PPA server for Indigo under Ubuntu 14.04 64-bit. Instructions on how to set that up can be found here: http://lncn.eu/strands ; afterwards it can be installed via apt-get install ros-indigo-bayes-people-tracker.
Let me know if you want to give it a try and run into problems.
Asked by Chrissi on 2016-07-27 13:53:13 UTC
Comments
Thank you very much for your answer! I'm going to try this :) I'll give you some feedback later!
Asked by xlightx on 2016-07-29 04:21:16 UTC
Hello. Would it be possible to get your e-mail? I ran into some problems :)
Asked by xlightx on 2016-08-01 10:21:34 UTC
I think it's easiest if you just open an issue in the repository. I or one of the other maintainers will answer there.
Asked by Chrissi on 2016-08-01 11:02:12 UTC
Hello, do you know why they aren't just using a linear Kalman filter instead of the non-linear filters? It seems to me that they are using a simple linear motion model. Also, they don't take the relative motion of the robot into account, which would justify a non-linear approach.
Asked by stuggibo on 2019-04-18 05:54:38 UTC
The tracker used is more general purpose than just tracking 2D motion. I agree that for this task a simple Kalman filter would suffice. The extended and unscented ones do the same thing with sufficient speed given linear models, so I honestly never bothered to add a simple Kalman filter (see the sketch below). The robot motion is not taken into account because the tracking is done in the world coordinate frame.
Asked by Chrissi on 2019-04-18 06:24:48 UTC
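To illustrate the point about linear models: for a constant-velocity model the transition function f(x) = Fx is linear, so the Jacobian an EKF would linearize with is just the constant matrix F, and the prediction step is identical to a plain Kalman filter's. A generic NumPy sketch with assumed noise values (not code from bayes_people_tracker):

    import numpy as np

    dt = 0.1                              # time step between detector frames [s]
    F = np.array([[1, 0, dt, 0],          # constant-velocity transition,
                  [0, 1, 0, dt],          # state = [x, y, vx, vy]
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]])
    Q = 0.01 * np.eye(4)                  # process noise (assumed value)

    x = np.array([0.0, 0.0, 1.0, 0.0])    # person walking along x at 1 m/s
    P = np.eye(4)

    # Plain Kalman prediction. An EKF would use the Jacobian of f(x) = F @ x,
    # which for this linear model is F itself, so the equations coincide.
    x_pred = F @ x                        # -> [0.1, 0.0, 1.0, 0.0]
    P_pred = F @ P @ F.T + Q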
Thanks for the quick reply. Maybe the observation model is non-linear? I haven't found it yet.
Asked by stuggibo on 2019-04-18 06:28:04 UTC
It's a simple Cartesian model (see the sketch below). The noise is defined on a per-detector basis here: https://github.com/strands-project/strands_perception_people/blob/kinetic-devel/bayes_people_tracker/config/detectors.yaml
Asked by Chrissi on 2019-04-18 06:35:03 UTC
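For what it's worth, a Cartesian position-only observation model is linear too: the measurement matrix simply selects x and y from the state, and only the noise differs per detector. A sketch with made-up noise values (the real ones live in detectors.yaml):

    import numpy as np

    H = np.array([[1, 0, 0, 0],          # measure x and y directly from the
                  [0, 1, 0, 0]])         # state [x, y, vx, vy]
    R_leg_detector = np.diag([0.05, 0.05])   # assumed noise per detector [m^2]
    R_upper_body = np.diag([0.20, 0.20])

    z = np.array([2.0, 1.5])             # a detection in the world frame
    x = np.array([1.9, 1.4, 1.0, 0.0])   # current state estimate
    innovation = z - H @ x               # linear, so a plain KF suffices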
Here they are considering a bearing angle in line 227. That would make the model non-linear. But I may be wrong, as I haven't figured out how the library is structured.
Do you know if there is any official paper or documentation for this? The only thing I found is https://ieeexplore.ieee.org/abstract/document/7487766 , where they say that they rely on the world coordinate frame, so they don't need to take the robot's motion into account. But they don't give any more details.
Asked by stuggibo on 2019-04-18 06:43:43 UTC
This is the paper describing the underlying tracker: N. Bellotto and H. Hu, "Computationally efficient solutions for tracking people with a mobile robot: an experimental evaluation of Bayesian filters," Autonomous Robots, vol. 28, no. 4, pp. 425–438, 2010.
Asked by Chrissi on 2019-04-18 07:05:12 UTC
You were right! It's all linear.
Asked by stuggibo on 2019-04-18 07:09:10 UTC