
Particle Filter for multiple target tracking

asked 2016-07-26 11:56:00 -0600

xlightx

Hi! I'm working on a human-aware navigation problem in which I need a good people tracker fed by several sensors, both onboard and offboard. I already have a fairly good detector that gives me a bounding box, from which I can recover the person's position in the world frame.

From this point on I need to implement a tracker that takes this position as input on each new detector frame. I have already used the Kalman filter from OpenCV. After that, I thought it might be more interesting to use a particle filter, because it could make data association across different sensors easier if each sensor contributes particles. Am I right? Almost all the particle filter implementations I checked track colour features rather than positions, meaning the tracking is done on the image.
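For comparison, a bootstrap particle filter over 2D world-frame positions is only a few lines. The sketch below is a minimal illustration in plain Python (function names and noise values are made up, not from any library mentioned here); measurements from several sensors could each be applied as a separate update step.

```python
import math
import random

def pf_step(particles, measurement, motion_std=0.2, meas_std=0.5):
    """One predict/update/resample cycle of a bootstrap particle filter
    tracking a 2D position from (x, y) position measurements."""
    # Predict: diffuse each particle with a simple random-walk motion model.
    predicted = [(x + random.gauss(0, motion_std),
                  y + random.gauss(0, motion_std)) for x, y in particles]
    # Update: weight particles by the Gaussian likelihood of the measurement.
    mx, my = measurement
    weights = [math.exp(-((x - mx) ** 2 + (y - my) ** 2) / (2 * meas_std ** 2))
               for x, y in predicted]
    if sum(weights) == 0:          # degenerate case: measurement far from cloud
        weights = [1.0] * len(predicted)
    # Resample proportionally to the weights (weights implicitly reset to 1/N).
    return random.choices(predicted, weights=weights, k=len(predicted))

def pf_estimate(particles):
    """Mean of the particle cloud as the state estimate."""
    n = len(particles)
    return (sum(p[0] for p in particles) / n,
            sum(p[1] for p in particles) / n)
```

After a handful of updates the cloud concentrates around the measured position, and the mean serves as the track estimate.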

Can anyone help me with opinions on Kalman vs. particle filters for this problem? And does anyone know of an implementation that fits what I'm looking for?

Thanks in advance.


1 Answer


answered 2016-07-27 13:53:13 -0600

Chrissi

updated 2016-07-27 13:55:28 -0600

Not really helping with the choice, but I implemented something like that: it is described here: and you can find a video here: (the video comes with subtitles in case you don't like our robot's voice ;) )

What it does is take different detectors, fuse them using Nearest Neighbour Joint Probabilistic Data Association, and track the targets with a Kalman filter using a constant-velocity model. Detectors can be added simply by listing them in a config file. They need to publish a PoseArray; if they don't, there is a tool that can turn any kind of topic into a PoseArray, as long as its message has a header and x, y, z fields.
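As a rough illustration of the constant-velocity model mentioned above (not the actual code from the repository; the function name and noise values are made up), one predict/update cycle of a 1D constant-velocity Kalman filter can be written out explicitly:

```python
def cv_kalman_step(state, P, z, dt=0.1, q=0.1, r=0.25):
    """One predict/update cycle of a 1D constant-velocity Kalman filter.
    state = [position, velocity]; P = 2x2 covariance; z = position measurement."""
    p, v = state
    # Predict with x' = F x, where F = [[1, dt], [0, 1]] (constant velocity).
    p_pred = p + dt * v
    v_pred = v
    # Predicted covariance: P' = F P F^T + Q, with diagonal process noise Q.
    p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q
    p01 = P[0][1] + dt * P[1][1]
    p10 = P[1][0] + dt * P[1][1]
    p11 = P[1][1] + q
    # Update with a position-only measurement, H = [1, 0].
    y = z - p_pred                 # innovation
    S = p00 + r                    # innovation covariance
    k0, k1 = p00 / S, p10 / S      # Kalman gain
    p_new = p_pred + k0 * y
    v_new = v_pred + k1 * y
    # Covariance update: P = (I - K H) P'
    P_new = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return [p_new, v_new], P_new
```

Fed position measurements of a target moving at constant speed, the velocity estimate converges to the true speed even though velocity itself is never measured, which is exactly what the constant-velocity model buys you.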

The whole repo also contains two detectors. If you are just interested in the tracker, have a look at:

Also, this is available as Debian packages from our project's PPA server for Indigo under Ubuntu 14.04 64-bit. Instructions on how to set that up can be found here: afterwards it can be installed via apt-get install ros-indigo-bayes-people-tracker

Let me know if you give it a try and run into problems.



Thank you very much for your answer! I'm going to try this :) I'll give you some feedback later!

xlightx (2016-07-29 04:21:16 -0600)

Hello. Would it be possible to get your e-mail? I ran into some problems :)

xlightx (2016-08-01 10:21:34 -0600)

I think it's easiest if you just open an issue in the repository. I or one of the other maintainers will answer there.

Chrissi (2016-08-01 11:02:12 -0600)

Hello, do you know why they aren't just using a linear Kalman filter instead of the non-linear filters? It seems to me that they use a simple linear motion model. Also, they don't take the relative motion of the robot into account, which would justify a non-linear approach.

stuggibo (2019-04-18 05:54:38 -0600)

The tracker used is more general purpose than just tracking 2D motion. I agree that for this task a simple Kalman filter would suffice, but the extended and unscented ones do the same thing with sufficient speed given linear models, so I honestly never bothered to add a simple Kalman filter. The robot motion is not taken into account because the tracking is done in the world coordinate frame.

Chrissi (2019-04-18 06:24:48 -0600)

Thanks for the quick reply. Maybe the observation model is non-linear? I haven't found it yet.

stuggibo (2019-04-18 06:28:04 -0600)

It's a simple Cartesian model. The noise is defined here, on a per-detector basis:

Chrissi (2019-04-18 06:35:03 -0600)

Here they consider a bearing angle in line 227, which would make the model non-linear. But I may be wrong, as I haven't figured out how the library is structured yet.

Do you know if there is any official paper or documentation for this? The only thing I found is , where they say that they rely on the world coordinate frame, so they don't need to take the robot's motion into account. But they don't give any more details.

stuggibo (2019-04-18 06:43:43 -0600)
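On the bearing-angle point above: a bearing observation h(x) = atan2(dy, dx) is indeed non-linear in the target position, which is what an EKF would linearise via the Jacobian. A small sketch of that measurement model (hypothetical function names, not the library's code):

```python
import math

def bearing_measurement(target, sensor):
    """Predicted bearing from sensor to target: h(x) = atan2(dy, dx).
    Non-linear in the target position, hence the EKF/UKF machinery."""
    dx = target[0] - sensor[0]
    dy = target[1] - sensor[1]
    return math.atan2(dy, dx)

def bearing_jacobian(target, sensor):
    """Jacobian of h with respect to (x, y): [-dy/r^2, dx/r^2].
    This is the linearisation an EKF evaluates at the current estimate."""
    dx = target[0] - sensor[0]
    dy = target[1] - sensor[1]
    r2 = dx * dx + dy * dy
    return [-dy / r2, dx / r2]
```

With a purely Cartesian measurement (x, y measured directly), H is constant and the filter is linear; the bearing term is the part that would force an EKF/UKF.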
