
People perception modules: detectors and trackers

asked 2014-08-05 06:02:28 -0600

Chrissi

Hello,

I am working on human-aware navigation and human-robot spatial interaction and am therefore looking into different people detectors. In our project we use a ROSyfied version of this detector: www.vision.rwth-aachen.de/publication... It is based on RGB-D data, gives quite reasonable results, and will be made public soon. However, to enhance this detection I am currently looking for other people detection methods (like laser-based leg detection) which I can combine with our tracker. Since detection and tracking are not really part of my work, I am looking for existing ROS packages that can easily be installed and trained. Our set-up runs ROS Hydro on Ubuntu 12.04, using a SICK S300 laser and an ASUS Xtion.

So far I have looked at David Lu's fork of the people_experimental stack: github.com/DLu/people. Apart from some compilation errors that are easy to fix, it works out of the box but gives very bad results (it only detects legs at distances <2 m), which I think is because of the poor resolution of our laser (3 cm in distance). Due to the non-existent, or very well hidden, documentation I have no idea how to retrain it with data collected from our robot; any help on this would be greatly appreciated. Almost all of the other perception algorithms I found, or which are mentioned on this site, are either not catkinized or not available as a Hydro package. My main question to the community is therefore: what other people detectors are available for Hydro?
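For anyone reading along, this is a minimal launch-file sketch for running the leg_detector node from that stack; the scan topic, fixed frame, and the trained-classifier file name are assumptions based on the usual package layout and may differ in your checkout:

```xml
<launch>
  <!-- leg_detector takes the path to a trained classifier as its first
       argument; the YAML file name below is assumed from the package layout -->
  <node pkg="leg_detector" type="leg_detector" name="leg_detector"
        args="scan:=base_scan $(find leg_detector)/config/trained_leg_detector.yaml"
        output="screen">
    <!-- frame in which detections are tracked; "odom" is an assumption,
         use whatever fixed frame your robot publishes -->
    <param name="fixed_frame" value="odom"/>
  </node>
</launch>
```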

Any hints and suggestions would be greatly appreciated.

I am sorry if this question has been asked already. I could only find one similar question, which exclusively listed things that do not seem to exist for Hydro. If there is a similar thread, I would appreciate it if you could refer me to it.

Cheers, Christian

P.S.: Apparently my karma is insufficient to publish links. Sorry for the workaround.


Comments

FYI, the leg detector from that fork has been merged into the main people repo, and is now available in the debs in hydro.

Dan Lazewatsky ( 2014-08-05 17:33:33 -0600 )

Thanks Dan, I tried that as well, but some of the launch files don't work (wrong paths to config files and included launch files), so I decided to check it out from GitHub because that makes it easier to mend, imo. Do you know if there is more detailed documentation on the usage of the leg_detector than this: http://wiki.ros.org/leg_detector?distro=hydro ?

Chrissi ( 2014-08-06 04:02:32 -0600 )

If something isn't working, please submit a ticket in the issue tracker so we can get it fixed. I'm not aware of any more detailed documentation, but @David Lu might be able to help.

Dan Lazewatsky ( 2014-08-06 09:26:36 -0600 )

Don't get me wrong, the leg_detector itself works fine. It is just some of the other components, as discussed here: http://answers.ros.org/question/78026/problem-with-leg_detector-and-people_tracking_filter/

Chrissi ( 2014-08-06 10:00:14 -0600 )

You said some of the launch files don't work; I was referring to that.

Dan Lazewatsky ( 2014-08-06 10:30:17 -0600 )

Ah, OK. Sorry for the confusion. Never used the issue tracker. A link would be nice. Thanks.

Chrissi ( 2014-08-06 11:33:01 -0600 )

https://github.com/wg-perception/people/issues/new

Dan Lazewatsky ( 2014-08-06 12:15:34 -0600 )

4 Answers


answered 2015-04-10 05:59:25 -0600

timm

updated 2015-04-10 06:04:18 -0600

https://github.com/spencer-project/sp...

This GitHub repository contains people and group detection and tracking components for 2D laser, camera, and RGB-D data, developed during the EU FP7 project SPENCER. As the project is still ongoing, new detection and tracking modules and documentation will continue to be added over the next 12 months.

Our laser detector (a reimplementation of the boosted classifier with laser segment features from Arras et al., ICRA'07) is trained on data from an LMS 200 and LMS 500 mounted at around 70 cm above the ground. It works at ranges of up to 15-20 meters, though precision drops at larger distances.

In general, detection results are of course much better when visual data is available (especially in complex environments). We have also integrated the upper-body RGB-D and groundHOG detectors from RWTH Aachen by Jafari et al., mentioned in the original question, as well as the RGB-D people detector from PCL.

All components are tested on ROS Hydro and Indigo. There is also a set of reusable RViz plugins for visualizing the outputs of the perception pipeline.


answered 2014-08-06 18:09:50 -0600

paulbovbel

http://pointclouds.org/documentation/...

I have used this; it works quite well for basic detection. PCL also has a GPU (CUDA) based person tracking module, but I haven't played around with that.


answered 2014-08-06 14:39:44 -0600

David Lu

Unfortunately, I know of no way to retrain the algorithm. I am not the author of the leg_detector, and from what I understand, the original data with which the algorithm was trained has been lost to the ages. I am happy to help where I can.


Comments

Thank you for the offer. I might come back to that.

Chrissi ( 2014-08-07 05:24:06 -0600 )

answered 2014-08-05 17:10:26 -0600

ahubers

http://wiki.ros.org/openni_tracker I haven't tried this out personally, but this package is available in Hydro and fits the bill.


Comments

Thank you for the tip! The problem with this tracker is that you have to assume the infamous psi pose in order to calibrate it and start tracking. I used this in previous projects (not ROS based), but sadly it is not very well suited for real-world deployment of a robot, because people would have to calibrate the tracker before interacting with it. I will have a look at openni2_tracker though, which, rumor has it, does not need this pose for calibration.

Chrissi ( 2014-08-06 04:08:19 -0600 )



Stats

Asked: 2014-08-05 05:56:26 -0600

Seen: 1,947 times

Last updated: Apr 10 '15