
make a map error with Asus Xtion Pro Live

asked 2014-06-18 04:38:41 -0600 by guigui

Hi, I have a problem when I build a map with my Asus Xtion: http://zupimages.net/up/14/25/l08m.png

When I drive my robot around a classroom (the corridor forms a square), the map goes wrong: instead of closing the loop, the robot draws a second, offset path. What is the problem? The gyro? The odometry? My Asus?

Thanks for your help.


3 Answers


answered 2014-06-19 23:52:09 -0600 by fergs

I presume you are using gmapping to build the map, right? If so, you're seeing the classic "hall shortening" problem. gmapping was designed for long-range lasers (30m rated, often getting returns out to 80m), and so it assumes that the laser knows better than the odometry (which would be the case if you had a 30m laser). Unfortunately, with your 4m "laser" and a 35m hallway, you never see the end of the hallway, so on each update of the gmapping filter the scan matcher decides that the most likely solution is that you have gone nowhere.
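A toy sketch of that degeneracy (made-up hallway geometry and a naive sum-of-squares matcher, not the actual gmapping scan matcher): with a 4m range limit in an effectively endless corridor, the scan looks identical before and after the robot moves, so "I haven't moved" explains the data just as well as the true motion.

```python
import math

MAX_RANGE = 4.0         # Xtion-like depth limit, metres (assumed)
HALL_HALF_WIDTH = 1.25  # half the corridor width, metres (assumed)

def scan(robot_x, n_beams=181):
    """Ranges to the side walls of an infinite hallway, clipped to MAX_RANGE.

    robot_x drops out entirely: an infinite hallway looks the same from
    every position along it, which is exactly the degeneracy at issue.
    """
    ranges = []
    for i in range(n_beams):
        a = -math.pi / 2 + math.pi * i / (n_beams - 1)  # beam angle
        s = abs(math.sin(a))
        if s < 1e-9:                      # beam parallel to the walls:
            ranges.append(MAX_RANGE)      # the end of the hall is out of range
        else:
            ranges.append(min(HALL_HALF_WIDTH / s, MAX_RANGE))
    return ranges

def mismatch(scan_a, scan_b):
    """Sum of squared range differences; 0 means a perfect match."""
    return sum((a - b) ** 2 for a, b in zip(scan_a, scan_b))

before = scan(0.0)
after = scan(1.0)   # the robot really moved 1 m down the hall
print(mismatch(before, after))   # zero penalty for assuming no motion at all
```

With a 30m sensor the end wall would appear in both scans and the zero-motion hypothesis would score badly, which is why the long-range laser fixes the map.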

The 2008 paper How To Learn Accurate Grid Maps With a Humanoid describes the "hall shortening" problem in detail, with pictures that explain it far better than I have here. It also proposes a solution in which the scan matcher only uses points that could have been previously seen (points falling in unknown territory are discounted, which keeps the filter from deciding that you have not moved when in fact you have). That enhancement is not currently part of the ROS version of gmapping.

To confirm that your odometry is mostly OK, pull up the robot in RViz and watch it as you drive along, using first the odom and then the map frame as your fixed frame: you will probably see that motion is "kinda correct" in odom and "totally wrong" in map.


Comments

Thanks a lot for your reply! I tested with the 30m laser and it is much better: the map is correct and the accuracy is very good. Thanks!

guigui (2014-06-25 23:55:29 -0600)

In this case, mark this answer as "correct" and the question as closed :) .

ccapriotti (2014-06-26 00:18:22 -0600)
answered 2014-06-18 20:44:53 -0600 by guigui (updated 2014-06-18 20:51:55 -0600)

Yes, I drive my robot around the corridor of my classroom. The corridor forms a square, but as you can see in the picture, the walls do not come out straight and the map does not join back up after a full lap.

  • The dimensions of the room are 35m × 22m, and the corridor is 2m to 3m wide. My robot drives in the middle of the corridor.
  • I think the wall texture is laminated stickers, and the colours are orange and white.
  • For the odometry, I don't know how the data is collected (I'm on an internship and the project was already started; my part is to install the lidar and the Kinect for mapping and autonomous navigation). From what I have seen in the Kobuki guide, the odometry is probably measured on one wheel, but I'm not sure!
  • I tried to build a map with the Kinect and with the lidar, and then compared the results. I think the big problem is the odometry, because I have the same problem with the lidar. When I tried to calibrate my odometry (I posted another question about this earlier), the command did not work. I then tried hector_slam to work without odometry, but, I don't know why, I cannot build a map because the data end up superimposed.
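One simple way to test the "odometry is the problem" suspicion is a loop-closure check: drive the robot once around the square corridor back to its starting point, dead-reckon the reported motion increments, and see how far the integrated end pose is from the start. A sketch with made-up numbers (a 10m square and an invented 5% turn under-rotation, not real Kobuki data):

```python
import math

def integrate(increments):
    """Dead-reckon (forward distance, heading change) pairs into (x, y, theta)."""
    x = y = theta = 0.0
    for d, dth in increments:
        theta += dth
        x += d * math.cos(theta)
        y += d * math.sin(theta)
    return x, y, theta

# Ideal square loop: four 10 m legs with a 90-degree left turn after each.
leg = [(1.0, 0.0)] * 10
turn = [(0.0, math.pi / 2)]
perfect = (leg + turn) * 4

# Same loop with a 5% systematic under-rotation on turns (made-up error model).
biased = [(d, dth * 0.95) for d, dth in perfect]

x, y, _ = integrate(perfect)
print(round(math.hypot(x, y), 3))   # ~0.0: the loop closes

x, y, _ = integrate(biased)
print(round(math.hypot(x, y), 3))   # metres of closure error from odometry alone
```

If the real robot shows a large closure error like the biased case, the odometry (or gyro) needs calibrating before blaming the sensor or the SLAM package.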

Can you help me? I don't really know where to look to solve this problem; what should I do?

Thanks a lot for your fast and very interesting reply; I had not asked myself all these questions!


Comments

Well, I think you've answered your own question here. You have several potential points of failure, and you suspect the odometry. You have to isolate each of those variables and test them individually. Start with the odometry, or test the range finder statically.

ccapriotti (2014-06-19 08:07:35 -0600)
answered 2014-06-18 07:48:56 -0600 by ccapriotti (updated 2014-06-18 09:45:47 -0600)

I am really sorry to answer a question with another bunch of questions, but I think we need some more info.

First, you state that your classroom is square, but the map captured by your Asus is not. Is that right?

  • How big is the room? What are its dimensions?
  • How far, on average, is the robot from the walls when you capture and process the readings? Remember that Kinect-like devices have a limited operating range (roughly 1.2m to 3.5m), and the farther the object, the greater the error in your reading.

Now, from articles around the net, the Kinect technology projects IR "dots" from an IR emitter and reads them back with an IR camera: if the dots appear small, the object is close; if they appear large, the object is far. BUT this is based on light reflection, so the colour, texture, and material of the wall can interfere a lot. Can you imagine the "damage" a cork wall could do to your readings? Or matte black, for that matter. High-gloss paint can be a disaster as well.
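On the range point above, a back-of-the-envelope calculation shows why the error grows so quickly with distance. Structured-light depth comes from triangulation (z = f·b/d), so a fixed disparity quantization step turns into a depth error growing with the square of the range. The focal length, baseline, and quantization step below are rough Kinect-like guesses, not datasheet values:

```python
F = 580.0        # focal length in pixels (assumed)
B = 0.075        # emitter-to-camera baseline in metres (assumed)
DELTA_D = 1 / 8  # disparity quantization step in pixels (assumed)

def depth_error(z):
    """Approximate depth uncertainty (m) at range z (m): dz ~ z^2 * delta_d / (f*b)."""
    return z * z * DELTA_D / (F * B)

for z in (1.0, 2.0, 3.5, 4.0):
    print(f"{z:.1f} m -> +/- {depth_error(z) * 100:.1f} cm")
```

Under these assumed numbers the error at 3.5-4m is more than ten times the error at 1m, which is one more reason to keep the robot as close to the walls as practical while mapping.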

Now, it all depends on your implementation, of course.

It could also be odometry. How do you collect your odometry data? On a wheel? If your robot has three wheels and two of them are motor-driven, but you read odometry from only one (motor-driven) wheel, then whenever you turn, that wheel travels a different distance than the other. Additionally, there is wheel skidding, and encoder reading errors or rounding errors in the math that can snowball through your readings.
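The single-wheel problem can be made concrete with differential-drive kinematics (hypothetical robot geometry; the wheel separation below is a Kobuki-like guess): the distance travelled by the base centre is the mean of the two wheels' arc lengths, and the heading change comes from their difference, so one wheel alone misreports distance on a curve and cannot see rotation at all.

```python
import math

TRACK = 0.23  # wheel separation in metres (assumed, Kobuki-like)

def centre_motion(left_arc, right_arc):
    """Distance travelled by the base centre (m) and its heading change (rad)."""
    d = (left_arc + right_arc) / 2.0
    dtheta = (right_arc - left_arc) / TRACK
    return d, dtheta

# Quarter turn to the left around a 1 m radius (made-up manoeuvre):
radius = 1.0
left = (radius - TRACK / 2) * math.pi / 2   # inner wheel: shorter arc
right = (radius + TRACK / 2) * math.pi / 2  # outer wheel: longer arc

d, dtheta = centre_motion(left, right)
print(round(d, 3), round(math.degrees(dtheta), 1))  # true centre distance and 90-degree turn
print(round(right, 3))  # what "outer wheel only" would report as the distance
```

Reading only the outer wheel overestimates the distance and, with no second wheel to compare against, the 90-degree heading change is invisible without a gyro.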

So, here you have a nice set of theories. Let's see if they help.




Stats


Seen: 798 times

Last updated: Jun 19 '14