
Hokuyo Lidar: Sensor drift during rotation

asked 2020-01-02 04:44:29 -0500

molenzwiebel

updated 2020-01-02 04:53:47 -0500

Hey there. I'm running into a weird issue where I'm not sure whether to blame hardware or software (and if software, which component exactly). I have a simple differential-drive robot with a Hokuyo URG-04LX-UG01 lidar mounted on the front. The robot's odometry comes from fusing an onboard IMU with wheel-encoder readings and seems fairly accurate. The lidar is hooked up through urg_node, with a TF frame appropriately offset from base_link. In RViz I visualize the robot model in the odom frame and have added the corresponding LaserScan topic.

If the robot drives forwards or backwards, the LaserScan stays fairly fixed in place (confirmed by increasing the decay time of the topic in RViz). The same goes for constant rotating motion. However, when the robot first starts accelerating into a rotation, the laser scan picks up a considerable rotational drift in the same direction and keeps that drift until the robot stops moving. Once it stops, the scan recovers back to its original position.

I've captured this video that shows in more detail what I mean. As you can see, the scan rotates quite a bit more than the robot when it first starts moving, and it keeps that offset for the entirety of the rotation. Once the robot stops, the rotational drift recovers. At the end of the video I move for a bit longer, showing that the rotational drift stays constant even during a longer rotation (it only really drifts when the robot starts accelerating or decelerating). You can also see that the robot model (driven by the fused odometry) rotates at a constant speed, indicating that it is not the TF frame that is changing abruptly.

The same drift happens very, _very_ slightly when moving forwards/backwards (to the point where it is almost unnoticeable), but then obviously in the forward/backward direction. The drift is not an issue if the robot rotates very slowly.

Is this an inherent hardware issue (the robot's rotation interfering with the rotation of the mirror inside the lidar), or is it an odometry error on my end that only manifests during acceleration/deceleration? The drift ends up causing lots of issues with AMCL later on, which in turn makes navigation iffy.


Comments


I am having the same issue I think -> https://drive.google.com/drive/folder...

Did you find a solution in the end ?

NiamhD ( 2020-10-07 08:53:27 -0500 )

1 Answer


answered 2020-01-03 11:14:25 -0500

achille

updated 2020-01-03 11:19:21 -0500

This could be normal. You're fusing two noisy measurements with an algorithm that is likely sensitive to sudden changes, on top of possibly misaligned sensors. While your wheel odometry reports one thing, your accelerometer may be saying something else; the fusion isn't perfect, so you get an offset. This also explains why the laser scans 'reset' when the robot decelerates. Try accelerating quickly and decelerating slowly: the offset should persist. You can also gauge the quality of your odometry sources by checking the result using each raw sensor on its own. If one of the sensors exhibits similar behavior, you could try weighting the other sensor more.
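To see how imperfect fusion alone can reproduce this exact "offset while moving, recovery at rest" pattern, here's a minimal toy sketch. It is not your actual filter: the rates, the gain, and the assumption that the wheel-odometry heading lags one update behind the gyro are all made up for illustration.

```python
# Toy model: fuse an instantaneous gyro heading with a wheel-odometry
# heading that lags one update behind. The complementary filter builds
# up a heading error while the robot rotates and lets it decay back to
# zero once the robot stops -- the same shape as the drift in the video.

def complementary_fusion(gyro_rates, dt=0.1, alpha=0.98):
    """Return fused-minus-true heading error (radians) at each step."""
    true_heading = 0.0
    fused = 0.0
    errors = []
    for rate in gyro_rates:
        odom_heading = true_heading          # odometry is one update old
        true_heading += rate * dt            # what the robot actually did
        # trust the gyro short-term, the odometry heading long-term
        fused = alpha * (fused + rate * dt) + (1 - alpha) * odom_heading
        errors.append(fused - true_heading)
    return errors

# Rotate at 1 rad/s for 5 s, then sit still for 10 s: the error grows
# during the rotation and decays back toward zero after the stop.
errors = complementary_fusion([1.0] * 50 + [0.0] * 100)
```

The error settles near `-rate * dt` while rotating and shrinks geometrically (by `alpha` each step) once the rotation stops, which is why the scan "snaps back" after the robot halts.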

The fact that it deviates more when rotating could point to bad extrinsic calibration, specifically the translation component. Try adjusting the tf transform you set between the IMU and your base_link, moving it closer and farther away, and see whether that changes the behavior. An incorrect wheel-separation parameter could also skew the rotation estimate.
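As a rough sanity check on why a translation error shows up mainly under rotation, here's a small-angle lever-arm estimate (the 5 cm error and 90-degree turn below are hypothetical numbers, not measurements from this robot):

```python
import math

def lever_arm_shift(translation_error_m, rotation_rad):
    """Apparent displacement of scan points when the robot rotates by
    rotation_rad but the sensor's mounting translation in the TF tree
    is wrong by translation_error_m (small-angle lever-arm estimate:
    arc length = radius error * angle)."""
    return translation_error_m * rotation_rad

# e.g. a 5 cm error in the lidar offset during a 90-degree turn
shift_m = lever_arm_shift(0.05, math.pi / 2)  # ~0.08 m of apparent motion
```

Driving straight doesn't exercise this lever arm at all, which would explain why the drift is nearly invisible in pure translation.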

Whatever you do, getting odometry to be perfect is impossible. When your robot drives over uneven terrain, one wheel hits carpet before the other, or the robot goes out of calibration at runtime, you'll get all sorts of faulty values. That's why complementary sensors like your lidar should compensate for this through AMCL, which has plenty of knobs to tune it to your specific robot. A particle filter will do just fine with the odometry you show in your video. Other packages exist as well if you have stricter requirements.

Another potential source could be publishing lag or even visualization lag. Check the delays and publish rates of each of your sensors with rostopic hz and rostopic delay to make sure your filter receives values frequently enough to produce timely updates.
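A quick back-of-envelope for the timing side (the yaw rate below is a made-up example; the ~100 ms figure is the URG-04LX's nominal 10 Hz scan period): a scan rendered against a transform that is stale by some delay while the robot yaws appears rotated by roughly yaw rate times delay, which matches drift that grows with rotation speed and vanishes at rest.

```python
def stale_tf_offset_deg(yaw_rate_deg_s, delay_s):
    """Apparent rotation of a scan displayed with a transform that is
    delay_s seconds out of date while the robot yaws at yaw_rate_deg_s."""
    return yaw_rate_deg_s * delay_s

# e.g. a ~100 ms scan period at a 60 deg/s turn
offset_deg = stale_tf_offset_deg(60.0, 0.1)  # ~6 degrees of apparent drift
```

A few degrees of offset at modest rotation speeds is very visible in RViz, so even a single scan period of lag can look like the drift in the video.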



Stats

Asked: 2020-01-02 04:44:29 -0500

Seen: 962 times

Last updated: Oct 07 '20