
How to convert laser scans from frame 'laser' to frame 'world' using IMU data (roll, pitch, yaw and linear acceleration)?

asked 2021-01-27 23:03:00 -0500

Mateen

Hi All,

I am doing a ROS-based project with a 2D SICK LiDAR, an IMU (Razor 9DOF), and a GPS. First I want to convert the LiDAR data from the LiDAR frame 'laser' to the IMU frame 'base_imu_link' using a static transformation. I have tried that and managed to get it working. Now I want to convert my laser scan point cloud from frame 'laser' to frame 'world' with a dynamic transformation using the IMU data. The transformation matrix should be composed of a rotation matrix and a translation vector: the rotation matrix can be built from roll, pitch and yaw, and the translation can be found by double-integrating the linear acceleration coming from the IMU.

I have this initial concept in my mind, but since I am a beginner with ROS, I am unable to code it in C++/Python. Kindly help me in this regard!
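For concreteness, the concept described above might look something like the following Python sketch. This is a rough illustration, not working odometry: the topic name 'imu/data' is an assumption, the gravity subtraction assumes a z-up world frame and a raw accelerometer, and naive double integration drifts within seconds.

    #!/usr/bin/env python
    # Rough sketch: orientation from the IMU gives the rotation,
    # double-integrated linear acceleration gives the translation.
    # Topic/frame names are assumptions; raw double integration drifts fast.
    import rospy
    import numpy as np
    import tf2_ros
    import tf.transformations as tft
    from sensor_msgs.msg import Imu
    from geometry_msgs.msg import TransformStamped

    class ImuDeadReckoning(object):
        def __init__(self):
            self.vel = np.zeros(3)          # integrated velocity, world frame
            self.pos = np.zeros(3)          # double-integrated position
            self.last_t = None
            self.br = tf2_ros.TransformBroadcaster()
            rospy.Subscriber('imu/data', Imu, self.imu_cb)

        def imu_cb(self, msg):
            t = msg.header.stamp.to_sec()
            if self.last_t is None:
                self.last_t = t
                return
            dt = t - self.last_t
            self.last_t = t

            # Rotation: the driver fuses roll/pitch/yaw into this quaternion
            q = [msg.orientation.x, msg.orientation.y,
                 msg.orientation.z, msg.orientation.w]
            R = tft.quaternion_matrix(q)[:3, :3]

            # Rotate body-frame acceleration into the world frame and
            # subtract gravity (assumes z-up world, raw accelerometer output)
            a = [msg.linear_acceleration.x,
                 msg.linear_acceleration.y,
                 msg.linear_acceleration.z]
            a_world = R.dot(a) - np.array([0.0, 0.0, 9.81])

            # Translation: double integration, acceleration -> velocity -> position
            self.vel += a_world * dt
            self.pos += self.vel * dt

            # Publish the resulting world -> base_imu_link transform
            tf_msg = TransformStamped()
            tf_msg.header.stamp = msg.header.stamp
            tf_msg.header.frame_id = 'world'
            tf_msg.child_frame_id = 'base_imu_link'
            tf_msg.transform.translation.x = float(self.pos[0])
            tf_msg.transform.translation.y = float(self.pos[1])
            tf_msg.transform.translation.z = float(self.pos[2])
            (tf_msg.transform.rotation.x, tf_msg.transform.rotation.y,
             tf_msg.transform.rotation.z, tf_msg.transform.rotation.w) = q
            self.br.sendTransform(tf_msg)

    if __name__ == '__main__':
        rospy.init_node('imu_dead_reckoning')
        ImuDeadReckoning()
        rospy.spin()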


1 Answer


answered 2021-01-29 08:15:24 -0500

JackB

Before you go any further, review REP-105 so you have a full understanding of the frame structure that mobile robots in ROS use.

It sounds like you understand the static transforms "internal" to the robot, e.g. base_link -> laser. Now you need to get odom -> base_link and then map -> odom.
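For reference, such a static transform is typically published with a one-liner like the following, where the offsets are placeholders you'd replace with your actual mounting geometry:

    rosrun tf2_ros static_transform_publisher 0.1 0.0 0.2 0.0 0.0 0.0 base_link laser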

If I were you I would use wheel odometry (if you have it) and the IMU to get the odom -> base_link tf using the robot_localization package. The given example launch file does almost exactly that.
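A trimmed sketch of such an EKF config, adapted from the package's template; the topic names and which fields get fused are assumptions you would adjust for your robot:

    # ekf.yaml -- trimmed from robot_localization's ekf_template; topic
    # names and fused fields are assumptions to adjust per robot.
    frequency: 30
    two_d_mode: false
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom            # this instance provides odom -> base_link

    odom0: wheel/odometry
    odom0_config: [false, false, false,   # x, y, z
                   false, false, false,   # roll, pitch, yaw
                   true,  true,  false,   # vx, vy, vz
                   false, false, true,    # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az

    imu0: imu/data
    imu0_config: [false, false, false,
                  true,  true,  true,     # fuse orientation
                  false, false, false,
                  true,  true,  true,     # fuse angular velocity
                  true,  true,  false]    # fuse linear acceleration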

Next, to get the map -> odom transform: given that you have a LIDAR, I'd use something like gmapping, or, if you already have a map, amcl.
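For example, a minimal gmapping invocation might look like this (the topic and frame names are assumptions):

    rosrun gmapping slam_gmapping scan:=scan _base_frame:=base_link _odom_frame:=odom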

Getting this whole pipeline up and running can take weeks in my experience, but I wish you luck and happy learning! Glad to answer any questions or comments you have here.

Cheers!


Comments

Thank you for the guidance. But I am working on a USV/boat with a single propeller and a single rudder. I have collected some data in bag files. The 2D SICK LiDAR looks vertically upwards (90 degrees to the boat's direction of motion). The boat is also equipped with an IMU and a GPS; however, due to its uncertainty bubble, the GPS is of no use. I have successfully transformed 'laser' to 'base_link' using a static transform. Now I want to derive the odometry using the IMU's linear acceleration. The rotation matrix from 'base_link' to 'world' can be constructed from roll, pitch and yaw, and the translation can be obtained by double-integrating the linear acceleration we are getting from the IMU. How can I do that? I am confused in this scenario. Kindly help me in this regard.

Mateen (2021-01-29 23:05:52 -0500)
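For the laser part of this, once something (e.g. robot_localization) is publishing the transform up to the world frame, a sketch along these lines could re-frame the scans; the topic names and the 'world' frame are assumptions:

    #!/usr/bin/env python
    # Sketch: project each LaserScan to a PointCloud2, then move it into
    # the 'world' frame via tf2. Topic/frame names are assumptions.
    import rospy
    import tf2_ros
    from tf2_sensor_msgs.tf2_sensor_msgs import do_transform_cloud
    from laser_geometry import LaserProjection
    from sensor_msgs.msg import LaserScan, PointCloud2

    class ScanToWorld(object):
        def __init__(self):
            self.projector = LaserProjection()
            self.tf_buffer = tf2_ros.Buffer()
            self.tf_listener = tf2_ros.TransformListener(self.tf_buffer)
            self.pub = rospy.Publisher('cloud_world', PointCloud2, queue_size=10)
            rospy.Subscriber('scan', LaserScan, self.scan_cb)

        def scan_cb(self, scan):
            cloud = self.projector.projectLaser(scan)   # still in the laser frame
            try:
                tf = self.tf_buffer.lookup_transform(
                    'world', scan.header.frame_id,
                    scan.header.stamp, rospy.Duration(0.1))
            except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                    tf2_ros.ExtrapolationException):
                return                                  # transform not available yet
            self.pub.publish(do_transform_cloud(cloud, tf))

    if __name__ == '__main__':
        rospy.init_node('scan_to_world')
        ScanToWorld()
        rospy.spin()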

Use the robot_localization package I mentioned; it will allow you to integrate the IMU. IMUs are so noisy, though, that I doubt this will work, but I am eager to hear if it does.

JackB (2021-01-30 10:11:43 -0500)

Hey JackB, I have tried to use the robot_localization package, but I am unable to understand how I can get a pose from the IMU's linear acceleration. How can I do that? Please guide me more specifically.

Mateen (2021-02-01 06:36:33 -0500)

You can use the IMU like any other sensor. This question is not the place for a robot_localization tutorial; do you have a more specific comment relevant to the question?

JackB (2021-02-01 08:31:44 -0500)

I mean, exactly where do I have to make changes to get the pose of the boat from the IMU's linear acceleration using robot_localization?

Mateen (2021-02-01 08:54:13 -0500)
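In the EKF config sketched earlier, the relevant place would be the imu0_config vector: its last three entries correspond to ax, ay, az, and setting them to true tells the filter to fuse (i.e. integrate) the linear acceleration. A sketch of that change, with the topic name again an assumption:

    imu0: imu/data
    imu0_config: [false, false, false,   # x, y, z
                  true,  true,  true,    # roll, pitch, yaw
                  false, false, false,   # vx, vy, vz
                  true,  true,  true,    # vroll, vpitch, vyaw
                  true,  true,  true]    # ax, ay, az <- fuse linear acceleration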
