How to convert laser scans from frame 'laser' to frame 'world' using IMU data (roll, pitch, yaw, and linear acceleration)?
Hi All,
I am working on a ROS-based project with a 2D SICK LiDAR, an IMU (Razor 9DOF), and a GPS. First I want to convert the LiDAR data from the LiDAR frame 'laser' to the IMU frame 'base_imu_link' using a static transform. I have tried that and managed to get it working. Now I want to convert my laser scan point cloud from frame 'laser' to frame 'world' with a dynamic transform using the IMU data. The transformation matrix should consist of a rotation matrix and a translation vector: the rotation matrix can be built from roll, pitch, and yaw, and the translation can be found by double-integrating the linear acceleration coming from the IMU.
I have this initial concept in mind, but since I am a beginner in ROS, I am unable to code it in C++/Python. Kindly help me in this regard!
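For what it's worth, the rotation half of that idea can be sketched in plain numpy (the function names here are mine, not from any ROS package, and in a real node you would normally let tf2 do this rather than rolling your own):

```python
import numpy as np

def rotation_from_rpy(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw using the ZYX (yaw-pitch-roll)
    convention that matches ROS's x-forward, y-left, z-up frames."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0],
                   [0, cr, -sr],
                   [0, sr, cr]])
    Ry = np.array([[cp, 0, sp],
                   [0, 1, 0],
                   [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0],
                   [sy, cy, 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def laser_point_to_world(p_laser, roll, pitch, yaw, t_world):
    """Apply the rigid transform: p_world = R * p_laser + t."""
    R = rotation_from_rpy(roll, pitch, yaw)
    return R @ np.asarray(p_laser, dtype=float) + np.asarray(t_world, dtype=float)
```

For example, with yaw = 90 degrees and zero translation, the point (1, 0, 0) in the sensor frame maps to (0, 1, 0) in the world frame. The translation vector is the hard part, as the answers below discuss.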
Asked by Mateen on 2021-01-28 00:03:00 UTC
Answers
Before you go any further review REP-105 so you have a full understanding of the frame structure that mobile robots in ROS use.
It sounds like you understand the static transforms "internal" to the robot, e.g. base_link -> laser. Now you need to get odom -> base_link and then map -> odom.
If I were you I would use wheel odometry (if you have it) and the IMU to get the odom -> base_link tf using the robot_localization package. The given example launch file does almost exactly that.
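For reference, a minimal ekf.yaml sketch for robot_localization fusing an IMU into the odom -> base_link estimate might look like this (the topic name /imu/data and the particular true/false choices are assumptions for illustration; the parameter names come from the package documentation, which you should check for your setup):

```yaml
frequency: 30
two_d_mode: false

odom_frame: odom
base_link_frame: base_link
world_frame: odom          # EKF publishes the odom -> base_link transform

imu0: /imu/data
imu0_config: [false, false, false,   # x, y, z position
              true,  true,  true,    # roll, pitch, yaw
              false, false, false,   # x, y, z velocity
              true,  true,  true,    # roll, pitch, yaw velocity
              true,  true,  true]    # x, y, z linear acceleration
imu0_remove_gravitational_acceleration: true
```

If you do have wheel odometry, you would add it as an odom0 input alongside imu0.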
Next, to get the map -> odom transform: given that you have a LIDAR, I'd use something like gmapping, or if you already have a map, use amcl.
Getting this whole pipeline up and running can take weeks in my experience, but I wish you luck and happy learning! Glad to answer any questions or comments you have here.
Cheers!
Asked by JackB on 2021-01-29 09:15:24 UTC
Comments
Thank you for your guidance. But I am working on a USV/boat with a single propeller and a single rudder. I have collected some data in bag files. The 2D SICK LiDAR points vertically upwards (90 degrees to the motion of the boat). The boat is also equipped with an IMU and a GPS. However, due to its uncertainty bubble, the GPS is of no use. I have successfully transformed 'laser' to 'base_link' using a static transform. Now I want to derive the odometry from the IMU's linear acceleration. The rotation matrix from 'base_link' to 'world' can be constructed from roll, pitch, and yaw, and the translation can be obtained by double-integrating the linear acceleration we get from the IMU. How can I do that? I am confused about this scenario. Kindly help me in this regard.
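The double integration you describe can be sketched as below (a minimal dead-reckoning sketch with a function name of my own; it assumes the accelerations have already been rotated into the world frame and had gravity removed, which an EKF like robot_localization handles for you):

```python
import numpy as np

def integrate_acceleration(accels, dt, v0=None, p0=None):
    """Dead-reckon position by double-integrating world-frame,
    gravity-compensated linear acceleration.

    accels: (N, 3) array of accelerations in m/s^2, sampled every dt seconds.
    Returns an (N, 3) array of positions.
    """
    v = np.zeros(3) if v0 is None else np.asarray(v0, dtype=float)
    p = np.zeros(3) if p0 is None else np.asarray(p0, dtype=float)
    positions = []
    for a in accels:
        v = v + np.asarray(a, dtype=float) * dt   # first integration: velocity
        p = p + v * dt                            # second integration: position
        positions.append(p.copy())
    return np.array(positions)
```

Be warned that this drifts badly: any constant bias b in the accelerometer grows into a position error of roughly 0.5 * b * t^2, so even a 0.01 m/s^2 bias becomes about 18 m of error after one minute. That is why JackB's answer recommends fusing the IMU with other sources rather than integrating it alone.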
Asked by Mateen on 2021-01-30 00:05:52 UTC
Use the robot_localization package I mentioned; it will allow you to integrate the IMU. IMUs are so noisy, though, that I doubt this will work, but I am eager to hear if it does.
Asked by JackB on 2021-01-30 11:11:43 UTC
Hey JackB, I have tried to use the robot_localization package, but I am unable to understand how I can get a pose from the IMU's linear acceleration. How can I do that? Please guide me more specifically.
Asked by Mateen on 2021-02-01 07:36:33 UTC
You can use the IMU like any other sensor. This question is not the place for a robot_localization tutorial; do you have a more specific comment relevant to the question?
Asked by JackB on 2021-02-01 09:31:44 UTC
I mean, exactly where do I have to make changes to get the pose of the boat from the IMU's linear acceleration using robot_localization?
Asked by Mateen on 2021-02-01 09:54:13 UTC
Comments