
How to fuse IMU + odometry + laser scan using the robot_localization package

asked 2020-06-30 10:49:57 -0500

raphael_khang

Hello, I am working on a real robot (a Robotnik RB1). I am trying to use robot_localization to fuse laser + odometry + IMU, but I am having difficulty adding the laser information, because the laser gives me a position (x, y, z) and an orientation. Also, the position I get from odometry is x, y, z, but normally, to get a good EKF, I should fuse the wheel encoders, and I don't know where to get them or how to use them with robot_localization. (I know my question is a bit complex and hard to understand.) Please help me.

kinetic, robot_localization, rb1, fuse_sensor_data, odometry, laserscan, imu, ekf


Comments

To keep it simple, could you state your question explicitly? "How to" questions are usually very broad. Could you be specific? What exactly do you need?

Teo Cardoso  ( 2020-06-30 11:27:50 -0500 )

Thank you for the attention you gave to my question. Indeed, I have to use the robot_localization package to localize the RB1 robot using different sensors (IMU, laser scan, odometry).

My questions are:

  1. How can I configure the robot_localization package? (I tried to do this by taking just the IMU and the odometry position, without taking into account the orientation, which is a quaternion.)

  2. How do I transform the quaternion orientation into RPY?

  3. Which frames should I use in the robot_localization configuration? In my case, the fixed frame is RB1_base_odom, and the frame that moves with the robot is RB1_base_footprint or RB1_base_link.

  4. How can I verify that the robot_localization package is working well and that the returned pose is good?
raphael_khang  ( 2020-07-01 00:33:24 -0500 )

There is a quaternion-to-RPY ROS node: link text. However, this might be outdated; I seem to remember that a long time ago I came across a quaternion-to-yaw function already implemented in some standard ROS package.
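In ROS 1 that standard function is `tf.transformations.euler_from_quaternion`, but the conversion is short enough to write out by hand. A minimal sketch in plain Python (no ROS dependency, assuming the usual x, y, z, w quaternion ordering used by geometry_msgs):

```python
import math

def quaternion_to_rpy(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w) to (roll, pitch, yaw) in radians."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp to [-1, 1] to avoid math domain errors from numerical noise near +/-90 deg pitch.
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

# A 90-degree rotation about z: yaw should come out as pi/2, roll and pitch as 0.
roll, pitch, yaw = quaternion_to_rpy(0.0, 0.0, math.sin(math.pi / 4), math.cos(math.pi / 4))
```

In a ROS node you would call `tf.transformations.euler_from_quaternion([x, y, z, w])` instead, which returns the same (roll, pitch, yaw) tuple. Note that robot_localization consumes quaternions directly, so this conversion is mainly useful for debugging and plotting.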

Dragonslayer  ( 2020-07-03 09:56:36 -0500 )

1 Answer


answered 2020-06-30 12:07:54 -0500

Dragonslayer

There are ICP packages out there to get odometry from a laser scan (laser_scan_matcher from the scan_tools stack is one example), and then in the EKF sensor-fusion node of robot_localization you can fuse them all. Also, most SLAM and localization packages take in a laser scan directly; I think rtabmap and cartographer do for their internal odometry, and AMCL takes the laser scan for localization as well. The usual approach is to fuse odometry and IMU in ekf_localization and give the laser scan to SLAM, or to keep mapping and localization separate (gmapping and amcl together, as in most basic tutorials).
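As an illustration of that usual approach (not the RB1's actual configuration; the topic names and covariance choices here are assumptions), a minimal ekf_localization_node parameter file fusing wheel odometry and an IMU might look like:

```yaml
# Hypothetical robot_localization EKF config: fuse wheel odometry + IMU.
frequency: 30
two_d_mode: true              # planar robot: ignore z, roll, pitch

odom_frame: RB1_base_odom
base_link_frame: RB1_base_footprint
world_frame: RB1_base_odom    # publishes the odom -> base_footprint transform

odom0: /RB1/odom              # assumed wheel-odometry topic name
odom0_config: [false, false, false,    # x, y, z position
               false, false, false,    # roll, pitch, yaw
               true,  true,  false,    # x, y, z velocity
               false, false, true,     # roll, pitch, yaw rates
               false, false, false]    # linear accelerations
odom0_differential: false

imu0: /RB1/imu/data           # assumed IMU topic name
imu0_config: [false, false, false,
              false, false, true,      # fuse yaw orientation
              false, false, false,
              false, false, true,      # fuse yaw rate
              false, false, false]
imu0_differential: false
```

Each `_config` vector follows robot_localization's fixed ordering (x, y, z, roll, pitch, yaw, then their velocities, then linear accelerations); fusing velocities from the wheel odometry rather than its absolute position is a common starting point.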


Comments

Thank you so much for the clarification. 1. Is there a way to instead retrieve the left and right wheel speeds (the angular velocities of the wheels) and estimate the pose (x, y, theta) myself, in order to feed it into the robot_localization package?

  2. Could you also help me send positions to my robot (for example, moving through 4 points to make a square [0,0 1,0 1,1 0,1] by sending positions and orientations rather than wheel speeds)?
raphael_khang  ( 2020-07-01 09:22:16 -0500 )

I'm not sure I understand the question, but wheel speeds to pose is exactly what a classic odometry node does: it translates encoder readings into pose (and velocities, etc.). The TurtleBot uses diff drive, for example; the code is on GitHub. And there are lots of others: the RP2 uses four-wheel steering, also on GitHub, and so on.
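That translation from wheel speeds to pose is plain dead reckoning. A minimal sketch of differential-drive odometry in pure Python (the wheel radius and separation below are made-up values; a real node would take them from the robot description):

```python
import math

class DiffDriveOdometry:
    """Integrate left/right wheel angular speeds into a planar pose (x, y, theta)."""

    def __init__(self, wheel_radius=0.05, wheel_separation=0.40):
        self.r = wheel_radius        # metres (assumed value)
        self.L = wheel_separation    # metres (assumed value)
        self.x = self.y = self.theta = 0.0

    def update(self, w_left, w_right, dt):
        """w_left / w_right: wheel angular speeds in rad/s; dt: timestep in seconds."""
        v = self.r * (w_left + w_right) / 2.0      # forward velocity of the base
        w = self.r * (w_right - w_left) / self.L   # yaw rate of the base
        self.x += v * math.cos(self.theta) * dt
        self.y += v * math.sin(self.theta) * dt
        self.theta += w * dt
        return self.x, self.y, self.theta

# Both wheels at 10 rad/s for 1 s: straight-line motion of 0.05 * 10 * 1 = 0.5 m.
odom = DiffDriveOdometry()
for _ in range(100):
    odom.update(10.0, 10.0, 0.01)
```

A real ROS node would publish this pose (with theta converted back to a quaternion) as nav_msgs/Odometry on an odom topic, which is exactly what robot_localization consumes as odom0.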

Your next question seems to be about the opposite case: generating velocity commands for the wheel controllers. This is done by a base_controller node (most of the time the base controller produces the odometry data as well); these are also in the packages on GitHub, for example. For diff drive I think there is even a ros_control controller that works straight out of the box, odometry included. A base_controller takes abstract commands (x and y linear, angular z for a 2D robot) and interprets them according to the mechanical realities of the robot base, generating motor commands for the wheels ...(more)
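For the square trajectory itself, the usual pattern is to send geometry_msgs/PoseStamped goals to move_base one corner at a time. The pure geometry part, building the four waypoints with each yaw expressed as a quaternion, can be sketched like this (the frame name is an assumption):

```python
import math

def yaw_to_quaternion(yaw):
    """Planar yaw (radians) -> (x, y, z, w) quaternion for a rotation about z."""
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# Corners of a 1 m square; at each corner, face toward the next corner.
corners = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]

waypoints = []
for i, (x, y) in enumerate(corners):
    nx, ny = corners[(i + 1) % len(corners)]
    yaw = math.atan2(ny - y, nx - x)
    waypoints.append({"frame": "RB1_base_odom",   # assumed fixed frame name
                      "x": x, "y": y,
                      "orientation": yaw_to_quaternion(yaw)})
```

Each entry maps directly onto a MoveBaseGoal: x and y go into `goal.target_pose.pose.position`, the quaternion into `goal.target_pose.pose.orientation`, sent with an actionlib SimpleActionClient and waiting for each result before sending the next goal.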

Dragonslayer  ( 2020-07-03 09:46:18 -0500 )
