How to fuse IMU + odometry + laser scan using the robot_localization package
Hello, I am working on a real robot (an RB1 from Robotnik). I am trying to use robot_localization to fuse laser, odometry, and IMU data, but I am having difficulty adding the laser information, because it gives me a position (x, y, z) and an orientation. Also, the position I get from odometry is (x, y, z), but I understand that for a good EKF I should use the wheel encoders; I don't know where to get that data or how to use it with robot_localization. (I know my question is a bit complex and hard to understand.) Please help me.
kinetic, robot_localization, rb1, fuse_sensor_data, odometry, laserscan, imu, ekf
To keep it simple, could you state your question explicitly? "How to" questions are usually very broad. Could you be specific? What exactly do you need?
Thank you for the attention you gave my question. Indeed, I have to use the robot_localization package to localize the RB1 robot using different sensors (IMU, laser scan, odometry).
My question is: 1. How can I configure the robot_localization package? (I tried to do this using just the IMU and the odometry position, without taking the orientation into account, since it is a quaternion.)
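On the configuration question, a minimal parameter sketch for `ekf_localization_node` is below. The topic names (`/RB1_base/odom`, `/imu/data`) and the choice of fused fields are assumptions; adjust them to the topics your robot actually publishes (check with `rostopic list`). Note that a raw `sensor_msgs/LaserScan` cannot be fed to robot_localization directly: you first need a node such as a scan matcher or AMCL to turn the scans into a pose estimate, which can then be fused as an additional pose input.

```yaml
# Sketch of an ekf_localization_node parameter file -- topic names are assumptions.
frequency: 30
two_d_mode: true          # planar robot: ignore z, roll, pitch

odom0: /RB1_base/odom
# Each _config matrix selects which fields to fuse, in this order:
#   x, y, z, roll, pitch, yaw,
#   vx, vy, vz, vroll, vpitch, vyaw,
#   ax, ay, az
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  false,
               false, false, true,
               false, false, false]
odom0_differential: false

imu0: /imu/data
imu0_config: [false, false, false,
              false, false, true,
              false, false, false,
              false, false, true,
              false, false, false]
imu0_remove_gravitational_acceleration: true
```

A commonly recommended setup is sketched here: fuse the velocities (vx, vy, vyaw) from wheel odometry rather than its absolute pose, and the yaw plus yaw velocity from the IMU.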
How do I transform the quaternion orientation into RPY (roll, pitch, yaw)?
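For the quaternion question: in a ROS environment, `tf.transformations.euler_from_quaternion([x, y, z, w])` returns `(roll, pitch, yaw)` directly. If you prefer not to depend on tf, the conversion is short enough to write by hand; a minimal, dependency-free sketch:

```python
import math

def quaternion_to_rpy(x, y, z, w):
    """Convert a quaternion (x, y, z, w) to roll, pitch, yaw in radians."""
    # roll: rotation about the X axis
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # pitch: rotation about the Y axis, clamped to avoid domain errors near the poles
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    # yaw: rotation about the Z axis
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```

For a planar robot you usually only need the yaw component, e.g. from the `orientation` field of a `nav_msgs/Odometry` or `sensor_msgs/Imu` message.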
How do I choose the right frames for the robot_localization configuration? In my case, the fixed frame is RB1_base_odom, and the frame that moves with the robot is RB1_base_base_footprint or RB1_base_link.
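On the frame question: robot_localization takes the frame names as parameters. A sketch using the frame names from the question (verify yours with `rosrun tf view_frames`):

```yaml
map_frame: map                            # only needed when fusing a global pose source
odom_frame: RB1_base_odom                 # the fixed frame
base_link_frame: RB1_base_base_footprint  # or RB1_base_link -- whichever your TF tree uses
world_frame: RB1_base_odom                # publish the odom -> base_link transform
```

Setting `world_frame` to the odometry frame makes the node publish a continuous local estimate, which is the usual choice when no global (map-referenced) sensor is being fused.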
There is a quaternion-to-RPY ROS node (link). However, it might be outdated; I seem to remember that a long time ago I came across a quaternion-to-yaw function already implemented in some standard ROS package.