
# Convert IMU to odom

I have implemented my odom node like this:

#!/usr/bin/env python
import rospy
from nav_msgs.msg import Odometry
import smbus
import math
from math import sin, cos, pi
import tf
from geometry_msgs.msg import Point, Pose, Quaternion, Twist, Vector3

# MPU6050 register addresses
power_mgmt_1 = 0x6b
power_mgmt_2 = 0x6c

def read_word(reg):
    h = bus.read_byte_data(address, reg)
    l = bus.read_byte_data(address, reg + 1)
    value = (h << 8) + l
    return value

def read_word_2c(reg):
    # convert the 16-bit register value from two's complement
    val = read_word(reg)
    if val >= 0x8000:
        return -((65535 - val) + 1)
    else:
        return val

def dist(a, b):
    return math.sqrt((a * a) + (b * b))

def get_y_rotation(x, y, z):
    radians = math.atan2(x, dist(y, z))
    return -math.degrees(radians)

def get_x_rotation(x, y, z):
    radians = math.atan2(y, dist(x, z))
    return math.degrees(radians)
bus = smbus.SMBus(1)  # bus = smbus.SMBus(0) for revision 1 boards
address = 0x68        # found via i2cdetect

# Wake the sensor up so it can be addressed
bus.write_byte_data(address, power_mgmt_1, 0)

def talker():
    pub = rospy.Publisher('odom', Odometry, queue_size=10)
    odom_broadcaster = tf.TransformBroadcaster()
    rospy.init_node('odom_node', anonymous=True)
    current_time = rospy.Time.now()
    last_time = rospy.Time.now()
    x = 0.0
    y = 0.0
    th = 0.0
    rate = rospy.Rate(10)  # 10 Hz
    while not rospy.is_shutdown():
        # read raw accelerometer and gyro values from the MPU6050
        accleration_xout = read_word_2c(0x3b)
        accleration_yout = read_word_2c(0x3d)
        accleration_zout = read_word_2c(0x3f)
        gyroskop_zout = read_word_2c(0x47) / 131.0

        accleration_xout_skaliert = accleration_xout / 16384.0
        accleration_yout_skaliert = accleration_yout / 16384.0
        accleration_zout_skaliert = accleration_zout / 16384.0
        current_time = rospy.Time.now()

        # NOTE: these are accelerations, not velocities (see the answers below)
        vx = accleration_xout_skaliert
        vy = accleration_yout_skaliert
        vth = gyroskop_zout

        # compute odometry in a typical way given the velocities of the robot
        dt = (current_time - last_time).to_sec()
        delta_x = (vx * cos(th) - vy * sin(th)) * dt
        delta_y = (vx * sin(th) + vy * cos(th)) * dt
        delta_th = vth * dt

        x += delta_x
        y += delta_y
        th += delta_th

        # since all odometry is 6DOF we'll need a quaternion created from yaw
        odom_quat = tf.transformations.quaternion_from_euler(0, 0, th)

        # first, we'll publish the transform over tf
        odom_broadcaster.sendTransform(
            (x, y, 0.),
            odom_quat,
            current_time,
            "base_link",
            "odom"
        )

        # next, we'll publish the odometry message over ROS
        odom = Odometry()
        odom.header.stamp = current_time
        odom.header.frame_id = "odom"
        odom.child_frame_id = "base_link"

        # set the position
        odom.pose.pose = Pose(Point(x, y, 0.), Quaternion(*odom_quat))

        # set the velocity
        odom.twist.twist = Twist(Vector3(vx, vy, 0), Vector3(0, 0, vth))
        pub.publish(odom)

        last_time = current_time
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass


Here I read the linear acceleration and angular velocity from an MPU6050 IMU sensor. This is the data the node publishes:

header:
  seq: 1008
  stamp:
    secs: 0
    nsecs: 0
  frame_id: "odom"
pose:
  pose:
    position:
      x: 238.075342952
      y: 510.232761159
      z: 0.0
    orientation:
      x: 0.0
      y: 0.0
      z: 0.528979737974
      w: 0.848634454175
  covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0 ...


Is your question about converting IMU readings to estimate the orientation & position or is it about sending the IMU data using ROS?

For the latter, you can use the ROS message type sensor_msgs/Imu.

As for the former, it is a state estimation problem. You can use standard AHRS algorithms such as the Mahony or Madgwick filters (link) to convert the sensor readings into an orientation estimate (quaternion). Note that IMU (accelerometer) readings alone will not suffice to estimate the position accurately; you would need some kind of external feedback such as encoders or images (for VIO). Naively, you can integrate the accelerometer readings minus their bias to compute velocity and position, but these values will drift very quickly no matter what.
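To make the drift problem concrete, here is a minimal sketch of that naive double integration in plain Python. The function name, the bias value, and the sample data are all illustrative, not from the question's code:

```python
# Naive dead-reckoning sketch: integrate bias-corrected 1-D acceleration
# twice to obtain velocity and position. Any constant bias error grows
# linearly in velocity and quadratically in position, which is why this
# drifts so quickly in practice.
def integrate_accel(accels, dt, bias=0.0):
    """Integrate a sequence of accelerations into velocity and position."""
    v, p = 0.0, 0.0
    velocities, positions = [], []
    for a in accels:
        v += (a - bias) * dt   # acceleration -> velocity
        p += v * dt            # velocity -> position
        velocities.append(v)
        positions.append(p)
    return velocities, positions

# constant 1 m/s^2 for 1 second at 10 Hz
vels, pos = integrate_accel([1.0] * 10, dt=0.1)
print(vels[-1], pos[-1])
```

Even a tiny uncorrected bias fed into this loop produces an unbounded position error, which is the drift described above.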


Isn't the immediate problem in the code the OP shows that they appear to be using acceleration where it should be velocity?

( 2020-08-17 02:50:44 -0500 )

OK, it looks like there is a better option than implementing all of this from scratch. I found that I can simply use the robot_localization package and publish sensor_msgs/Imu data from one IMU node that reads the data from the IMU sensor.

( 2020-08-17 04:16:32 -0500 )
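For reference, a robot_localization setup along these lines might look like the sketch below, assuming an `ekf_localization_node` subscribed to a hypothetical IMU topic `/imu/data`; the parameter values are assumptions to adapt, not a tested configuration:

```
# Illustrative robot_localization EKF parameters (values are assumptions)
frequency: 30
two_d_mode: true            # planar robot: fuse only x, y, yaw
imu0: /imu/data             # hypothetical topic published by the IMU node
imu0_config: [false, false, false,    # x, y, z position
              false, false, true,     # roll, pitch, yaw
              false, false, false,    # x, y, z velocity
              false, false, true,     # roll, pitch, yaw velocity
              true,  true,  false]    # x, y, z acceleration
imu0_differential: false
imu0_remove_gravitational_acceleration: true
```

The EKF then handles the frame transforms, bias handling, and odom publishing that the hand-rolled node above attempts manually.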

I guess it's not required any more, but yes: in addition to using acceleration instead of velocity being a problem, the acceleration is in the body frame and has to be converted to the world frame before integrating it to obtain velocity and subsequently position.

( 2020-08-17 21:49:43 -0500 )
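For the planar case discussed in this thread, that body-to-world conversion is just a rotation by the current yaw. A minimal sketch (the function name is illustrative, and only the 2-D case is handled):

```python
import math

def body_to_world(ax_body, ay_body, yaw):
    """Rotate planar body-frame acceleration into the world frame by yaw (radians)."""
    ax_world = ax_body * math.cos(yaw) - ay_body * math.sin(yaw)
    ay_world = ax_body * math.sin(yaw) + ay_body * math.cos(yaw)
    return ax_world, ay_world

# forward (body-x) acceleration with the robot facing 90 degrees
# ends up along the world y axis
print(body_to_world(1.0, 0.0, math.pi / 2))
```

Only the world-frame components should be integrated; note this is the same rotation the question's code already applies to `vx`/`vy`, so the remaining bug there is treating acceleration as velocity.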