convert imu to odom

asked 2020-08-16 04:15:23 -0500 by dinesh, updated 2020-08-16 04:37:23 -0500

I have implemented my odom node like this:

#!/usr/bin/env python
# license removed for brevity
import rospy
from nav_msgs.msg import Odometry
import smbus
import math
from math import sin, cos, pi
import tf 
from geometry_msgs.msg import Point, Pose, Quaternion, Twist, Vector3

# Register
power_mgmt_1 = 0x6b
power_mgmt_2 = 0x6c

def read_byte(reg):
    return bus.read_byte_data(address, reg)

def read_word(reg):
    h = bus.read_byte_data(address, reg)
    l = bus.read_byte_data(address, reg+1)
    value = (h << 8) + l
    return value

def read_word_2c(reg):
    val = read_word(reg)
    if (val >= 0x8000):
        return -((65535 - val) + 1)
    else:
        return val

def dist(a,b):
    return math.sqrt((a*a)+(b*b))

def get_y_rotation(x,y,z):
    radians = math.atan2(x, dist(y,z))
    return -math.degrees(radians)

def get_x_rotation(x,y,z):
    radians = math.atan2(y, dist(x,z))
    return math.degrees(radians)

bus = smbus.SMBus(1) # bus = smbus.SMBus(0) for board revision 1
address = 0x68       # via i2cdetect

# Wake the module up so it can be addressed
bus.write_byte_data(address, power_mgmt_1, 0)

def talker():
    pub = rospy.Publisher('odom', Odometry, queue_size=10)
    odom_broadcaster = tf.TransformBroadcaster()
    rospy.init_node('odom_node', anonymous=True)
    start_time = rospy.Time.now()
    current_time = rospy.Time.now()
    last_time = rospy.Time.now()
    x = 0.0
    y = 0.0
    th = 0.0
    rate = rospy.Rate(10)  # 10 Hz
    while not rospy.is_shutdown():
        gyroskop_xout = read_word_2c(0x43)
        gyroskop_yout = read_word_2c(0x45)
        gyroskop_zout = read_word_2c(0x47)

        accleration_xout = read_word_2c(0x3b)
        accleration_yout = read_word_2c(0x3d)
        accleration_zout = read_word_2c(0x3f)

        accleration_xout_skaliert = accleration_xout / 16384.0
        accleration_yout_skaliert = accleration_yout / 16384.0
        accleration_zout_skaliert = accleration_zout / 16384.0
        current_time = rospy.Time.now()

        vx = accleration_xout_skaliert 
        vy = accleration_yout_skaliert
        vth = gyroskop_zout

        # compute odometry in a typical way given the velocities of the robot
        dt = (current_time - last_time).to_sec()
        delta_x = (vx * cos(th) - vy * sin(th)) * dt
        delta_y = (vx * sin(th) + vy * cos(th)) * dt
        delta_th = vth * dt

        x += delta_x
        y += delta_y
        th += delta_th

        # since all odometry is 6DOF we'll need a quaternion created from yaw
        odom_quat = tf.transformations.quaternion_from_euler(0, 0, th)
        # first, we'll publish the transform over tf
        odom_broadcaster.sendTransform(
            (x, y, 0.),
            odom_quat,
            current_time,
            "base_link",
            "odom"
        )
        # next, we'll publish the odometry message over ROS
        odom = Odometry()
        odom.header.stamp = current_time  # without this, the stamp stays at 0 (as in the output below)
        odom.header.frame_id = "odom"

        # set the position
        odom.pose.pose = Pose(Point(x, y, 0.), Quaternion(*odom_quat))

        # set the velocity
        odom.child_frame_id = "base_link"
        odom.twist.twist = Twist(Vector3(vx, vy, 0), Vector3(0, 0, vth))
        pub.publish(odom)

        last_time = current_time  # without this, dt keeps growing every iteration
        rate.sleep()

if __name__ == '__main__':
    try:
        talker()
    except rospy.ROSInterruptException:
        pass

Here I read the linear acceleration and angular velocity from an MPU6050 IMU sensor. This is the data the node publishes:

header: 
  seq: 1008
  stamp: 
    secs: 0
    nsecs:         0
  frame_id: "odom"
child_frame_id: "base_link"
pose: 
  pose: 
    position: 
      x: 238.075342952
      y: 510.232761159
      z: 0.0
    orientation: 
      x: 0.0
      y: 0.0
      z: 0.528979737974
      w: 0.848634454175
  covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0 ...

1 Answer


answered 2020-08-16 16:34:30 -0500 by praskot, updated 2020-08-16 17:22:40 -0500

Is your question about converting IMU readings to estimate orientation and position, or is it about publishing the IMU data using ROS?

For the latter, you can use the standard sensor_msgs/Imu message.
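For example, here is a minimal sketch of such a publisher for the MPU6050 in the question (the topic name imu/data_raw and frame name imu_link are assumptions; the scale factors assume the sensor's default ±2 g and ±250 deg/s ranges):

#!/usr/bin/env python
import math
import rospy
import smbus
from sensor_msgs.msg import Imu

bus = smbus.SMBus(1)
address = 0x68
bus.write_byte_data(address, 0x6b, 0)  # wake the MPU6050 (power_mgmt_1)

def read_word_2c(reg):
    # same two's-complement register read as in the question
    h = bus.read_byte_data(address, reg)
    l = bus.read_byte_data(address, reg + 1)
    val = (h << 8) + l
    return val - 65536 if val >= 0x8000 else val

def imu_node():
    pub = rospy.Publisher('imu/data_raw', Imu, queue_size=10)
    rospy.init_node('imu_node')
    rate = rospy.Rate(50)
    while not rospy.is_shutdown():
        msg = Imu()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'imu_link'
        # raw counts -> SI units (+/-2 g: 16384 LSB/g, +/-250 deg/s: 131 LSB/(deg/s))
        msg.linear_acceleration.x = read_word_2c(0x3b) / 16384.0 * 9.80665
        msg.linear_acceleration.y = read_word_2c(0x3d) / 16384.0 * 9.80665
        msg.linear_acceleration.z = read_word_2c(0x3f) / 16384.0 * 9.80665
        msg.angular_velocity.x = math.radians(read_word_2c(0x43) / 131.0)
        msg.angular_velocity.y = math.radians(read_word_2c(0x45) / 131.0)
        msg.angular_velocity.z = math.radians(read_word_2c(0x47) / 131.0)
        # no orientation estimate yet: flag it as unavailable
        msg.orientation_covariance = [-1.0, 0, 0, 0, 0, 0, 0, 0, 0]
        pub.publish(msg)
        rate.sleep()

if __name__ == '__main__':
    try:
        imu_node()
    except rospy.ROSInterruptException:
        pass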

As for the former, it is a state estimation problem. You can use standard AHRS algorithms such as the Mahony or Madgwick filters (link) to convert the sensor readings into an orientation estimate (a quaternion). Note that IMU (accelerometer) readings alone will not suffice to estimate position accurately; you need some kind of external feedback such as wheel encoders or camera images (for VIO). Naively, you can integrate (accelerometer readings - bias) to obtain velocity and then position, but those estimates will drift very quickly no matter what.
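To make the drift concrete, here is a minimal 1-D sketch (hypothetical, assuming a fixed sample period and a bias estimated while the robot is stationary). Any residual bias b grows into a position error of roughly 0.5*b*t^2, so even 0.05 m/s^2 of residual bias produces about 2.5 m of phantom travel in 10 s:

def integrate_1d(accels, dt, bias):
    """Naively double-integrate accelerometer samples (m/s^2) into position."""
    v = p = 0.0
    for a in accels:
        v += (a - bias) * dt  # first integration: velocity
        p += v * dt           # second integration: position
    return p

dt = 0.01
samples = [0.0] * 1000                        # robot actually stationary for 10 s
print(integrate_1d(samples, dt, bias=-0.05))  # ~2.5 m of drift from bias error alone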


Comments

Isn't the immediate problem in the code the OP shows the fact that he appears to be using acceleration where it should be velocity?

gvdhoorn (2020-08-17 02:50:44 -0500)

OK, it looks like there is a better option than implementing all of this from scratch. I found that I can simply use the robot_localization package and publish sensor_msgs/Imu data from an IMU node that reads from the IMU sensor.

dinesh (2020-08-17 04:16:32 -0500)
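For reference, a minimal robot_localization EKF parameter sketch for fusing a single IMU (the parameter names come from the robot_localization documentation; the /imu/data topic and the choice to fuse only yaw, yaw rate, and x acceleration are assumptions for a planar robot):

frequency: 30
two_d_mode: true
odom_frame: odom
base_link_frame: base_link
world_frame: odom

imu0: /imu/data
# fuse: [x, y, z, roll, pitch, yaw,
#        vx, vy, vz, vroll, vpitch, vyaw,
#        ax, ay, az]
imu0_config: [false, false, false,
              false, false, true,
              false, false, false,
              false, false, true,
              true,  false, false]
imu0_remove_gravitational_acceleration: true

Note that, as the answer above says, an IMU alone still cannot anchor absolute x/y position; adding wheel odometry as an odom0 input is what keeps the estimate bounded.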

I guess it's not required any more, but yes: in addition to using acceleration instead of velocity being a problem, the acceleration is measured in the body frame and has to be converted to the world frame before integrating it to obtain velocity and, subsequently, position.

praskot (2020-08-17 21:49:43 -0500)
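To illustrate that frame conversion in the planar case, a small sketch (assuming the yaw angle th has already been estimated, e.g. by integrating the gyro): this is the same rotation the question's code applies, but it must act on accelerations and be followed by two integrations:

from math import sin, cos

def body_accel_to_world(ax_b, ay_b, th):
    """Rotate a planar body-frame acceleration into the world frame (yaw th, radians)."""
    ax_w = ax_b * cos(th) - ay_b * sin(th)
    ay_w = ax_b * sin(th) + ay_b * cos(th)
    return ax_w, ay_w

# inside the main loop, after updating th from the gyro:
#   ax_w, ay_w = body_accel_to_world(ax_b, ay_b, th)
#   vx += ax_w * dt;  vy += ay_w * dt   # acceleration -> velocity
#   x  += vx * dt;    y  += vy * dt    # velocity -> position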
