# Problem with TurtleBot 2 when trying to implement odometry with an IMU

Hi

This is my first time asking a question here, so it may seem kind of silly.

I am working on my undergraduate graduation project, in which I am trying to achieve precise odometry by fusing information from an IMU and a Kinect.

After I had already finished programming the odometry with the IMU, I found that on the TurtleBot 2 I can directly obtain the position, orientation, and covariance by subscribing to the /odom topic. So it seems that I do not even need my own IMU odometry algorithm, which is pretty frustrating.

So my questions are:

1. How can I use my own IMU odometry on the TurtleBot 2? My plan is to publish the x, y, and yaw I compute to the /odom topic, and then call lookupTransform("/odom", "/base_footprint", rospy.Time.now()), but I think there will be a conflict between the pose I calculate and the one that is auto-generated by the Kobuki base.
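For reference, the dead-reckoning step in my own odometry looks roughly like the following (a minimal sketch in plain Python; the forward speed `v` and gyro `yaw_rate` stand in for the real sensor readings, and the resulting x, y, yaw is what I would publish):

```python
import math

def integrate_imu_odometry(pose, v, yaw_rate, dt):
    """One dead-reckoning step: advance (x, y, yaw) given forward
    speed v (m/s) and gyro yaw rate (rad/s) over dt seconds."""
    x, y, yaw = pose
    yaw += yaw_rate * dt
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    return (x, y, yaw)

# Example: drive straight for 10 s at 0.1 m/s, then turn in place 90 degrees.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = integrate_imu_odometry(pose, 0.1, 0.0, 0.1)
for _ in range(100):
    pose = integrate_imu_odometry(pose, 0.0, math.pi / 20, 0.1)
# pose is now roughly (1.0, 0.0, pi/2)
```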

2. My supervisor says I need to do something creative, so he thinks that, besides using robot_pose_ekf, I should further fuse the sensor information with an IMM (Interacting Multiple Model) algorithm, which I do not think I can implement. Are there any working examples of the IMM algorithm?
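To show what I have understood of IMM so far, here is a toy sketch in plain Python (not ROS code): a scalar state such as yaw tracked by two random-walk Kalman filters that differ only in process noise, mixed by the standard IMM steps. The noise values and transition probabilities are just assumptions for illustration:

```python
import math

def gaussian_pdf(x, mean, var):
    """Likelihood of x under a 1-D Gaussian with given mean and variance."""
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

class IMM1D:
    """Toy Interacting Multiple Model filter on a scalar state (e.g. yaw)
    with two random-walk models: a 'steady' one (small process noise)
    and a 'maneuvering' one (large process noise)."""

    def __init__(self, q=(1e-4, 1e-1), r=0.1):
        self.q = q                    # per-model process noise (assumed values)
        self.r = r                    # shared measurement noise
        self.trans = [[0.95, 0.05],   # mode transition probabilities
                      [0.05, 0.95]]
        self.mu = [0.5, 0.5]          # mode probabilities
        self.x = [0.0, 0.0]           # per-model state estimates
        self.P = [1.0, 1.0]           # per-model variances

    def update(self, z):
        n = 2
        # 1. Mixing: predicted mode probabilities and mixed initial conditions.
        c = [sum(self.trans[i][j] * self.mu[i] for i in range(n)) for j in range(n)]
        x0, P0 = [], []
        for j in range(n):
            w = [self.trans[i][j] * self.mu[i] / c[j] for i in range(n)]
            xm = sum(w[i] * self.x[i] for i in range(n))
            Pm = sum(w[i] * (self.P[i] + (self.x[i] - xm) ** 2) for i in range(n))
            x0.append(xm)
            P0.append(Pm)
        # 2. Per-model Kalman predict + update, collecting likelihoods.
        lik = []
        for j in range(n):
            P_pred = P0[j] + self.q[j]   # random-walk dynamics
            S = P_pred + self.r          # innovation variance
            K = P_pred / S               # Kalman gain
            self.x[j] = x0[j] + K * (z - x0[j])
            self.P[j] = (1.0 - K) * P_pred
            lik.append(gaussian_pdf(z, x0[j], S))
        # 3. Mode probability update from the likelihoods.
        mu_unnorm = [c[j] * lik[j] for j in range(n)]
        total = sum(mu_unnorm)
        self.mu = [m / total for m in mu_unnorm]
        # 4. Combined output estimate.
        return sum(self.mu[j] * self.x[j] for j in range(n))

imm = IMM1D()
est = 0.0
for _ in range(20):
    est = imm.update(1.0)  # feed a constant measurement; est converges toward 1.0
```

Is this the right structure, and does a working ROS-ready version of something like this exist anywhere?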

Thanks
