
Mr.Robot's profile - activity

2017-10-26 02:18:36 -0500 received badge  Famous Question (source)
2017-05-23 22:46:49 -0500 received badge  Notable Question (source)
2017-04-21 17:41:41 -0500 received badge  Popular Question (source)
2017-04-08 03:41:28 -0500 received badge  Enthusiast
2017-04-05 20:28:00 -0500 received badge  Supporter (source)
2017-04-01 02:18:38 -0500 asked a question Problem with TurtleBot 2 when trying to realize odometry with IMU

Hi

This is my first time asking a question here, so it may seem a bit silly.

I am working on my undergraduate graduation project (which I need to pass in order to graduate). In it, I am trying to build precise odometry by fusing information from an IMU and a Kinect.

After I finished programming the IMU odometry, I found that on the TurtleBot 2 I can already get the position, orientation and covariance directly by subscribing to the /odom topic. So it seems I do not even need my own IMU odometry algorithm, which is pretty frustrating.
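(For reference: the orientation in /odom is a quaternion, in pose.pose.orientation of nav_msgs/Odometry. For a ground robot you usually only need the yaw, which can be recovered with the standard conversion below; it is equivalent to the yaw component returned by tf.transformations.euler_from_quaternion. A minimal sketch:)

```python
import math

def yaw_from_quaternion(qx, qy, qz, qw):
    # Yaw (rotation about z) from the orientation quaternion found in
    # nav_msgs/Odometry pose.pose.orientation. Standard ZYX conversion.
    return math.atan2(2.0 * (qw * qz + qx * qy),
                      1.0 - 2.0 * (qy * qy + qz * qz))
```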

So

  1. How can I use my own IMU odometry on the TurtleBot 2? My plan is to publish x, y and yaw to the /odom topic and then call lookupTransform("/odom", "/base_footprint", rospy.Time.now()), but I think this will conflict with the pose that is generated automatically by the Kobuki base.

  2. My supervisor says I need to do something creative, so in addition to using robot_pose_ekf he wants me to fuse the sensor information with an IMM algorithm, which I do not think I can implement myself. Are there any working examples of this IMM algorithm?
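On question 1: the conflict is real — the Kobuki base node already publishes /odom and the odom→base_footprint transform, so a second publisher of the same frame will fight with it. If I remember correctly, the usual fix is to stop the base from publishing the transform (kobuki_node exposes a publish_tf parameter for this) and publish your own odometry and transform instead. The IMU odometry itself is just planar dead reckoning; a minimal sketch of one integration step, assuming you have a forward speed v (e.g. from the wheel encoders) and a yaw rate wz from the gyro:

```python
import math

def integrate_odom(x, y, yaw, v, wz, dt):
    # One planar dead-reckoning step.
    #   v  = forward speed [m/s]
    #   wz = yaw rate from the gyro [rad/s]
    #   dt = time step [s]
    yaw += wz * dt
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    return x, y, yaw
```

The resulting (x, y, yaw) is what you would pack into your own nav_msgs/Odometry message and TF broadcast in place of the Kobuki's.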
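On question 2: IMM (Interacting Multiple Model) runs a bank of Kalman filters in parallel under different motion models, mixes their estimates each cycle according to a Markov mode-transition matrix, reweights the modes by measurement likelihood, and outputs a probability-weighted combination. A minimal 1-D numpy sketch (not ROS-integrated; the matrices and noise values below are made-up illustrative choices, not tuned for a real robot):

```python
import numpy as np

def kf_step(x, P, z, F, Q, H, R):
    # Standard Kalman predict + update; also returns the measurement likelihood.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    L = np.exp(-0.5 * y.T @ np.linalg.inv(S) @ y) \
        / np.sqrt(np.linalg.det(2.0 * np.pi * S))
    return x_new, P_new, float(L)

def imm_step(xs, Ps, mu, z, models, trans):
    # xs, Ps: per-model states/covariances; mu: mode probabilities;
    # models: list of (F, Q, H, R); trans: Markov mode-transition matrix.
    M = len(models)
    c = trans.T @ mu                         # predicted mode probabilities
    xs0, Ps0 = [], []
    for j in range(M):                       # 1) mixing
        w = trans[:, j] * mu / c[j]          # mixing weights mu_{i|j}
        x0 = sum(w[i] * xs[i] for i in range(M))
        P0 = sum(w[i] * (Ps[i] + np.outer(xs[i] - x0, xs[i] - x0))
                 for i in range(M))
        xs0.append(x0); Ps0.append(P0)
    xs_new, Ps_new, Ls = [], [], np.zeros(M)
    for j, (F, Q, H, R) in enumerate(models):  # 2) per-model filtering
        xj, Pj, Lj = kf_step(xs0[j], Ps0[j], z, F, Q, H, R)
        xs_new.append(xj); Ps_new.append(Pj); Ls[j] = Lj
    mu_new = c * Ls                          # 3) mode probability update
    mu_new /= mu_new.sum()
    x_comb = sum(mu_new[j] * xs_new[j] for j in range(M))  # 4) combination
    return xs_new, Ps_new, mu_new, x_comb
```

Here the two "models" could share the same constant-velocity dynamics but differ in process noise (a quiet model for smooth motion, a noisy one for maneuvers); for the real project the filters would run over the full planar state and the measurements would come from the IMU and Kinect.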

Thanks