Output of robot_pose_ekf jumps uncontrollably [closed]

asked 2014-04-03 04:07:04 -0500 by Simon Harst

updated 2016-10-24 09:02:15 -0500 by ngrennan

I would like to fuse data from a magnetometer, wheel odometry and a GPS sensor using robot_pose_ekf. Since I would like to take advantage of the fact that both the magnetometer and the GPS are defined in an absolute reference frame, I switched to a publicly available fork by Clearpath Robotics.

I publish my wheel odometry on topic /odom:

        # Heading as a quaternion: a pure yaw rotation about z.
        dat = tf.transformations.quaternion_from_euler(0, 0, self.th)
        quaternion = Quaternion(dat[0], dat[1], dat[2], dat[3])
        odom = Odometry()
        odom.header.stamp = now
        odom.header.frame_id = "odom_combined"
        odom.child_frame_id = "base_footprint"
        odom.pose.pose.position.x = self.x
        odom.pose.pose.position.y = self.y
        odom.pose.pose.position.z = 0
        odom.pose.pose.orientation = quaternion
        # Near-zero variances: trust the wheel-odometry pose fully.
        odom.pose.covariance = [0.000001, 0, 0, 0, 0, 0,
                                0, 0.000001, 0, 0, 0, 0,
                                0, 0, 0.000001, 0, 0, 0,
                                0, 0, 0, 0.000001, 0, 0,
                                0, 0, 0, 0, 0.000001, 0,
                                0, 0, 0, 0, 0, 0.000001]
        # Huge variances: tell the filter to ignore the twist entirely.
        odom.twist.covariance = [99999, 0, 0, 0, 0, 0,
                                 0, 99999, 0, 0, 0, 0,
                                 0, 0, 99999, 0, 0, 0,
                                 0, 0, 0, 99999, 0, 0,
                                 0, 0, 0, 0, 99999, 0,
                                 0, 0, 0, 0, 0, 99999]
        self.odomPub.publish(odom)

My GPS data goes out on topic /gps:

        odom = Odometry()
        odom.header.stamp = rospy.Time.now()
        odom.header.frame_id = "odom_combined"
        (x, y) = self.converter.LatLong2UTM(self.lat, self.lon)
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        odom.pose.pose.position.z = 0

        dat = tf.transformations.quaternion_from_euler(0, 0, 0)
        odom.pose.pose.orientation = Quaternion(dat[0],dat[1],dat[2],dat[3])

        odom.child_frame_id = "base_footprint"
        odom.pose.covariance = [1, 0, 0, 0, 0, 0,  # covariance on gps_x
                               0, 1, 0, 0, 0, 0,    # covariance on gps_y
                               0, 0, 1, 0, 0, 0,    # covariance on gps_z
                               0, 0, 0, 99999, 0, 0,  # large covariance on rot x
                               0, 0, 0, 0, 99999, 0,  # large covariance on rot y
                               0, 0, 0, 0, 0, 99999]  # large covariance on rot z
        odom.twist.covariance = [99999, 0, 0, 0, 0, 0, 
                                 0, 99999, 0, 0, 0, 0, 
                                 0, 0, 99999, 0, 0, 0, 
                                 0, 0, 0, 99999, 0, 0, 
                                 0, 0, 0, 0, 99999, 0, 
                                 0, 0, 0, 0, 0, 99999] 
        self.pub.publish(odom)

and my compass data on /imu_data:

    im = Imu()
    ori = tf.transformations.quaternion_from_euler(0, 0, math.radians(bearing))
    qua = Quaternion(ori[0],ori[1],ori[2],ori[3])
    im.header.frame_id = "imu_data"
    im.header.stamp = rospy.Time.now()
    im.orientation = qua
    im.orientation_covariance = [0.000001, 0, 0,  
                                 0, 0.000001, 0,  
                                 0, 0, 0.000001]  

    # Switch others off for now.
    im.angular_velocity_covariance = [99999, 0, 0,
                                      0, 99999, 0,
                                      0, 0, 99999] 


    im.linear_acceleration_covariance = [99999, 0, 0,
                                         0, 99999, 0,
                                         0, 0, 99999]     

    self.imupub.publish(im)
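As a sanity check on the bearing-to-quaternion conversion: a pure z-axis rotation reduces to half-angle sine/cosine, so the tf call above can be verified without ROS installed. A dependency-free sketch (the helper name is illustrative, not from any library):

```python
import math

def yaw_to_quaternion(yaw):
    """Quaternion (x, y, z, w) for a rotation of `yaw` radians about z.

    For a pure yaw rotation this matches the output of
    tf.transformations.quaternion_from_euler(0, 0, yaw).
    """
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

# A compass bearing of 90 degrees becomes a pure z rotation:
q = yaw_to_quaternion(math.radians(90))
```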

I've been toying around with the covariance values a lot, but they don't seem to have any influence on my particular problem.

Furthermore, I've had problems with robot_pose_ekf giving me transformation errors, even when I ... (more)


Closed for the following reason: the question is answered, right answer was accepted by Simon Harst
close date 2014-04-08 01:20:59

2 Answers


answered 2014-04-06 11:44:48 -0500 by Tom Moore

I haven't formally announced this yet, but you might want to try this:

http://wiki.ros.org/robot_localization

It will let you choose to only use parts of your sensor messages.
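The selective fusion mentioned above is configured with per-sensor boolean vectors. A minimal launch sketch follows; parameter names are taken from the robot_localization wiki, and the 15-element vector layout matches later releases, so treat the exact shape as an assumption for the 2014 version:

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node"
        name="ekf_localization" clear_params="true">
    <param name="odom0" value="/odom"/>
    <param name="imu0" value="/imu_data"/>
    <!-- Per-sensor boolean vector:
         [x, y, z, roll, pitch, yaw,
          vx, vy, vz, vroll, vpitch, vyaw,
          ax, ay, az]
         Fuse only x, y and yaw from the wheel odometry... -->
    <rosparam param="odom0_config">[true,  true,  false,
                                    false, false, true,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false]</rosparam>
    <!-- ...and only yaw from the compass. -->
    <rosparam param="imu0_config">[false, false, false,
                                   false, false, true,
                                   false, false, false,
                                   false, false, false,
                                   false, false, false]</rosparam>
  </node>
</launch>
```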


Comments

Wow, that looks really cool at first glance! Will look into it promptly and report back.

Simon Harst ( 2014-04-06 20:13:47 -0500 )

Kudos! Works perfectly out of the box. One more question, though: if I understand your code correctly, the filter gets initialized with a zero value and is then updated differentially. Is there a way of initializing the pose with my GPS and the orientation with the magnetometer?

Simon Harst ( 2014-04-06 22:51:14 -0500 )

Ha! It's one of the very first things on my list. I will let you know when it's implemented. I plan to parameterize the differential updating.

Tom Moore ( 2014-04-07 03:14:13 -0500 )

In the meantime, you can throw together a quick callback for your GPS and magnetometer and use them to manually set the initial state of your robot by issuing a PoseWithCovarianceStamped message to the set_pose topic. It's there to allow manual resets of the entire state.

Tom Moore ( 2014-04-07 03:40:37 -0500 )
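The one-shot initializer described in the comment above can be sketched without the ROS plumbing. The helper below only assembles the numbers; in a real node they would be copied into a geometry_msgs/PoseWithCovarianceStamped and published once on the set_pose topic. The function and argument names are illustrative, not part of any API:

```python
import math

def initial_state(utm_x, utm_y, bearing_deg):
    """Numbers for a one-shot set_pose initialization: position from a
    GPS fix already converted to UTM, orientation from the magnetometer
    bearing. The rospy publishing step is intentionally omitted.
    """
    yaw = math.radians(bearing_deg)
    # Pure z rotation as a quaternion (x, y, z, w).
    quat = (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))
    # Row-major 6x6 pose covariance: trust x, y and yaw, ignore the rest.
    cov = [0.0] * 36
    for i, var in enumerate([1.0, 1.0, 99999.0, 99999.0, 99999.0, 0.01]):
        cov[i * 6 + i] = var
    return (utm_x, utm_y), quat, cov
```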

You, kind Sir, are a lifesaver.

Simon Harst ( 2014-04-07 20:05:56 -0500 )

Updated. I added new launch file parameters that let you control which variables from each sensor are handled differentially. If you have any issues/feedback, feel free to contact me via email (available on my Github account).

Tom Moore ( 2014-04-08 01:03:48 -0500 )

answered 2014-04-03 05:24:42 -0500 by demmeln

updated 2014-04-04 04:01:08 -0500

Are you sure that the transformation between the imu_data frame and base_footprint is correct? In particular, the rotation is crucial. I'm also not sure that switching off the angular velocity and linear acceleration does what you think it does. robot_pose_ekf is not a general-purpose EKF; it was written for a specific purpose on the PR2 robot. I have no clue about the Clearpath fork, though.

Edit: I thought I had updated my answer before, but it seems it got lost...

I strongly suggest looking at previous questions about robot_pose_ekf here on answers.ros.org. What robot_pose_ekf does, and what it doesn't do, has been discussed before. In particular, the way it handles covariances is not necessarily what you expect. It is probably also best to check the code a little to see which parts of your input messages are read and how they are used.


Comments

Thank you very much for your reply. The IMU has no effect on the jumping; in fact, I can switch it off completely and the jumping remains. Is there any other way I can signal to the filter that I don't have data for angular velocity and linear acceleration?

Simon Harst ( 2014-04-03 20:23:01 -0500 )


Stats

Asked: 2014-04-03 04:07:04 -0500

Seen: 481 times

Last updated: Apr 06 '14