How to deal with an offset in precise GPS data?

asked 2014-06-24 06:59:36 -0500 by Simon Harst

I've got a robot setup with a very precise differential GPS (centimeter level). For technical reasons, I can't attach the GPS directly over the rotational center of the robot, but rather need to make an offset in both x- and y-direction.

There are two things I'd like to do:

First, infer the orientation (heading) of the robot from the measurements of the GPS

Second, infer the position of the robot on the UTM grid from the position of the GPS

Obviously, the first task is the harder one. Once it is solved, the second one is trivial.

[Figure: a differential-drive robot with the GPS mounted at offset (delta_x, delta_y) from the rotational center; green dots mark the last three GPS measurements; alpha is the robot's heading.]

I hope the picture illustrates the situation a bit. My robot has a differential drive, so the rotational center is located between the two wheels. My GPS is attached to a rod with offsets delta_x and delta_y (in the picture, delta_x is negative). The green dots stand for the last three GPS measurements that were recorded. What I'd like to infer is the angle alpha.
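To make the geometry concrete, here is the forward model the picture implies, as a small sketch in plain Python (the function and argument names are mine, not from the question): the GPS antenna position is the rotational center plus the body-frame offset rotated by the heading alpha.

```python
import math

def gps_from_pose(px, py, alpha, delta_x, delta_y):
    """Forward model: world position of the GPS antenna, given the robot's
    rotational center (px, py), heading alpha (radians), and the body-frame
    mounting offset (delta_x, delta_y)."""
    gx = px + math.cos(alpha) * delta_x - math.sin(alpha) * delta_y
    gy = py + math.sin(alpha) * delta_x + math.cos(alpha) * delta_y
    return gx, gy
```

Inferring alpha amounts to inverting this model from a sequence of measured (gx, gy) positions.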

From what I've tried so far, I gather that there is no unique solution if I take only the last two measurements into account. Therefore, I am thinking about using the last three measurements. They lie on exactly one circle with center M. This means that my robot's rotational center is currently moving on a circle about the same center (with a different radius).
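The circle through the last three fixes can be found with the standard circumcenter formula; a sketch in plain Python (the function name is mine). Note this only yields the center M - recovering alpha still requires combining M with the offset geometry - and with centimeter-level noise the three-point circle is quite sensitive, which is one argument for a filtering approach instead.

```python
def circumcenter(p1, p2, p3):
    """Center of the unique circle through three non-collinear 2-D points
    (e.g. three GPS fixes in UTM coordinates)."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        raise ValueError("points are (nearly) collinear - no unique circle")
    ux = ((ax * ax + ay * ay) * (by - cy)
          + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx)
          + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return ux, uy
```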

Apparently, my geometry skills simply aren't good enough - I've not been able to compute a solution. The solution is fairly simple if either delta_x or delta_y is equal to zero - but with both values != 0, I don't get anywhere.

Can anybody give me a hint whether this problem is solvable and how you'd go about solving it? At first glance it sounds like a fairly common use of a GPS.


2 Answers


answered 2014-06-28 08:48:19 -0500

Have a look at this blog. It appears the author describes in great detail how to solve exactly your problem using an Extended Kalman Filter (EKF).
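For illustration, here is a minimal sketch of that idea (my own code, not the blog's): an EKF with state [x, y, theta], a differential-drive prediction step, and a measurement model that accounts for the antenna's body-frame offset. All names, dimensions, and noise values are illustrative assumptions.

```python
import numpy as np

def ekf_step(x, P, u, z, dt, delta, Q, R_meas):
    """One EKF predict/update cycle.
    State x = [px, py, theta] (rotational center and heading),
    control u = (v, omega) from wheel odometry,
    measurement z = GPS antenna position in the world frame,
    delta = (dx, dy) antenna offset in the robot's body frame."""
    v, w = u
    px, py, th = x

    # Predict: differential-drive motion model.
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q

    # Update: the GPS measures the antenna, not the rotational center,
    # so the expected measurement is the center plus the rotated offset.
    dx, dy = delta
    c, s = np.cos(x_pred[2]), np.sin(x_pred[2])
    h = x_pred[:2] + np.array([c * dx - s * dy, s * dx + c * dy])
    H = np.array([[1.0, 0.0, -s * dx - c * dy],
                  [0.0, 1.0,  c * dx - s * dy]])
    y = z - h                       # innovation
    S = H @ P_pred @ H.T + R_meas   # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```

Because the expected measurement h depends on theta through the offset term, successive GPS fixes make the heading observable while the robot is moving.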


Comments

Thanks a lot for your answer and for clarifying that tf is not the way to go. By now I'm fairly certain that combining the GPS with a Kalman filter is the right approach. In the blog post (as far as I understand it), the orientation of the robot is only updated differentially from a given theta_0.

Simon Harst (2014-07-02 04:15:56 -0500)

It's dawning on me that I can't have both: absolute orientation from the GPS and an offset in the data. I'll accept your answer because I think combining the GPS with measurements from other sensors in a Kalman filter is the closest I can get to answering my problem! Thanks again!

Simon Harst (2014-07-02 04:18:01 -0500)

answered 2014-06-24 07:48:33 -0500 by ct2034, updated 2014-06-27 04:55:37 -0500

Have a look at the tf package: http://wiki.ros.org/tf
It seems to be exactly what you need:

tf is a package that lets the user keep track of multiple coordinate frames over time. tf maintains the relationship between coordinate frames in a tree structure buffered in time, and lets the user transform points, vectors, etc between any two coordinate frames at any desired point in time.

Good luck with your project. Sounds interesting :)

EDIT: My suggestion:

With tf you can set up a "tree" of "frames". You have to set up the relation between the GPS position and the car once, as described here: http://wiki.ros.org/tf/Tutorials/Addi... . Then you have to publish the GPS position to tf http://wiki.ros.org/tf/Tutorials/Writ... and listen to its transformation: http://wiki.ros.org/tf/Tutorials/Writ... . This will contain your angle alpha as a quaternion http://en.wikipedia.org/wiki/Quaternion , which can be converted relatively easily to a normal angle.

At least, that's how I think it works. The upside is that tf does all the work for you. It was originally designed for larger numbers of frames, but it should work for your problem, too.
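As a footnote to the quaternion-to-angle step: for a planar robot, the "normal angle" is the yaw, which can be extracted with a few lines of plain Python (in ROS 1, `tf.transformations.euler_from_quaternion` does the same job). A minimal sketch, assuming a unit quaternion in (x, y, z, w) order:

```python
import math

def yaw_from_quaternion(qx, qy, qz, qw):
    """Rotation about the vertical (z) axis encoded in a unit quaternion,
    using the ZYX (yaw-pitch-roll) convention; result in radians in [-pi, pi]."""
    return math.atan2(2.0 * (qw * qz + qx * qy),
                      1.0 - 2.0 * (qy * qy + qz * qz))
```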


Comments

Thanks, I will look at it again - at first try I didn't manage to get the rotations right. I've written some code myself now which (kind of) works, and I will post it here when I've tested it and cleaned it up. Any help with setting up tf for this problem would still be most appreciated.

Simon Harst (2014-06-24 09:15:08 -0500)

Okay, looking at it again I remember why I didn't manage to set it up: I don't see any way that tf would calculate the angle alpha for me. Of course I can write a transform from gps-->robot_base myself, but for that I still need to know alpha, which brings me back to the question above. Am I missing something?

Simon Harst (2014-06-27 01:19:20 -0500)

Please see my edit.

ct2034 (2014-06-27 04:55:55 -0500)

tf won't help here, as the orientation has to be estimated from position-only measurements.

Stefan Kohlbrecher (2014-06-28 08:43:02 -0500)

Stats

Seen: 774 times

Last updated: Jun 28 '14