
# Get device position from IMU

Hi there, this is my first time dealing with IMUs. I have a device with an IMU that travelled along a path. I have accelerometer, gyroscope and magnetometer measurements over time, plus an initial position in lat/lon, and I need to estimate the lat/lon for the rest of the trajectory. I know this will accumulate a large error due to drift and other effects, but I'm still curious to see the results. My data is already calibrated, and I am using the Madgwick filter to obtain the orientation as a quaternion. What I am having a hard time with is figuring out what to do next.

I took the quaternion, computed the rotation matrix R from it, and multiplied the acceleration by the inverse of the rotation matrix to rotate it into the world frame (I had already removed the gravity component):

```python
# R is orthogonal, so R.T is its inverse; acc is a row vector (or N x 3 array)
linear_acceleration = np.matmul(acc, R.T)
```
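For reference, here is a minimal sketch of that rotation step, assuming a unit quaternion in (w, x, y, z) order that rotates body-frame vectors into the world frame (conventions vary between Madgwick implementations, so check which frame and component order yours uses):

```python
import numpy as np

def quat_to_rotmat(q):
    """Rotation matrix from a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# hypothetical sample: 90-degree rotation about the z axis maps x onto y
theta = np.pi / 2
q = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])
acc_body = np.array([1.0, 0.0, 0.0])
R = quat_to_rotmat(q)
acc_world = R @ acc_body  # equivalently acc_body @ R.T for row vectors
```

If the rotated gravity vector does not come out close to (0, 0, 9.81) before you subtract it, the quaternion convention is probably the opposite of what you assumed.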

As for the lat/lon values, I converted these to ECEF:

```python
def lla_to_ecef(lat, lon, alt):
    ecef = pyproj.Proj(proj='geocent', ellps='WGS84', datum='WGS84')
    lla = pyproj.Proj(proj='latlong', ellps='WGS84', datum='WGS84')
    # pyproj expects lon, lat order for the latlong projection
    x, y, z = pyproj.transform(lla, ecef, lon, lat, alt, radians=False)
    return x, y, z
```
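As a side note, `pyproj.transform` is deprecated in pyproj 2+. A sketch of the same round trip with the `Transformer` API, assuming EPSG:4979 for geodetic lat/lon/alt and EPSG:4978 for ECEF (both WGS84):

```python
from pyproj import Transformer

# always_xy=True fixes the axis order to lon, lat regardless of CRS definition
to_ecef = Transformer.from_crs("EPSG:4979", "EPSG:4978", always_xy=True)
to_lla = Transformer.from_crs("EPSG:4978", "EPSG:4979", always_xy=True)

lat, lon, alt = 45.0, 7.0, 100.0          # hypothetical starting fix
x, y, z = to_ecef.transform(lon, lat, alt)  # degrees/metres -> ECEF metres
lon2, lat2, alt2 = to_lla.transform(x, y, z)  # and back
```

Creating the `Transformer` once and reusing it is also much faster than calling `pyproj.transform` per sample.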


and only after integrating the position in ECEF did I convert it back to lat/lon:

```python
def ecef_to_lla(p):
    ecef = pyproj.Proj(proj='geocent', ellps='WGS84', datum='WGS84')
    lla = pyproj.Proj(proj='latlong', ellps='WGS84', datum='WGS84')
    # pass the x, y, z components separately, not the whole array three times
    lon, lat, alt = pyproj.transform(ecef, lla, p[0], p[1], p[2], radians=False)
    return np.array([lat, lon, alt])
```


I ran some tests, and while some results are not awful, others estimate a completely wrong path (some head off in a direction 180° from the true one).

I also tried estimating the orientation and then the position of the device using just the yaw value. I converted the quaternion to Euler angles and then used the "destination point given distance and bearing from start point" formula found here: http://www.movable-type.co.uk/scripts....

```python
EARTH_RADIUS_KM = 6371.0

def destination_point(lat, lon, distance_m, bearing_deg):
    # work in radians for both coordinates, not just latitude
    lat1 = math.radians(lat)
    lon1 = math.radians(lon)
    # note: my original code added 180 degrees here, which reverses the
    # bearing and may explain the paths heading the opposite way
    bearing = math.radians(bearing_deg % 360)  # keep bearing in [0, 360)
    d = (distance_m * 0.001) / EARTH_RADIUS_KM  # angular distance

    lat2 = math.asin(math.sin(lat1) * math.cos(d)
                     + math.cos(lat1) * math.sin(d) * math.cos(bearing))
    lon2 = lon1 + math.atan2(math.sin(bearing) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))

    # back to degrees; normalise longitude to -180..180
    return math.degrees(lat2), (math.degrees(lon2) + 540) % 360 - 180
```


Again, some examples give OK results, but others output headings all over the place and in the wrong direction. What am I missing? What is the best strategy? Should I also use the pitch and roll angles together with yaw for more accurate results? Do I need to adjust my angles/quaternion?
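For what it's worth, the usual structure for this kind of dead reckoning is: rotate each acceleration sample into the world frame, integrate twice to get a displacement in a local tangent (ENU) frame, then map the metre offsets back to lat/lon with a flat-earth approximation. A minimal sketch, assuming `acc_world` already holds gravity-free world-frame samples (N x 3, east/north/up) at a fixed sample interval:

```python
import numpy as np

EARTH_RADIUS_M = 6371000.0

def dead_reckon(lat0, lon0, acc_world, dt):
    """Double-integrate ENU acceleration (N x 3, m/s^2) sampled every dt
    seconds and map the east/north displacement to lat/lon degrees."""
    vel = np.cumsum(acc_world * dt, axis=0)        # m/s  (Euler integration)
    pos = np.cumsum(vel * dt, axis=0)              # metres, ENU offsets
    dlat = np.degrees(pos[:, 1] / EARTH_RADIUS_M)  # north offset -> latitude
    dlon = np.degrees(pos[:, 0] /
                      (EARTH_RADIUS_M * np.cos(np.radians(lat0))))  # east -> lon
    return lat0 + dlat, lon0 + dlon

# hypothetical data: constant 1 m/s^2 eastward acceleration for 10 s at 100 Hz
acc = np.zeros((1000, 3))
acc[:, 0] = 1.0
lats, lons = dead_reckon(45.0, 7.0, acc, 0.01)
```

Because the double integration amplifies any residual bias quadratically, even a few milli-g of uncorrected gravity or accelerometer bias will bend the path badly within seconds, which is consistent with the behaviour you describe.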



I think your best bet might be navsat_transform_node. It uses the IMU heading and the initial GPS fix to create a datum for the rest of the GPS data that is published. You can then fuse the IMU and GPS data with the extended Kalman filter provided in the robot_localization package, where the GPS updates would hopefully correct the estimated position.

The EKF performance will probably be pretty bad without additional odometry data (wheel encoders or visual odometry), but navsat_transform_node will at least publish odometry in the robot's coordinate frame, which should make what you're trying to do a little easier. Hopefully that helps.
