It is possible, certainly. The quality of the output will be heavily dependent on the input, however.

Have you read through the wiki? I realize it's a lot to take in, but if you had some feedback as to which parts could use some clarification, I'd be happy to update them.

One thing to be aware of with your IMU is that the state estimation nodes in robot_localization assume that the IMU data is reported in an ENU frame, whereas IMUs commonly report data in NED. What you really want is for your rotation angles to increase in the correct direction, i.e., for the signs to match the ENU convention (counter-clockwise yaw positive).
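
In case it helps, here is a minimal sketch of the kind of "sign fixer" node I mean. The topic names are hypothetical and the particular sign flips are only an example; the right set depends entirely on your device's conventions, so check its datasheet and verify by hand that yaw increases as the IMU rotates counter-clockwise.

    #!/usr/bin/env python
    # Minimal sketch: republish IMU data with flipped signs so the angles
    # increase the way robot_localization expects. Topic names are made up,
    # and the specific flips below are an example only -- adjust for your device.
    import rospy
    import tf.transformations as tft
    from sensor_msgs.msg import Imu

    def imu_callback(msg):
        out = msg
        # Convert the orientation to roll/pitch/yaw, flip the offending
        # signs, and rebuild the quaternion.
        roll, pitch, yaw = tft.euler_from_quaternion([
            msg.orientation.x, msg.orientation.y,
            msg.orientation.z, msg.orientation.w])
        q = tft.quaternion_from_euler(roll, -pitch, -yaw)
        out.orientation.x, out.orientation.y, out.orientation.z, out.orientation.w = q
        # The angular velocities need the matching sign flips.
        out.angular_velocity.y = -msg.angular_velocity.y
        out.angular_velocity.z = -msg.angular_velocity.z
        pub.publish(out)

    if __name__ == '__main__':
        rospy.init_node('imu_sign_fixer')
        pub = rospy.Publisher('imu/data_enu', Imu, queue_size=10)
        rospy.Subscriber('imu/data', Imu, imu_callback)
        rospy.spin()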

Just an FYI: if you integrate GPS data, your position estimate will be subject to discrete jumps. The accelerometers will smooth that out a little bit, but they won't eliminate it.

Do you have an altimeter, or are you going to use the GPS altitude?

Finally, if you can find a lightweight ARM-friendly visual odometry node, then yes, having more data to fuse is always better. I haven't had much luck on that front, however, as many of the packages are optimized for x86/x86-64 CPUs.

EDIT 1 (in response to EDIT 1 above): Which version of the software are you using? I'll look into the template covariance issue. It used to be that every value in the covariance matrix had to be written with a decimal point (e.g., 0.0 rather than 0), but the latest version (not yet released) also accepts values without decimal points.

When you say "it seems like it has a lot of noise," can you quantify the error/noise you're seeing? If you can, it would be very helpful if you could post (1) a sample message for each input to ekf_localization_node, and (2) your launch file for ekf_localization_node.
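
On quantifying the noise: one rough way is to record the filter output while the robot sits still and look at the spread of the position estimate. This is just a sketch; it assumes you dump the topic to CSV with something like rostopic echo -p /odometry/filtered > filtered.csv, and the flattened column names below come from that format.

    # Rough noise estimate from a stationary capture of /odometry/filtered.
    # Assumes a CSV produced by "rostopic echo -p" (hence the column names).
    import pandas as pd

    df = pd.read_csv('filtered.csv')
    x = df['field.pose.pose.position.x']
    y = df['field.pose.pose.position.y']

    print('x std dev (m):', x.std())
    print('y std dev (m):', y.std())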

EDIT 2 (in response to edits 2 and 3 above):

First, your rectangular plot looks like it has 90-degree angles in it to me. Make sure you use axis equal when you plot things in MATLAB. Remember: the /odometry/gps data will not align perfectly with your X and Y axes when you plot it; it's going to be based on whatever yaw value the IMU was reporting when the GPS data came in. In other words, the first MATLAB plot looks solid to me.
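
(For anyone reading along who prefers Python to MATLAB: the equivalent of axis equal is sketched below. The file name and column names are assumptions based on a rostopic echo -p dump of /odometry/gps.)

    # Equal-aspect plot of the GPS odometry, so right angles actually look
    # like right angles (the matplotlib equivalent of MATLAB's "axis equal").
    import matplotlib.pyplot as plt
    import pandas as pd

    df = pd.read_csv('gps_odom.csv')  # e.g. from: rostopic echo -p /odometry/gps
    plt.plot(df['field.pose.pose.position.x'],
             df['field.pose.pose.position.y'], '.')
    plt.axis('equal')
    plt.xlabel('x (m)')
    plt.ylabel('y (m)')
    plt.show()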

Second, I'm a bit confused as to your setup. What is your odom topic (the odom0 input source)? I don't see you feeding the /odometry/gps topic back into the ekf_localization_node, which is how it's intended to be used.

Also, can you post either a bag file of your run, or perhaps a printout of the /odometry/filtered topic?