answers.ros.org Q&A
Tracking a glove using robot_localization package.

asked 2018-06-06 20:19:13 -0500

babazaroni

I would like to track a glove that has an IMU attached. Using another method, such as a fully convolutional network working with an RGB-D camera, occasional reliable estimates of the glove's position in 3D can be made. The IMU would be used to fill in the gaps between the more accurate estimates. So, I'm wondering how to configure the robot_localization package to do this.

Can the occasional good estimate of the position come into the package's ekf node via an odom message? Or does this node require a more regular stream of position data, like the messages from a wheel odometry source?

Or, should the occasional good estimate of position go to the navsat node provided by the robot_localization package, which handles occasional position updates like those a GPS would provide? The navsat node then feeds the ekf filter node via odom messages. Are these odom messages from the navsat node continuous, or do you get just one for each occasional good position estimate?

Another issue to think about: since the hand can move freely in 3D space, the orientation of the IMU will not be a clue to its heading. However, the IMU linear acceleration values will indicate the heading. Can the ekf node use these values directly to deduce heading, or does heading need to be calculated before feeding the ekf node?

Thanks for any thoughts on this.


2 Answers


answered 2018-06-07 08:01:08 -0500

Tom Moore

Can the occasional good estimate of the position come into the package's ekf node via an odom message? Or does this node require a more regular stream of position data, like the messages from a wheel odometry source?

It depends on the quality and frequency of the pose data. The real challenge is going to be using the IMU data. We don't have bias estimation in the filter for IMUs, so the accelerations will rapidly integrate into non-trivial velocities. The pose updates will limit the effect of this, but my experience has been that just pose + IMU is not a great use case. YMMV.
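To see why the lack of IMU bias estimation matters, note that even a small constant accelerometer bias double-integrates into position error that grows quadratically with time. A minimal sketch (the 0.05 m/s² bias is a made-up illustrative value, not a property of any particular IMU):

```python
# Sketch: a constant accelerometer bias b integrates to a velocity
# error of b*t and a position error of 0.5*b*t^2, which is why
# occasional pose updates are needed to rein the filter back in.

def drift_from_bias(bias, t):
    """Velocity and position error from a constant acceleration bias after t seconds."""
    velocity_error = bias * t            # v = b * t
    position_error = 0.5 * bias * t**2   # x = (1/2) * b * t^2
    return velocity_error, position_error

for t in (1.0, 5.0, 10.0):
    v, x = drift_from_bias(0.05, t)
    print(f"after {t:4.1f} s: velocity error {v:.3f} m/s, position error {x:.3f} m")
```

After only 10 s, a 0.05 m/s² bias already amounts to 2.5 m of position drift, so the more frequent the pose corrections, the better.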

Or, should the occasional good estimate of position go to the navsat node provided by the robot_localization package, which handles occasional position updates like those a GPS would provide? The navsat node then feeds the ekf filter node via odom messages. Are these odom messages from the navsat node continuous, or do you get just one for each occasional good position estimate?

No reason to bring navsat_transform_node into the picture. It would just end up sending pose data to the EKF at the same frequency as the sensor.

Another issue to think about: since the hand can move freely in 3D space, the orientation of the IMU will not be a clue to its heading. However, the IMU linear acceleration values will indicate the heading. Can the ekf node use these values directly to deduce heading, or does heading need to be calculated before feeding the ekf node?

The latter. The filter doesn't do anything especially clever with sensor inputs beyond limiting which variables get fused and applying coordinate transformations. If you want to infer some new bit of orientation data from the accelerometers, you can either write a new node or modify your IMU driver to put that data into the orientation fields.
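As a sketch of what such a node might compute: when the sensor is near-stationary, the measured acceleration is dominated by gravity, so roll and pitch can be recovered from the accelerometer alone (yaw/heading is not observable this way without magnetometers). The function below is a hypothetical helper, and the axis convention (x forward, y left, z up) is an assumption:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Roll and pitch (radians) from a single accelerometer reading,
    assuming the sensor is near-stationary so the reading is dominated
    by gravity. Yaw cannot be recovered from the accelerometer alone."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

# Level sensor: gravity entirely on the z axis, so roll and pitch are ~0.
roll, pitch = tilt_from_accel(0.0, 0.0, 9.81)
```

A node built around this could stamp the result into the orientation field of a sensor_msgs/Imu message before the EKF consumes it, but note this only holds while the glove's own acceleration is small compared to gravity.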


answered 2018-06-07 07:59:34 -0500

stevejp

Can the occasional good estimate of the position come into the package's ekf node via an odom message? Or does this node require a more regular stream of position data, like the messages from a wheel odometry source?

Yes, you can fuse your occasional good position estimate via an odom message, or just a pose message if you aren't getting velocity estimates from your RGB-D camera. In the case you described, your pose will drift between RGB-D camera measurements, so the faster you can send robot_localization the RGB-D camera odometry, the better.
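A configuration along these lines might look like the sketch below. The topic names (/glove/pose, /glove/imu) are placeholders, and which fields you enable in each `_config` matrix depends on your setup:

```yaml
# Sketch of an ekf_localization_node configuration for this use case.
frequency: 30
two_d_mode: false          # the glove moves freely in 3D

pose0: /glove/pose         # occasional RGB-D pose estimates
pose0_config: [true,  true,  true,    # x, y, z
               false, false, false,   # roll, pitch, yaw
               false, false, false,   # vx, vy, vz
               false, false, false,   # vroll, vpitch, vyaw
               false, false, false]   # ax, ay, az

imu0: /glove/imu
imu0_config: [false, false, false,
              true,  true,  true,     # orientation from the IMU's AHRS
              false, false, false,
              true,  true,  true,     # angular velocity
              true,  true,  true]     # linear acceleration
imu0_remove_gravitational_acceleration: true
```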

Or, should the occasional good estimate of position go to the navsat node provided by the robot_localization package, which handles occasional position updates like those a GPS would provide? The navsat node then feeds the ekf filter node via odom messages. Are these odom messages from the navsat node continuous, or do you get just one for each occasional good position estimate?

The only purpose the navsat transform node serves is to convert GPS measurements (latitude/longitude) into map-frame x-y odometry. If you aren't using a GPS, there is no reason to use the navsat transform node.

Another issue to think about: since the hand can move freely in 3D space, the orientation of the IMU will not be a clue to its heading. However, the IMU linear acceleration values will indicate the heading. Can the ekf node use these values directly to deduce heading, or does heading need to be calculated before feeding the ekf node?

What IMU are you using? If your IMU is properly calibrated and has an internal AHRS, it should provide a full, globally referenced (e.g., ENU/NED) orientation estimate. If your IMU doesn't have an internal AHRS, you could use the raw sensor measurements (i.e., accelerometers, gyros, magnetometers) to obtain an orientation estimate with another filtering node (e.g., a Madgwick filter) prior to fusing that orientation estimate in robot_localization. If you only fuse accelerometers, r_l won't be able to deduce a full orientation estimate.


Comments

The IMU is a BNO055, which has an AHRS. That section of my post reflected a poor understanding of what the filter does. I'll just have to experiment with synthetic data and see what the filter puts out.

babazaroni ( 2018-06-07 10:22:42 -0500 )



Stats

Asked: 2018-06-06 20:19:13 -0500

Seen: 54 times

Last updated: Jun 07 '18