
Optitrack Motion Capture system and robot localization in Nav2 stack

asked 2022-09-23 07:59:11 -0600 by ljaniec

updated 2022-10-06 12:48:26 -0600

Hello,

I'm looking for tutorials, documentation, code/configuration snippets, example projects, and other sources of information on how to incorporate an external motion capture system (OptiTrack cameras + Motive 3.0.1) as a highly reliable (99% of the time) localization source in Nav2-based motion control of a TurtleBot 2 (currently upgraded to ROS 2 Galactic).

To connect Motive with ROS 2, I am using this project:

I can get the locations of the markers from the /markers topic, as in this Getting Started tutorial, but I'm not sure how to combine this information with the robot's Nav2 stack.

Some mentions of it are here and (maybe) here (as in fused, locally accurate, smooth odometry built from the data provided by N odometry sensor inputs).

This tutorial is a bit hard to follow, but maybe it could be the basis of a solution for MoCap-supported localization?

I think it should be something similar to this Q&A, this, or this "Integrating GPS Data" tutorial for robot_localization, but for ROS 2. I would like some guidance here, though.
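If the robot_localization route (as in the GPS tutorial mentioned above) is taken, the mocap pose could be fused as a pose0 input. Below is a minimal sketch, assuming the RigidBody pose is republished as geometry_msgs/PoseWithCovarianceStamped on a /mocap_pose topic; the topic names and the 30 Hz rate are placeholders, not values from this thread:

```yaml
ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true
    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: map            # fuse in the map frame, like the GPS example

    odom0: /odom                # wheel odometry from the TurtleBot base
    odom0_config: [false, false, false,   # x, y, z
                   false, false, false,   # roll, pitch, yaw
                   true,  true,  false,   # vx, vy, vz
                   false, false, true,    # vroll, vpitch, vyaw
                   false, false, false]   # ax, ay, az

    pose0: /mocap_pose          # pose derived from the OptiTrack rigid body
    pose0_config: [true,  true,  false,
                   false, false, true,
                   false, false, false,
                   false, false, false,
                   false, false, false]
```

Since pose0 expects PoseWithCovarianceStamped, a small relay node would be needed to republish the RigidBody pose with a covariance attached.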

Other related things I found:

Please advise, any hints are welcome!

Best

Łukasz Janiec


Comments

Is it generating pose or just x,y,z location? Approximately how often does this system generate a current pose estimate? What do you think the accuracy is?

Mike Scheutzow (2022-10-08 09:57:26 -0600)

It uses this RigidBody.msg, that's it:

    std_msgs/Header header
    uint32 frame_number
    string rigid_body_name
    Marker[] markers
    geometry_msgs/Pose pose

OptiTrack gave me 2350 frames in a 20 s rosbag, so it is ~117 Hz (close to Motive's 120 Hz default). Correct me if I am wrong. For my supervisory controller for multiple AMRs I need a current pose estimate at around 10-20 Hz. I am not sure how often it should be sent to robot_localization in Nav2. After calibration, Motive 3 reports an error of 0.5 mm per marker; I'm not sure how strongly this will correlate with the robot's overall localization error (each robot has 4 markers).

ljaniec (2022-10-10 01:28:19 -0600)
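The frame count above works out as follows (a quick sketch; the 15 Hz target is just an illustrative value within the 10-20 Hz range mentioned):

```python
# Rough rate math for the mocap stream, using the numbers from the comment above.
frames = 2350          # frames captured in the rosbag
duration_s = 20.0      # rosbag length in seconds

rate_hz = frames / duration_s
print(rate_hz)         # 117.5 -> close to Motive's 120 Hz default

# To feed a ~10-20 Hz supervisor, one simple option is to keep every Nth frame:
target_hz = 15.0
decimation = round(rate_hz / target_hz)
print(decimation)      # keep 1 frame out of ~8
```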

1 Answer


answered 2022-10-10 18:56:29 -0600

Mike Scheutzow

With a high-quality global localization source like you describe, you don't really need standard odometry techniques (or a Kalman filter). You could just use the pose directly. The TF tree could look like this: map -> optitrack -> base_link, where map -> optitrack is a static transform.

You could add odometry to the above TF tree, but setting that up is more complicated. As with AMCL, you'd then use the optitrack pose to correct the drift in the odometry.


Comments


AMCL and other global localizers provide the map -> odom frame transformation. So really, all you need to do here is provide the same thing. If you have globally accurate localization from an external source, you may use that directly. So just don't launch AMCL, and create a node which publishes that transform using your motion capture system data. Depending on the quality of the data, it may be advantageous to use a Kalman filter, and robot_localization can provide that. But if filtering is not required, the node alone should be sufficient. Conveniently, robot_localization does provide that transformation automatically when set up to produce that data.

stevemacenski (2022-10-11 00:36:08 -0600)
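The transform algebra behind such a node is just map->odom = (map->base_link from mocap) composed with the inverse of (odom->base_link from wheel odometry). A minimal 2D sketch of that math, assuming planar motion; the function names are illustrative, and a real node would broadcast the result with tf2_ros rather than return a tuple:

```python
import math

def compose(a, b):
    """Compose two 2D transforms given as (x, y, yaw): returns a ∘ b."""
    ax, ay, ath = a
    bx, by, bth = b
    return (ax + math.cos(ath) * bx - math.sin(ath) * by,
            ay + math.sin(ath) * bx + math.cos(ath) * by,
            ath + bth)

def invert(t):
    """Invert a 2D transform (x, y, yaw)."""
    x, y, th = t
    c, s = math.cos(th), math.sin(th)
    return (-c * x - s * y, s * x - c * y, -th)

def map_to_odom(mocap_map_base, odom_base):
    """map->odom correction: mocap pose ∘ inverse of the odometry pose."""
    return compose(mocap_map_base, invert(odom_base))

# Example: mocap says the robot is at (2, 1) facing +y, while wheel
# odometry has drifted and reports (0.5, 0) with zero heading.
print(map_to_odom((2.0, 1.0, math.pi / 2), (0.5, 0.0, 0.0)))
# -> roughly (2.0, 0.5, pi/2)
```

Publishing this map->odom transform at the odometry rate (and updating it whenever a fresh mocap pose arrives) gives the same frame layout AMCL would provide.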


Stats

Asked: 2022-09-23 07:59:11 -0600

Seen: 713 times

Last updated: Oct 10 '22