Indoor navigation using odom & beacons - how to fuse them together?

asked 2018-02-05 20:06:35 -0500

shreks7

updated 2018-02-05 23:21:39 -0500

Hi,

We're working on a research project where we have an indoor drone and a system that uses UHB (40Hz) to track its position.

  1. We're running EKF2 on the drone itself to retrieve odometry data. We receive a nav_msgs/Odometry from the drone on a base station running the ROS master. Due to our hardware design, we can't retrieve raw sensor data; only the final /odom topic is published.

  2. We were able to command the drone to navigate using only /odom (move forward by x m at y m/s), but it drifts away before reaching its goal. We're not using move_base, just a simple combination of TF (odom->base_link) & cmd_vel to navigate (a rough sketch of this is included after this list). Basically, from my understanding, this is where the drone believes it is and where it should be.

    1. Now, the indoor tracking system is accurate to within 2 cm (it still jumps a little; we are working on that). It publishes a custom message on the topic /beacon/pos with x, y, z and a timestamp in milliseconds.

    2. We wrote a TF publisher that publishes this x, y, z as a nav_msgs::Odometry in the map frame, with child_frame_id set to odom.

    3. We can visualize both the drone's odometry and the indoor system's position, but note that the odometry differs from the position reported by our indoor system (even at the start).

    4. We also have systems like MarvelMind and Pozyx to test our system's accuracy.
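As mentioned above, a rough sketch of the simple TF + cmd_vel navigation we use ("move forward by x m at y m/s") would look something like the following. This is illustrative only - the distance, speed and topic names are placeholders, not our real values, and exception handling around the TF lookups is omitted:

    // forward_by_distance.cpp - sketch of "move forward by x m at y m/s":
    // watch odom->base_link and publish cmd_vel until the distance is covered.
    #include <ros/ros.h>
    #include <geometry_msgs/Twist.h>
    #include <geometry_msgs/TransformStamped.h>
    #include <tf2_ros/buffer.h>
    #include <tf2_ros/transform_listener.h>
    #include <cmath>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "forward_by_distance");
      ros::NodeHandle nh;
      ros::Publisher cmd_pub = nh.advertise<geometry_msgs::Twist>("cmd_vel", 10);

      tf2_ros::Buffer buffer;
      tf2_ros::TransformListener listener(buffer);

      const double goal_distance = 1.0;  // "x" metres (placeholder)
      const double speed = 0.3;          // "y" m/s (placeholder)

      // Remember where base_link starts in the odom frame.
      geometry_msgs::TransformStamped start =
          buffer.lookupTransform("odom", "base_link", ros::Time(0), ros::Duration(5.0));

      ros::Rate rate(20);
      while (ros::ok())
      {
        geometry_msgs::TransformStamped now =
            buffer.lookupTransform("odom", "base_link", ros::Time(0), ros::Duration(1.0));
        const double dx = now.transform.translation.x - start.transform.translation.x;
        const double dy = now.transform.translation.y - start.transform.translation.y;
        const double travelled = std::hypot(dx, dy);

        geometry_msgs::Twist cmd;
        cmd.linear.x = (travelled < goal_distance) ? speed : 0.0;  // stop at the goal
        cmd_pub.publish(cmd);

        if (travelled >= goal_distance)
          break;
        rate.sleep();
      }
      return 0;
    }

Because /odom drifts, the drone stops where the odometry says the distance is covered, which is not where the beacon system says it is - that is the drift we want to correct with the beacons.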

Questions:

  1. How do we create a TF link between map & odom? Is this a static transform that we can create (as simple as moving the start pose of odom to match the map)? For now, we just publish that data via a broadcaster like this:

    broadcaster.sendTransform(tf::StampedTransform(
        tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(x, y, z)),
        ros::Time::now(), "map", "odom"));

Is this sufficient?
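For completeness, this is what we understand by the "static" alternative: take one beacon fix at startup and latch it as map->odom with tf2. Again only a sketch - the message type and topic name stand in for our custom /beacon/pos message, and the identity rotation is an assumption because the beacons give us no yaw:

    // static_map_to_odom.cpp - latch one beacon fix as a static map->odom transform.
    #include <ros/ros.h>
    #include <ros/topic.h>
    #include <geometry_msgs/PointStamped.h>
    #include <geometry_msgs/TransformStamped.h>
    #include <tf2_ros/static_transform_broadcaster.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "static_map_to_odom");
      ros::NodeHandle nh;

      // Block until the first beacon fix arrives.
      geometry_msgs::PointStamped::ConstPtr first =
          ros::topic::waitForMessage<geometry_msgs::PointStamped>("beacon/pos", nh);
      if (!first)
        return 1;

      geometry_msgs::TransformStamped t;
      t.header.stamp = ros::Time::now();
      t.header.frame_id = "map";
      t.child_frame_id = "odom";
      t.transform.translation.x = first->point.x;  // shift odom's origin so the
      t.transform.translation.y = first->point.y;  // drone's start pose lines up
      t.transform.translation.z = first->point.z;  // with the map frame
      t.transform.rotation.w = 1.0;                // identity rotation (yaw unknown)

      tf2_ros::StaticTransformBroadcaster broadcaster;
      broadcaster.sendTransform(t);  // latched; published once

      ros::spin();
      return 0;
    }

If the offset really never changes, the same thing could presumably be done from the command line with the tf2_ros static_transform_publisher instead of writing a node.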

  2. How do we fuse this data together so that we can give a goal in RViz? robot_localization looks promising - do we just launch one EKF with the drone's /odom & the /odom from the indoor system as inputs? Will this be sufficient to navigate a path (like a square or a circle)?
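In case it clarifies the question: this is how we imagine feeding the beacon fix to robot_localization as a pose input (as far as we understand, its poseN inputs take geometry_msgs::PoseWithCovarianceStamped in the world frame). Again just a sketch - the topic names are placeholders for our custom message, and the covariance values are guesses based on the ~2 cm accuracy:

    // beacon_to_pose.cpp - republish beacon fixes as PoseWithCovarianceStamped
    // in the map frame so an EKF (e.g. robot_localization) could fuse them.
    #include <ros/ros.h>
    #include <geometry_msgs/PointStamped.h>
    #include <geometry_msgs/PoseWithCovarianceStamped.h>

    ros::Publisher pose_pub;

    void beaconCallback(const geometry_msgs::PointStamped::ConstPtr& msg)
    {
      geometry_msgs::PoseWithCovarianceStamped pose;
      pose.header.stamp = msg->header.stamp;
      pose.header.frame_id = "map";          // beacon fixes live in the map frame
      pose.pose.pose.position = msg->point;
      pose.pose.pose.orientation.w = 1.0;    // beacons give no orientation

      // ~2 cm standard deviation on x/y/z -> variance of 4e-4 m^2 (guess).
      pose.pose.covariance[0]  = 4e-4;  // x
      pose.pose.covariance[7]  = 4e-4;  // y
      pose.pose.covariance[14] = 4e-4;  // z
      // Very large variance on roll/pitch/yaw so the filter ignores them.
      pose.pose.covariance[21] = 1e6;
      pose.pose.covariance[28] = 1e6;
      pose.pose.covariance[35] = 1e6;

      pose_pub.publish(pose);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "beacon_to_pose");
      ros::NodeHandle nh;
      pose_pub = nh.advertise<geometry_msgs::PoseWithCovarianceStamped>("beacon/pose", 10);
      ros::Subscriber sub = nh.subscribe("beacon/pos", 10, beaconCallback);
      ros::spin();
      return 0;
    }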

I'm not sure I'm explaining this very clearly, but I tried. I hope someone can help us out.

Thank you. Please let me know if there is something else I should share.

[UPDATE]

We were able to get the map->odom->base_link chain working, and it looks like this (green is the odom from the drone, red is from our system). We can ignore the orientation, since we only get x, y, z values from our system, but the path looks mirrored or not aligned. How do we fix this? I am assuming the green points should align with the red ones (not in orientation, but in x, y & z).

(screenshot of the green and red paths)

And obviously the EKF estimate jumps - (second screenshot)
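To illustrate what we mean by "mirrored or not aligned": the kind of correction we are asking about would be an axis flip plus a yaw offset applied to every beacon fix before it is used. The flip and the angle below are placeholders that would have to be calibrated, not measured values, and the PointStamped type again stands in for our custom message:

    // align_beacon_frame.cpp - apply a (hypothetical) mirror and yaw correction
    // to each beacon fix so its frame lines up with the odom trajectory.
    #include <ros/ros.h>
    #include <geometry_msgs/PointStamped.h>
    #include <cmath>

    ros::Publisher aligned_pub;

    // Placeholder calibration: mirror the y axis and rotate by 90 degrees.
    const bool   kFlipY = true;
    const double kYawOffset = M_PI / 2.0;

    void beaconCallback(const geometry_msgs::PointStamped::ConstPtr& msg)
    {
      const double x = msg->point.x;
      const double y = kFlipY ? -msg->point.y : msg->point.y;  // undo the mirroring

      geometry_msgs::PointStamped out = *msg;
      out.header.frame_id = "map";
      out.point.x = std::cos(kYawOffset) * x - std::sin(kYawOffset) * y;  // rotate
      out.point.y = std::sin(kYawOffset) * x + std::cos(kYawOffset) * y;  // into map
      out.point.z = msg->point.z;
      aligned_pub.publish(out);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "align_beacon_frame");
      ros::NodeHandle nh;
      aligned_pub = nh.advertise<geometry_msgs::PointStamped>("beacon/pos_aligned", 10);
      ros::Subscriber sub = nh.subscribe("beacon/pos", 10, beaconCallback);
      ros::spin();
      return 0;
    }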


Comments

What I don't understand is: why do you bother with the map frame if you have a static transform from map to odom? Wouldn't it be better to configure everything to use only the odom frame?

Humpelstilzchen ( 2018-02-06 04:09:40 -0500 )

I am not sure whether it is a static transform or not. Right now I just take the x, y, z position from the UHB sensor and create a transform from map->odom. I am not sure about this part.

shreks7 ( 2018-02-06 13:21:05 -0500 )

Hi @shreks7, did you manage to solve your issues? I want to do something similar - see my question at https://answers.ros.org/question/2840... . If you have any suggestions, please let us know!

simff ( 2018-03-04 15:21:57 -0500 )