Indoor navigation using odom & beacons - how to fuse them together?
Hi,
We're working on a research project where we have an indoor drone and a system that uses UWB beacons (40 Hz) to track its position.
We're running EKF2 on the drone itself to retrieve odometry data. We get a nav_msgs/Odometry from the drone on a base station running the ROS master. Due to our hardware design, we can't retrieve raw sensor data; only the final /odom is published.
We were able to command the drone to navigate using only /odom (move forward by x m at y m/s), but it drifts away before reaching its goal. We're not using move_base, just a simple combination of TF (odom->base_link) and cmd_vel to navigate. (Basically, from my understanding, this is where the drone believes it is and where it should go.)
Now, the indoor tracking system is accurate to about 2 cm (but it still jumps a little; we are working on that). It publishes a custom message on the topic /beacon/pos with x, y, z and a timestamp in milliseconds.
We wrote a publisher that republishes this x, y, z as a nav_msgs::Odometry message with frame_id map and child_frame_id odom.
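For reference, a stripped-down sketch of that republisher. The message type beacon_msgs/Pos and its fields (float64 x, y, z; uint64 stamp_ms) are stand-ins for our actual custom message; also, I suspect child_frame_id should really be base_link rather than odom, since the beacons measure the position of the drone body itself:

    #include <ros/ros.h>
    #include <nav_msgs/Odometry.h>
    #include <beacon_msgs/Pos.h>  // stand-in for our custom message

    ros::Publisher odom_pub;

    void beaconCallback(const beacon_msgs::Pos::ConstPtr& msg)
    {
      nav_msgs::Odometry odom;
      // ms -> ROS time; only valid if the beacon clock shares the ROS epoch,
      // otherwise fall back to ros::Time::now().
      odom.header.stamp = ros::Time(msg->stamp_ms / 1000.0);
      odom.header.frame_id = "map";
      odom.child_frame_id = "base_link";   // the pose is of the robot body
      odom.pose.pose.position.x = msg->x;
      odom.pose.pose.position.y = msg->y;
      odom.pose.pose.position.z = msg->z;
      odom.pose.pose.orientation.w = 1.0;  // no heading from the beacons
      odom_pub.publish(odom);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "beacon_to_odom");
      ros::NodeHandle nh;
      odom_pub = nh.advertise<nav_msgs::Odometry>("beacon/odom", 10);
      ros::Subscriber sub = nh.subscribe("beacon/pos", 10, beaconCallback);
      ros::spin();
      return 0;
    }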
We can see both the odom and the indoor-system poses, but note that the odom pose differs from the one reported by our indoor system (even at the start).
We also have systems like MarvelMind and Pozyx to test our system's accuracy.
Questions:
- How do we create a tf link between map & odom? Is it a static transform that we can create (as simple as moving the start pose of odom to match map)? For now, we just published that data via a broadcaster like this (a fuller sketch follows these questions):
    broadcaster.sendTransform(tf::StampedTransform(
        tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(x, y, z)),
        ros::Time::now(), "map", "odom"));
Is this sufficient?
- How do we fuse this data together so that we can give a goal in rviz? robot_localization looks promising: just launch one EKF with the drone's /odom and the pose from the indoor system as inputs? Will this be sufficient to navigate a path (like a square or a circle)? (An adapter sketch also follows below.)
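On the first question, my current understanding (happy to be corrected): a one-shot transform at the start offset is not enough, because /odom drifts, so the map->odom correction has to be refreshed whenever a beacon fix arrives (similar to the role amcl's map->odom broadcast plays for ground robots). A rough sketch, reusing the assumed beacon_msgs/Pos message from above, and assuming the map and odom axes already share the same orientation (no yaw offset), so only the translation is corrected:

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>
    #include <tf/transform_listener.h>
    #include <beacon_msgs/Pos.h>  // stand-in for our custom message

    tf::TransformListener* listener;
    tf::TransformBroadcaster* broadcaster;

    void beaconCallback(const beacon_msgs::Pos::ConstPtr& msg)
    {
      // Where the drone's own EKF thinks base_link is, in the odom frame.
      tf::StampedTransform odom_to_base;
      try {
        listener->lookupTransform("odom", "base_link", ros::Time(0), odom_to_base);
      } catch (tf::TransformException& ex) {
        ROS_WARN("%s", ex.what());
        return;
      }

      // Translation-only correction: assumes no yaw offset or mirroring
      // between map and odom (the beacons give no heading).
      tf::Vector3 offset =
          tf::Vector3(msg->x, msg->y, msg->z) - odom_to_base.getOrigin();
      broadcaster->sendTransform(tf::StampedTransform(
          tf::Transform(tf::Quaternion(0, 0, 0, 1), offset),
          ros::Time::now(), "map", "odom"));
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "map_to_odom_broadcaster");
      ros::NodeHandle nh;
      tf::TransformListener tl;
      tf::TransformBroadcaster tb;
      listener = &tl;
      broadcaster = &tb;
      ros::Subscriber sub = nh.subscribe("beacon/pos", 10, beaconCallback);
      ros::spin();
      return 0;
    }

If the axes are not actually aligned (see the [UPDATE] below), a fixed yaw offset would have to be calibrated first and composed into this transform.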
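On the second question: as far as I can tell, robot_localization cannot consume our custom /beacon/pos message directly; its EKF nodes take nav_msgs/Odometry, geometry_msgs/PoseWithCovarianceStamped, and IMU inputs. The usual pattern seems to be two EKF instances: one with world_frame set to odom fusing only /odom, and one with world_frame set to map that additionally fuses the beacon pose and publishes map->odom itself. My tentative adapter node (message and topic names assumed as above):

    #include <ros/ros.h>
    #include <geometry_msgs/PoseWithCovarianceStamped.h>
    #include <beacon_msgs/Pos.h>  // stand-in for our custom message

    ros::Publisher pose_pub;

    void beaconCallback(const beacon_msgs::Pos::ConstPtr& msg)
    {
      geometry_msgs::PoseWithCovarianceStamped pose;
      pose.header.stamp = ros::Time::now();  // or convert stamp_ms if clocks agree
      pose.header.frame_id = "map";
      pose.pose.pose.position.x = msg->x;
      pose.pose.pose.position.y = msg->y;
      pose.pose.pose.position.z = msg->z;
      pose.pose.pose.orientation.w = 1.0;
      // ~2 cm accuracy -> variance of (0.02 m)^2 on x, y, z; tune against
      // the real jumps we see.
      pose.pose.covariance[0]  = 4e-4;
      pose.pose.covariance[7]  = 4e-4;
      pose.pose.covariance[14] = 4e-4;
      pose_pub.publish(pose);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "beacon_to_pose");
      ros::NodeHandle nh;
      pose_pub = nh.advertise<geometry_msgs::PoseWithCovarianceStamped>(
          "beacon/pose", 10);
      ros::Subscriber sub = nh.subscribe("beacon/pos", 10, beaconCallback);
      ros::spin();
      return 0;
    }

In the map-frame EKF's configuration, this topic would then be wired in as a poseN input with only x, y, z fused, alongside /odom as an odomN input.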
I'm not sure I'm making this very clear, but I tried. Hope someone can help us out.
Thank you. Please let me know if there is something else I should share.
[UPDATE]
We were able to get the map->odom->base_link chain, and it looks like this (green is the odom from the drone, red is from our system). We can ignore the heading, since we only have x, y, z values from our system, but the track looks mirrored or misaligned. How do we fix this? I am assuming the green poses should align with the red ones (not in heading, but in x, y and z).
And obviously the EKF output jumps as well.
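To investigate the mirroring, I am considering recording time-matched (x, y) samples from both sources and fitting the best rigid 2D rotation between them (the Kabsch/Umeyama alignment). A hypothetical sketch using Eigen; it assumes the two trajectories have already been matched sample-for-sample by timestamp:

    #include <Eigen/Dense>
    #include <vector>

    // Returns the best-fit 2D rotation mapping the odom track onto the
    // beacon track. A reflection in the fit indicates mirrored axes.
    Eigen::Matrix2d alignYaw(const std::vector<Eigen::Vector2d>& odom_xy,
                             const std::vector<Eigen::Vector2d>& beacon_xy)
    {
      const double n = static_cast<double>(odom_xy.size());
      Eigen::Vector2d mu_o = Eigen::Vector2d::Zero();
      Eigen::Vector2d mu_b = Eigen::Vector2d::Zero();
      for (std::size_t i = 0; i < odom_xy.size(); ++i) {
        mu_o += odom_xy[i];
        mu_b += beacon_xy[i];
      }
      mu_o /= n;
      mu_b /= n;

      // Cross-covariance of the centred point sets.
      Eigen::Matrix2d H = Eigen::Matrix2d::Zero();
      for (std::size_t i = 0; i < odom_xy.size(); ++i)
        H += (odom_xy[i] - mu_o) * (beacon_xy[i] - mu_b).transpose();

      Eigen::JacobiSVD<Eigen::Matrix2d> svd(
          H, Eigen::ComputeFullU | Eigen::ComputeFullV);
      Eigen::Matrix2d R = svd.matrixV() * svd.matrixU().transpose();
      if (R.determinant() < 0) {
        // Reflection detected: the frames are mirrored, not just rotated.
        Eigen::Matrix2d D = Eigen::Matrix2d::Identity();
        D(1, 1) = -1.0;
        R = svd.matrixV() * D * svd.matrixU().transpose();
      }
      return R;  // beacon ~= R * (odom - mu_o) + mu_b
    }

If the determinant check triggers, the two systems disagree on handedness (e.g. one y axis is flipped), and negating that axis in the beacon republisher should fix the mirroring; otherwise the recovered R gives the fixed yaw offset to bake into the map->odom transform.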
What I did not understand is: why do you bother with the map frame if you have a static transform from map to odom? Wouldn't it be better to configure everything to use only the odom frame?
I am not sure whether it is a static transform or not. Right now I just take the x, y, z position from the UWB sensor and create a transform from map->odom. I am not sure about this part.
Hi @shreks7, did you manage to solve your issues? I want to do something similar; see my question at https://answers.ros.org/question/2840... If you have any suggestions, please let us know!