Octomap_server + RealSense D435 + RealSense T265
Hello everyone,
I would like to create a simple octomap using both the RealSense D435 and T265 cameras. I am using my own RealSense wrapper (please do not ask why, it is needed in this case). I can publish a PointCloud from the D435 camera (depth stream) and the pose (odometry) from the T265 as topics.
What I am currently struggling with is publishing tf. I know that octomap_server can incrementally build an occupancy grid based on the odometry received from tf. I just cannot figure out how to set up the frame_ids and so on.
This is how I broadcast the transform:
tf_br.sendTransform(tf::StampedTransform(transform, ros::Time::now(), "tf_t265", "map"));
The transform object is filled with exactly the same data that I use in the topic for publishing the T265 pose (that topic is in use and works).
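For clarity, here is the same broadcast written out as a small sketch with the tf::StampedTransform arguments labelled (the constructor takes the parent frame_id before the child_frame_id; whether tf_t265 or map should be the parent is exactly the part I am unsure about):

#include <ros/ros.h>
#include <string>
#include <tf/transform_broadcaster.h>

// Sketch of my broadcast with the arguments labelled. tf::StampedTransform is
// constructed as (transform, stamp, frame_id, child_frame_id), where frame_id
// is the parent frame. In my current code the parent is "tf_t265" and the
// child is "map".
void broadcastT265Pose(tf::TransformBroadcaster& tf_br, const tf::Transform& transform)
{
    const std::string parent_frame = "tf_t265";  // frame_id argument
    const std::string child_frame  = "map";      // child_frame_id argument
    tf_br.sendTransform(
        tf::StampedTransform(transform, ros::Time::now(), parent_frame, child_frame));
}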
This is what the octomap_server part of my launch file looks like:
<node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
<param name="resolution" value="0.05" />
<!-- fixed map frame (set to 'map' if SLAM or localization running!) -->
<param name="frame_id" type="string" value="tf_t265" />
<!-- maximum range to integrate (speedup!) -->
<param name="sensor_model/max_range" value="5.0" />
<!-- data source to integrate (PointCloud2) -->
<remap from="cloud_in" to="/d435/depth" />
</node>
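For context, here is a rough sketch of the publisher side I am aiming for (the frame name d435_depth and the identity extrinsics are just placeholders, not my actual values). As far as I understand, octomap_server transforms each cloud_in message from its header.frame_id into the frame_id parameter via tf, so the cloud has to be stamped in a frame that tf can resolve:

#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "d435_frames_sketch");
    ros::NodeHandle nh;
    ros::Publisher cloud_pub = nh.advertise<sensor_msgs::PointCloud2>("/d435/depth", 1);
    tf::TransformBroadcaster tf_br;

    ros::Rate rate(30);
    while (ros::ok())
    {
        // The actual cloud data would come from my RealSense wrapper;
        // only the header matters for this sketch.
        sensor_msgs::PointCloud2 cloud_msg;
        cloud_msg.header.stamp = ros::Time::now();
        cloud_msg.header.frame_id = "d435_depth";  // placeholder sensor frame name
        cloud_pub.publish(cloud_msg);

        // Fixed mounting offset between the T265 frame and the D435 depth frame
        // (identity is only a placeholder; the real extrinsics depend on the rig).
        tf::Transform extrinsics(tf::Quaternion(0.0, 0.0, 0.0, 1.0), tf::Vector3(0.0, 0.0, 0.0));
        tf_br.sendTransform(
            tf::StampedTransform(extrinsics, ros::Time::now(), "tf_t265", "d435_depth"));

        rate.sleep();
    }
    return 0;
}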
I suspect that the frame_id param is the key here, but I tried setting it to map and that does not work either. I know that I may be publishing the transform incorrectly, which is why I am asking for help. There is nothing else in my code related to tf broadcasting.
If any additional info is needed, please let me know. Thank you for any help!
Have a nice day.
I know you said don't ask, but I'm genuinely curious ... (and I assume it has more to do with the T265 than the D435) ... why did you write your own RealSense wrapper :D ?
Because I need only basic functionality from librealsense inside ROS. Furthermore, I am developing the application on an Nvidia Jetson, and the D435 + T265 simultaneously don't even work with the RealSense wrapper there. I am also maintaining a big ROS package where I have my own RealSense node. If I were using realsense-ros as a module, it would be difficult to maintain, because it is under heavy development and gets updated too often. When I update the core librealsense, my ROS wrapper still works. The RealSense ROS wrapper is not hard to write, though. Plus, I needed to implement some features which weren't implemented (I don't know if that's still the case) in the original RealSense ROS wrapper, such as a complete T265 camera reset (start/stop pipeline) or an odometry set (setting the odometry to a specific pose via a service, using a transformation). It is much more flexible for me now, as I have a stable setup consisting ...(more)
That makes a lot of sense. Is any of this open source, perchance? I'd like to look through it if that's the case.
No, it's not available anywhere for now.
Hi @eMrazSVK, I am trying to do the same as you in octomap with a T265 and a D435i. Currently (contrary to you, I am using the RealSense library), the pose and the depth point cloud are working as expected (with light modifications to rs_d400_and_t265.launch). When I run octomap_server, after a few seconds I get: MessageFilter [target=/t265/odom/sample ]: Dropped 100.00% of messages so far. Any idea what that might be? Any help would be much appreciated.