Error while using rtabmap with remote D435i input: TF_OLD_DATA

asked 2020-02-28 08:53:44 -0500

Lisa514

Hi, I'm fairly new to ROS and apologize in advance if my question is stupid.

What I'm trying to do is have a rover, carrying a Pi and a camera, remotely map its environment while the main computation runs on my laptop. To get started I followed this installation guide: https://github.com/IntelRealSense/rea... for realsense2_camera, imu_filter_madgwick, rtabmap_ros and robot_localization on the laptop only, launching the file included in the package, opensource_tracking.launch: https://github.com/IntelRealSense/rea.... This worked fine. The problem emerged when I moved some of the nodes to the Pi.

Setup:

- Intel RealSense D435i camera connected to a Raspberry Pi 4 (Raspbian Buster, ROS Kinetic) via USB 3

- Laptop (Ubuntu 16.04, ROS Kinetic), currently connected to the Pi via Ethernet
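For a two-machine setup like this, both hosts also need ROS networking configured so the laptop's master is reachable from the Pi. A minimal sketch, assuming the laptop runs roscore and using hypothetical addresses 192.168.1.2 (laptop) and 192.168.1.3 (Pi):

```shell
# Hypothetical addresses -- substitute your own.
# On the laptop (runs roscore):
export ROS_MASTER_URI=http://192.168.1.2:11311
export ROS_IP=192.168.1.2

# On the Pi (points at the laptop's master):
export ROS_MASTER_URI=http://192.168.1.2:11311
export ROS_IP=192.168.1.3

# Sanity check on either machine:
echo "master: $ROS_MASTER_URI  this host: $ROS_IP"
```

If either machine cannot resolve the other, topics appear in `rostopic list` but never deliver data, which looks very similar to the symptoms below.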

What I did to distribute the nodes:

- Split the launch file in two: only the first include block, for rs_camera.launch, goes into the launch file on the Pi; the rest stays on the laptop

- In rs_camera.launch: set the frame rates to 6 (due to a prior "USB overflow" error)

- In rtabmap.launch: set queue_size=1000, rgbd_sync=true, compressed=true
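The Pi-side half of the split can be sketched as a small launch file that keeps only the rs_camera.launch include. The filename and the exact argument set below are assumptions based on the changes listed above; the include path matches the realsense2_camera package layout:

```shell
# Write a minimal Pi-side launch file (sketch; args mirror the changes above).
cat > rs_camera_remote.launch <<'EOF'
<launch>
  <!-- Only the camera driver runs on the Pi -->
  <include file="$(find realsense2_camera)/launch/rs_camera.launch">
    <arg name="align_depth" value="true"/>
    <arg name="color_fps"   value="6"/>
    <arg name="depth_fps"   value="6"/>
  </include>
</launch>
EOF
```

Everything else from opensource_tracking.launch (imu_filter_madgwick, rtabmap_ros, robot_localization) stays in the laptop-side file.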

The Error:

Now, when launching the launch file on the Pi, the system takes some time to stop throwing errors, but after that the camera topics are available on the laptop. The output is rather long: https://pastebin.com/fUn4YZXQ

When I then proceed to launch on the laptop, this is the output: https://pastebin.com/2VNe1X5s

I have tried rviz and rtabmapviz; both just show an empty window without any mapping.

What I have tried so far:

I ran roswtf on the laptop with the following result:

    Loaded plugin tf.tfwtf
No package or stack in context
================================================================================
Static checks summary:

No errors or warnings
================================================================================
Beginning tests of your ROS graph. These may take awhile...
analyzing graph...
... done analyzing graph
running graph rules...
... done running graph rules
running tf checks, this will take a second...
... tf checks complete

Online checks summary:

Found 1 warning(s).
Warnings are things that may be just fine, but are sometimes at fault

    WARNING The following node subscriptions are unconnected:
     * /rtabmap/rtabmap:

   * /rtabmap/initialpose
   * /gps/fix
   * /rtabmap/global_pose
   * /rtabmap/goal
   * /user_data_async
   * /rtabmap/goal_node
 * /ukf_se:
   * /example/twist
   * /set_pose
   * /example/another_odom
   * /example/pose
 * /rtabmap/rtabmapviz:
   * /rtabmap/goal_node
 * /rtabmap/republish_rgb:
   * /camera/color/image_raw/compressed
 * /rtabmap/republish_depth:
   * /camera/aligned_depth_to_color/image_raw/compressedDepth

I checked that both devices are synchronized to an NTP server.
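TF_OLD_DATA is sensitive to clock skew between the two hosts (tf drops transforms whose stamps lag behind its buffer), so it is worth quantifying the offset rather than only confirming NTP is running. A sketch, with placeholder server and stamps:

```shell
# One-shot offset query against an NTP server (run on both machines):
#   ntpdate -q pool.ntp.org
# What matters for tf is the difference between the stamp a message was
# given on the Pi and the receiving machine's clock:
pi_stamp=1582896218176     # ms, hypothetical stamp set by the Pi
laptop_now=1582896217900   # ms, hypothetical laptop clock at receipt
echo "skew: $((pi_stamp - laptop_now)) ms"
```

Even a few hundred milliseconds of skew is significant when the camera only publishes at 6 Hz.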

Also, I plotted the transformation tree in both situations. First all nodes on the laptop and working: https://drive.google.com/file/d/17d9a...

And with distributed nodes: https://drive.google.com/file/d/1XNni...

From reading this forum I figured that the negative age of the transform from 'map' could be the problem, and that hitting the reset button in rviz can help; however, it doesn't in this case. I also read about diving into the source code, looking for usages of time::now() to get the latest transform, and changing them ...


2 Answers


answered 2020-03-02 16:31:15 -0500

matlabbe

On the rtabmap side, the image topics cannot be synchronized; from your second log:

[ WARN] [1582896218.176834702]: /rtabmap/rgbd_sync: Did not receive data since 5 seconds! Make sure the input topics are published ("$ rostopic hz my_topic") and the timestamps in their header are set.
/rtabmap/rgbd_sync subscribed to (approx sync):
   /camera/color/image_raw_relay,
   /camera/aligned_depth_to_color/image_raw_relay,
   /camera/color/camera_info

Do

rostopic hz /camera/color/image_raw_relay /camera/aligned_depth_to_color/image_raw_relay /camera/color/camera_info

to see if you are receiving all topics on your laptop. Those topics are created from relays launched by rtabmap.launch for convenience (to avoid multiple remote subscriptions). The actual topics to check are:

rostopic hz /camera/color/image_raw/compressed /camera/aligned_depth_to_color/image_raw/compressedDepth /camera/color/camera_info

Try this command on both the laptop and the RPi to compare frame rates. One possible issue is that the RPi has difficulty compressing all the data; check its CPU usage.

If you want to use your robot's odometry (from the ukf_se node), set visual_odometry to false and odom_frame_id to odom in rtabmap.launch. Note also that approx_rgbd_sync should be set to false for RealSense data. You can keep queue_size at its default (10).
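Those settings translate into rtabmap.launch arguments roughly as follows; a sketch, with argument names taken from the advice above and the rgbd_sync/compressed values from the question:

```shell
# Build the rtabmap.launch invocation described above (sketch; run on the laptop).
args="visual_odometry:=false odom_frame_id:=odom approx_rgbd_sync:=false"
cmd="roslaunch rtabmap_ros rtabmap.launch rgbd_sync:=true compressed:=true $args"
echo "$cmd"
```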

cheers,
Mathieu


answered 2021-04-01 06:15:40 -0500

ciarfah

Hi Lisa,

This is a bit late, but I ran into a similar error when running the RealSense camera over the network. rostopic hz my_topic can be a bit misleading: if everything is set up correctly, the topics will all appear to be publishing fine! I found that the error occurred when the network link could not support the required data throughput. I reckon this is because the latency becomes so high that the timestamps read as too old to use. I see you are using wired Ethernet, but even transmitting just the aligned depth image and the RGB image, at 424x240 and 640x360 respectively, takes 48.9 MB/s for me (that's megabytes, not megabits, to be clear) over Ethernet. The Pi 4 is not capable of going much above 51 MB/s in my tests, at least in this use case, though it may be my router that is the limiting factor. You may be able to get this working if you reduce the data throughput.

Try running rs-enumerate-devices and choosing the lowest resolution and FPS settings. For me, roslaunch realsense2_camera rs_camera.launch initial_reset:=true depth_width:=424 depth_height:=240 depth_fps:=15 color_width:=424 color_height:=240 color_fps:=15 align_depth:=true only needs 22 MB/s. You can also lower the gyro and accelerometer FPS to 200 and 63 respectively.
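The savings are easy to estimate from the raw image payload alone. This is only a lower bound (ROS transport, compression artifacts, and the IMU topics push the measured figures in this answer higher), and the 30/15 fps rates are assumptions:

```shell
# Raw bytes/s = width * height * bytes-per-pixel * fps
# Original streams: 640x360 RGB8 (3 B/px) + 424x240 depth 16UC1 (2 B/px), assuming 30 fps
orig=$(( (640*360*3 + 424*240*2) * 30 ))
# Reduced settings: both streams at 424x240, 15 fps
low=$(( (424*240*3 + 424*240*2) * 15 ))
echo "original ~$((orig / 1000000)) MB/s, reduced ~$((low / 1000000)) MB/s"
```

The roughly 3-4x reduction in raw payload matches the direction of the measured drop, even though the absolute numbers differ once overhead is included.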

As a side note, there is an issue with the gyro and accelerometer of the D435i, specifically on the Pi I believe, that causes them to simply stop publishing data after a while. There is an open ticket with Intel to look at it, but it may be a while until it is fixed. If you have had any luck getting this working, let me know.


Comments

Hi, thanks for taking the time to answer my old post. I think you are right; I have since come to the conclusion that the link quality and computing power were simply insufficient. The Pi could not keep up with compressing the data, while sending uncompressed data resulted in far too many packet resend attempts. Either way, the Pi was at 100 percent CPU the whole time, with tasks failing. I tried lower frame rates, but they required very slow movement to get a coherent point cloud, which quickly became unsustainable for my problem. Thank you very much anyway for your response.

Lisa514 (2021-04-02 07:19:38 -0500)


Stats

Asked: 2020-02-28 08:52:30 -0500

Seen: 619 times

Last updated: Mar 02 '20