
AprilTag Camera_Info Synchronization

asked 2022-05-31 15:09:22 -0500

Moop

Hi, I'm trying to get some AprilTag detections going and have two semi-related issues. I have a video stream coming from a camera in UYVY format that I'm trying to run detection on. When I try to run the continuous detection node I get the following error:

NODES / apriltag_ros_continuous_node (apriltag_ros/apriltag_ros_continuous_node)


process[apriltag_ros_continuous_node-1]: started with pid [23411]
OpenCV Error: Assertion failed (scn == 3 || scn == 4) in cvtColor, file /build/opencv-XDqSFW/opencv-3.2.0+dfsg/modules/imgproc/src/color.cpp, line 9748
terminate called after throwing an instance of 'cv::Exception'
  what(): /build/opencv-XDqSFW/opencv-3.2.0+dfsg/modules/imgproc/src/color.cpp:9748: error: (-215) scn == 3 || scn == 4 in function cvtColor

[apriltag_ros_continuous_node-1] process has died [pid 23411, exit code -6, cmd /home/matt/AprilTag_Test/devel/lib/apriltag_ros/apriltag_ros_continuous_node image_rect:=/basler_camera/image_raw camera_info:=/basler_camera/camera_info __name:=apriltag_ros_continuous_node __log:=/home/matt/.ros/log/49ee0aa4-e117-11ec-ae5d-5ca6e636641b/apriltag_ros_continuous_node-1.log].
log file: /home/matt/.ros/log/49ee0aa4-e117-11ec-ae5d-5ca6e636641b/apriltag_ros_continuous_node-1*.log
all processes on machine have died, roslaunch will exit
shutting down processing monitor...
... shutting down processing monitor complete
done

Looking around, this seems to be the error OpenCV throws when it receives a malformatted image. I can confirm the image is read properly by ROS with both rqt_image_view and image_proc, so I don't think that's the issue. I'm wondering if it might just be that AprilTag doesn't support UYVY? I went to convert it to RGB by piping it through image_proc to see if that would get it to work. The image conversion succeeds, but the AprilTag node then warns that the camera_info and image messages are unsynchronized and I get no detections. The tag_detections_image comes out blank, and a rostopic echo shows nothing on any of the tag_detections topics:

NODES / apriltag_ros_continuous_node (apriltag_ros/apriltag_ros_continuous_node)


process[apriltag_ros_continuous_node-1]: started with pid [24442]
Image messages received: 148
CameraInfo messages received: 269
Synchronized pairs: 0
Image messages received: 151
CameraInfo messages received: 271
Synchronized pairs: 0

Does anybody know whether the AprilTag node strictly requires the image and camera_info messages to be synchronized, and will it refuse to do tag detection otherwise? One could easily need intermediate image processing, say rectification with an existing node, that doesn't restamp and republish the camera_info topic. Are there any workarounds to get them in sync, aside from writing a repeater node?
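For what it's worth, one workaround along the "repeater node" lines is a small relay that restamps the latest CameraInfo with each incoming image's timestamp, so the detector's exact-time pairing succeeds. A minimal sketch (topic names are placeholders for this setup, not anything apriltag_ros prescribes):

```cpp
// Hypothetical camera_info "repeater": caches the latest CameraInfo and
// republishes it stamped to match each incoming image, so that an
// exact-time synchronizer downstream can pair the two streams.
#include <ros/ros.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/CameraInfo.h>

class InfoRepeater {
public:
  explicit InfoRepeater(ros::NodeHandle& nh) {
    img_sub_  = nh.subscribe("/basler_camera/image_rect", 1,
                             &InfoRepeater::imageCb, this);
    info_sub_ = nh.subscribe("/basler_camera/camera_info", 1,
                             &InfoRepeater::infoCb, this);
    info_pub_ = nh.advertise<sensor_msgs::CameraInfo>(
        "/basler_camera/camera_info_synced", 1);
  }

private:
  void imageCb(const sensor_msgs::ImageConstPtr& msg) {
    if (!have_info_) return;
    // Restamp the cached intrinsics with the image's timestamp.
    info_.header.stamp = msg->header.stamp;
    info_pub_.publish(info_);
  }

  void infoCb(const sensor_msgs::CameraInfoConstPtr& msg) {
    info_ = *msg;
    have_info_ = true;
  }

  ros::Subscriber img_sub_, info_sub_;
  ros::Publisher info_pub_;
  sensor_msgs::CameraInfo info_;
  bool have_info_ = false;
};

int main(int argc, char** argv) {
  ros::init(argc, argv, "camera_info_repeater");
  ros::NodeHandle nh;
  InfoRepeater repeater(nh);
  ros::spin();
  return 0;
}
```

This is only safe when the intrinsics are static (no zoom, no resolution changes mid-stream), since it blindly reuses the last CameraInfo received.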

As a side note, I have not yet calibrated the camera, so I'm just supplying a made-up focal length with the image size for now. That should only affect the pose estimation, not the ability to detect tags, I would think? I tested the single-image detector node and got successful tag detections, so I don't think that's the issue.

If it matters, I'm on a Jetson Nano running Melodic + JetPack 4.6.1 / Ubuntu 18.04.


1 Answer


answered 2022-06-03 08:45:37 -0500

Moop

Resolved; both issues were misunderstandings on my part as a new user.

When I switched the incoming images to RGB or mono, the AprilTag node ran properly. Realistically they should be mono anyway, so feeding in UYVY wasn't the smartest (I was just trying to test interfaces initially). I discovered this after getting rectification running properly with image_proc and noticing that the node's default input connects to the mono rectified output topic. The reason for UYVY initially was simply that it's what the camera delivers.
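For anyone hitting the same wall, this is roughly the wiring that worked: run image_proc in the camera's namespace and point the detector at the mono rectified output. A sketch (my topic names; adjust to your namespace, and the detector still needs its usual settings/tags rosparam files):

```xml
<!-- Rectify in the camera namespace: produces image_rect (mono) -->
<node ns="basler_camera" pkg="image_proc" type="image_proc" name="image_proc"/>

<!-- Point apriltag_ros at the mono rectified stream -->
<node pkg="apriltag_ros" type="apriltag_ros_continuous_node"
      name="apriltag_ros_continuous_node" output="screen">
  <remap from="image_rect"  to="/basler_camera/image_rect"/>
  <remap from="camera_info" to="/basler_camera/camera_info"/>
</node>
```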

The synchronization issue was a combination of an improper driver and images too high-resolution for the Nano to handle. I learned after purchasing that the manufacturer's ROS driver does not support my camera (a Basler daA2500-60 mci, as a heads up so somebody else doesn't make the same bad purchase I did), so I had to write a simple node using their C++ API. I was not aware of image_transport::CameraPublisher initially, so I was publishing the image and camera parameters as ordinary separate messages. I was debayering with image_proc, but the images were published faster than the Nano could debayer them, so the output image was always out of sync with the camera_info topic. Decreasing the resolution and frame rate allowed image_proc to keep up; once the debayered/rectified images and camera_info were in sync, the detections worked immediately.
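To make the CameraPublisher point concrete, here is a rough skeleton of what my driver node ended up looking like. The vendor SDK calls are elided and the names (node name, frame_id, rate) are illustrative, not from the real code; the key idea is that publishing image and CameraInfo through one CameraPublisher with identical headers guarantees they pair up downstream:

```cpp
// Sketch of a minimal camera driver using image_transport::CameraPublisher.
// Publishing the image and its CameraInfo together, with the same header,
// means downstream exact-time synchronizers always find matching pairs.
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <camera_info_manager/camera_info_manager.h>
#include <sensor_msgs/Image.h>
#include <sensor_msgs/CameraInfo.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "basler_driver");
  ros::NodeHandle nh("~");

  image_transport::ImageTransport it(nh);
  image_transport::CameraPublisher pub = it.advertiseCamera("image_raw", 1);
  camera_info_manager::CameraInfoManager info_mgr(nh, "basler_camera");

  ros::Rate rate(15);  // a modest rate the Nano's image_proc can keep up with
  while (ros::ok()) {
    sensor_msgs::Image img;
    // ... grab a frame from the vendor C++ API into img here ...
    img.header.stamp = ros::Time::now();
    img.header.frame_id = "basler_camera";

    sensor_msgs::CameraInfo info = info_mgr.getCameraInfo();
    info.header = img.header;  // identical stamp -> synchronized pairs > 0

    pub.publish(img, info);   // image + camera_info go out together
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}
```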


Question Tools


Asked: 2022-05-31 15:09:22 -0500

Seen: 422 times

Last updated: Jun 03 '22