
Stereo Camera synchronization

asked 2012-04-29 03:38:50 -0600 by oleo80

updated 2012-04-29 04:47:51 -0600 by joq

Hi all, I'm using the uvc_camera package and its stereo_node to grab frames from my two webcams. Now I'm trying to measure the synchronization error between the two frames as follows: 1) write the left and right camera info to two files

rostopic echo -p /left/camera_info > left.csv
rostopic echo -p /right/camera_info > right.csv

2) compare the timestamp fields inside left.csv and right.csv

To my surprise, the timestamps are identical for every frame pair (30,000+ pairs at 15 fps), suggesting there is no sync error. This result seems practically impossible, since timestamps are recorded with nanosecond precision.

Now the question is: does uvc_camera have some magical power I don't know about, or did something else happen?

Is there another way to get a precise measure of the sync error?

Thanks, Fabio.


Comments

Interesting question. Although timestamps are represented using nanosecond precision, actual clocks are not that accurate. I would not expect USB cameras to be synchronized.

joq ( 2012-04-29 04:52:11 -0600 )
On a 1394 bus, some camera models support "bus synchronization" with accuracy of about 125 microseconds. Does USB have any comparable feature?

joq ( 2012-04-29 04:52:42 -0600 )

I think "bus synchronization" is a special feature of IEEE 1394 and probably isn't available with USB. The uvc_camera package "includes a two-camera node that provides rough synchronization for stereo vision", so the frame synchronization is done in software.

oleo80 ( 2012-04-30 02:29:36 -0600 )

Looking at the frames, I can't say they aren't synchronized; in any case, I'm expecting a sync error of a few microseconds.

oleo80 ( 2012-04-30 02:30:42 -0600 )

@joq USB Video Class doesn't have that feature, nor does it have triggering.

Ken ( 2012-04-30 06:20:28 -0600 )

2 Answers


answered 2012-04-30 06:18:51 -0600 by Ken

stereo_node grabs the latest image from each camera and then stamps both images with the current ROS time. That timestamp won't be all that close to the true capture time for either camera, as it's generated after the camera-to-computer transfer and the images may be delayed due to buffering or other processes.

I don't think there is a way to extract better timestamps through the Video4Linux driver as used here; I'd like to at least get a timestamp indicating when the computer finished receiving the image. The USB video class protocol does define some timing information, though, so I believe the information exists at some level.


answered 2012-05-01 20:46:38 -0600 by oleo80

updated 2012-05-08 03:16:34 -0600

I've looked at the code and seen that the frames are grabbed sequentially, but a single timestamp is taken for the whole grab operation. I think it would be better to set the left timestamp just after the left frame grab and the right one just after the right grab. This adds a small overhead, because timestamps are also set for discarded frames.


UPDATE: I'm continuing to study the synchronization error. I've modified the stereo_node code to take and publish separate timestamps for the left and right frame grabs. Let's look at the original code in /uvc_camera/src/stereo.cpp:

void StereoCamera::sendInfo(ros::Time time) {
  CameraInfoPtr info_left(new CameraInfo(left_info_mgr.getCameraInfo()));
  CameraInfoPtr info_right(new CameraInfo(right_info_mgr.getCameraInfo()));

  info_left->header.stamp = time;
  info_right->header.stamp = time;
  info_left->header.frame_id = frame;
  info_right->header.frame_id = frame;

  left_info_pub.publish(info_left);
  right_info_pub.publish(info_right);
}


void StereoCamera::feedImages() {
  unsigned int pair_id = 0;
  while (ok) {
    unsigned char *frame_left = NULL, *frame_right = NULL;
    uint32_t bytes_used_left, bytes_used_right;

    ros::Time capture_time = ros::Time::now();

    int left_idx = cam_left->grab(&frame_left, bytes_used_left);
    int right_idx = cam_right->grab(&frame_right, bytes_used_right);

    /* Read in every frame the camera generates, but only send each
     * (skip_frames + 1)th frame. This reduces effective frame
     * rate, processing time and network usage while keeping the
     * images synchronized within the true framerate.
     */
    if (skip_frames == 0 || frames_to_skip == 0) {
      if (frame_left && frame_right) {
        ImagePtr image_left(new Image);
        ImagePtr image_right(new Image);

        image_left->height = height;
        image_left->width = width;
        image_left->step = 3 * width;
        image_left->encoding = image_encodings::RGB8;
        image_left->header.stamp = capture_time;
        image_left->header.seq = pair_id;

        image_right->height = height;
        image_right->width = width;
        image_right->step = 3 * width;
        image_right->encoding = image_encodings::RGB8;
        image_right->header.stamp = capture_time;
        image_right->header.seq = pair_id;

        image_left->header.frame_id = frame;
        image_right->header.frame_id = frame;

        image_left->data.resize(image_left->step * image_left->height);
        image_right->data.resize(image_right->step * image_right->height);

        if (rotate_left)
          rotate(&image_left->data[0], frame_left, width * height);
        else
          memcpy(&image_left->data[0], frame_left, width * height * 3);

        if (rotate_right)
          rotate(&image_right->data[0], frame_right, width * height);
        else
          memcpy(&image_right->data[0], frame_right, width * height * 3);

        left_pub.publish(image_left);
        right_pub.publish(image_right);

        sendInfo(capture_time);

        ++pair_id;
      }

      frames_to_skip = skip_frames;
    } else {
      frames_to_skip--;
    }

    if (frame_left)
      cam_left->release(left_idx);
    if (frame_right)
      cam_right->release(right_idx);
  }
}

I've modified it to set two different timestamps, and I'm getting very strange results. I'm using the following code:

void StereoCamera::sendInfo(ros::Time time_left, ros::Time time_right) {
  CameraInfoPtr info_left(new CameraInfo(left_info_mgr.getCameraInfo()));
  CameraInfoPtr info_right(new CameraInfo(right_info_mgr.getCameraInfo()));

  info_left->header.stamp = time_left;
  info_right->header.stamp = time_right;
  info_left->header.frame_id = frame;
  info_right->header.frame_id = frame;

  left_info_pub.publish(info_left);
  right_info_pub.publish(info_right);
}


void StereoCamera::feedImages() {
  unsigned int pair_id = 0;
  while (ok) {
    unsigned char *frame_left = NULL, *frame_right = NULL;
    uint32_t bytes_used_left, bytes_used_right;


    ros::Time capture_time_left = ros::Time::now();
    int left_idx = cam_left->grab(&frame_left, bytes_used_left);


    ros::Time capture_time_right = ros::Time::now();
    int right_idx = cam_right->grab(&frame_right, bytes_used_right);

    /* Read in every frame the camera generates, but only send each
     * (skip_frames + 1)th frame. This reduces effective frame
     * rate, processing time and network usage while keeping the
     * images synchronized within the true framerate.
     */
    if (skip_frames == 0 || frames_to_skip == 0 ...

Comments

"Now I'm going to organize data and tomorrow I'll write them here." Do you have some results already?

Kozuch ( 2014-08-04 11:36:58 -0600 )

