Yes, I am. I managed to work around the problem for now using the republish node from image_transport.

My code was pretty similar to the tutorial examples. My subscriber declaration:

_frameSubscriber = image_transport::ImageTransport(_nh).subscribe("/vision/image", 5, &WebVision::grabFrameFromMsg, this);

And implementation:

void WebVision::grabFrameFromMsg(const sensor_msgs::ImageConstPtr& msg)
{
    sensor_msgs::CvBridge bridge;

    frame = bridge.imgMsgToCv(msg, "bgr8");
    frameTimestamp = msg->header.stamp;

    /* rest of the code, where I pass the IplImage* frame to my functions */
}

So the idea was to record bag files from the camera input while testing, and then play the "video" back as input to the algorithms so I could test them offline.

But CvBridge only accepts raw image messages, and I couldn't afford the disk space to store long raw videos.
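To keep the bags small, I record only the compressed stream instead of the raw one. This is a sketch assuming the compressed plugin of image_transport is active for my camera driver, so the compressed images appear on /vision/image/compressed (the bag file name is just an example):

```shell
# Record only the compressed image stream (much smaller than raw).
# Assumes image_transport's compressed plugin is loaded, so the camera
# publishes sensor_msgs/CompressedImage on /vision/image/compressed.
rosbag record -O vision_test.bag /vision/image/compressed
```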

What I do now is to just run this:

$ rosrun image_transport republish compressed in:=/vision/image _image_transport:=compressed raw out:=/vision/image

and have my node listen to the republished topic, which carries raw images. It would be nicer to do the conversion inside the node itself, but this solves my problem for now.
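For reference, the conversion can also be kept inside the node by passing a TransportHints to the image_transport subscriber, so no external republish node is needed. This is an untested sketch following the class and topic names from my code above; image_transport then subscribes to the compressed topic and delivers decoded raw images to the callback:

```cpp
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <sensor_msgs/CvBridge.h>

class WebVision
{
public:
  explicit WebVision(ros::NodeHandle& nh) : _it(nh)
  {
    // TransportHints("compressed") asks image_transport to use the
    // compressed transport (i.e. subscribe to /vision/image/compressed)
    // and hand the callback already-decoded sensor_msgs/Image messages.
    _frameSubscriber = _it.subscribe(
        "/vision/image", 5, &WebVision::grabFrameFromMsg, this,
        image_transport::TransportHints("compressed"));
  }

  void grabFrameFromMsg(const sensor_msgs::ImageConstPtr& msg)
  {
    sensor_msgs::CvBridge bridge;
    IplImage* frame = bridge.imgMsgToCv(msg, "bgr8");
    // ... pass frame to the vision functions ...
  }

private:
  // Keeping the ImageTransport as a member (instead of a temporary)
  // ties its lifetime to the node object.
  image_transport::ImageTransport _it;
  image_transport::Subscriber _frameSubscriber;
};
```

Note that "compressed" here is only the default: the ~image_transport parameter can still override it at launch time, which makes switching between live raw input and compressed bag playback a one-parameter change.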

Thanks anyway for giving it a try. :)