# Revision history

Yes, I am. I managed to work around the problem for now using the `republish` node of image_transport.

My code was pretty similar to the tutorial examples. My subscriber declaration:

    _frameSubscriber = image_transport::ImageTransport(_nh).subscribe("/vision/image", 5, &WebVision::grabFrameFromMsg, this);


And the implementation:

    void WebVision::grabFrameFromMsg(const sensor_msgs::ImageConstPtr& msg)
    {
        sensor_msgs::CvBridge bridge;

        // The returned IplImage* points to data owned by `bridge`,
        // so it is only valid while `bridge` is in scope.
        frame = bridge.imgMsgToCv(msg, "bgr8");

        /* rest of the code where I pass the IplImage* frame to my functions */
    }


So the idea was to record bag files from the camera input while testing, and then play the "video" back as input to the algorithms to test them offline.

But CvBridge can only handle raw image messages, and I couldn't afford the disk space to record long raw videos.

What I do now is to just run this:

    $ rosrun image_transport republish compressed in:=/vision/image _image_transport:=compressed raw out:=/vision/image


and listen to the republished topic, which carries raw images. It would be better if I could do the conversion inside the code itself, but this solves my problem for now.
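For reference, the in-process alternative would look roughly like the sketch below: image_transport can decode the compressed stream transparently if the subscriber is created with a `compressed` `TransportHints`, so the callback still receives raw `sensor_msgs/Image` messages and no external `republish` node is needed. This is only a sketch under assumptions (the member layout and keeping `ImageTransport` as a class member are mine, not from the original code):

```cpp
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <sensor_msgs/Image.h>

class WebVision
{
public:
  explicit WebVision(ros::NodeHandle& nh)
    : _it(nh)
  {
    // Passing TransportHints("compressed") makes image_transport
    // subscribe to /vision/image/compressed and decompress each
    // message before invoking the callback.
    _frameSubscriber = _it.subscribe(
        "/vision/image", 5, &WebVision::grabFrameFromMsg, this,
        image_transport::TransportHints("compressed"));
  }

  void grabFrameFromMsg(const sensor_msgs::ImageConstPtr& msg)
  {
    // msg is already a raw image here; convert with CvBridge and
    // pass the frame to the processing functions as before.
  }

private:
  image_transport::ImageTransport _it;   // kept alive with the node
  image_transport::Subscriber _frameSubscriber;
};
```

With this, the recorded bags still only need the compressed topic, and the decompression happens inside the node itself.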

Thanks anyway for giving it a try. :)