GStreamer appsink buffer to ROS CompressedImage
Hi, I'm building this GStreamer pipeline:
v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 ! jpegdec ! videoconvert ! appsink
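In my node I create the elements individually (that is what the CustomData struct below holds), but the same pipeline can also be built from this launch string with gst_parse_launch. Here is a minimal equivalent sketch, not my exact setup code:

#include <gst/gst.h>

// Simplified sketch: build the pipeline from the gst-launch string instead of
// creating the elements one by one. Assumes gst_init() has already been called.
// The appsink is given the name "sink" so it can be retrieved later.
GstElement* build_pipeline() {
  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch(
      "v4l2src device=/dev/video0 ! image/jpeg,width=1280,height=720,framerate=30/1 "
      "! jpegdec ! videoconvert ! appsink name=sink",
      &error);
  if (!pipeline) {
    g_printerr("Failed to build pipeline: %s\n", error->message);
    g_error_free(error);
    return NULL;
  }
  return pipeline;
}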
Then I want to publish the frames read from the appsink on a CompressedImage topic, so I use a GStreamer callback for the "new-sample" signal:
/* The appsink has received a buffer */
static GstFlowReturn new_sample (GstElement *sink, CustomData *data) {
  GstSample *sample;
  /* Retrieve the sample */
  cout << "Pulling sample from: " << gst_element_get_name(sink) << endl;
  g_signal_emit_by_name (sink, "pull-sample", &sample);
  if (sample) {
    // get the buffer and the inner memory
    GstBuffer* buf = gst_sample_get_buffer(sample);
    // Stop on end of stream / invalid buffer
    if (!buf) {
      g_print("appsink buffer is not ok");
      gst_sample_unref(sample);
      return GST_FLOW_ERROR;
    }
    GstMemory *memory = gst_buffer_get_memory(buf, 0);
    GstMapInfo info;
    gst_memory_map(memory, &info, GST_MAP_READ);
    gsize &buf_size = info.size;
    guint8* &buf_data = info.data;

    // Copy the mapped bytes into the ROS message and publish it
    sensor_msgs::CompressedImage msg;
    msg.format = "jpeg";
    msg.data.resize(buf_size);
    cout << "Buffer Size: " << buf_size << endl;
    std::copy(buf_data, buf_data + buf_size, msg.data.begin());
    data->img_pub.publish(msg);

    gst_memory_unmap(memory, &info);
    gst_memory_unref(memory);
    gst_sample_unref (sample);
    return GST_FLOW_OK;
  }
  return GST_FLOW_ERROR;
}
Where "data" is a struct:
/* Structure to contain all our information, so we can pass it to callbacks */
typedef struct CustomData {
  GstElement *pipeline;
  GstElement *v4l2src;       // v4l2src element
  GstElement *caps_filter_1; // capsfilter element
  GstElement *jpeg_dec;      // jpegdec element
  GstElement *video_convert; // videoconvert element
  GstElement *app_sink;      // appsink element
  ros::NodeHandle nh;
  ros::Publisher img_pub;
  ros::Subscriber img_sub;
} CustomData;
The publisher is declared in this way:
data.img_pub = data.nh.advertise<sensor_msgs::CompressedImage>("/test_cam/image_raw/compressed", 1);
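The appsink is wired to the callback roughly like this (a simplified sketch of the relevant calls, not my complete setup code):

// Enable signal emission on the appsink (it is off by default) and attach the
// "new-sample" callback, passing the CustomData struct as user data.
g_object_set(data.app_sink, "emit-signals", TRUE, NULL);
g_signal_connect(data.app_sink, "new-sample", G_CALLBACK(new_sample), &data);

// Start the pipeline.
gst_element_set_state(data.pipeline, GST_STATE_PLAYING);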
Then I need to convert the CompressedImage messages to OpenCV images, and I'm using cv_bridge for that. In a Python script, I subscribe to /test_cam/image_raw/compressed and use this callback:
def cbk_compressed(msg):
    frame = bridge.compressed_imgmsg_to_cv2(msg)
    ...
But the obtained frame is None.
Also, for debugging, I subscribe to /test_cam/image_raw/compressed from the node that publishes the CompressedImage messages, using this callback:
void compressed_image_cb(const sensor_msgs::CompressedImageConstPtr& msg) {
  cout << "RECEIVED COMPRESSED IMAGE MSG" << endl;
  cv_bridge::CvImagePtr cv_ptr;
  cout << "RECEIVED BUFFER SIZE: " << msg->data.size() << endl;
  try {
    cv_ptr = cv_bridge::toCvCopy(msg, sensor_msgs::image_encodings::BGR8);
  }
  catch (cv_bridge::Exception& e) {
    ROS_ERROR("cv_bridge exception: %s", e.what());
  }
}
But when the callback is executed, I get this error:
RECEIVED COMPRESSED IMAGE MSG
RECEIVED BUFFER SIZE: 1382400
terminate called after throwing an instance of 'cv::Exception'
what(): OpenCV(4.2.0) ../modules/imgproc/src/color.cpp:182: error: (-215:Assertion failed) !_src.empty() in function 'cvtColor'
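As far as I understand, cv_bridge decodes a CompressedImage by calling cv::imdecode on msg->data, so the same check can be reproduced directly with something like this (debugging sketch, not part of my node):

#include <opencv2/imgcodecs.hpp>

// Inside compressed_image_cb: try decoding the received bytes directly.
// An empty Mat here would mean cv::imdecode does not recognize the payload
// (e.g. it is not actual JPEG data), which is what the cvtColor assertion
// seems to indicate.
cv::Mat decoded = cv::imdecode(cv::Mat(msg->data), cv::IMREAD_COLOR);
cout << "Decoded image is " << (decoded.empty() ? "EMPTY" : "ok") << endl;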
I searched everywhere but I can't find a solution. Also, I'm using the same method that gscam uses here to send compressed image messages: https://github.com/ros-drivers/gscam/blob/1b0b8e51b91522edadd8399d4ad920f8afbc487d/src/gscam.cpp#L339 (I also tried with sensor_msgs::CompressedImagePtr, but it doesn't change anything).
Anyone have some suggestions?
I'm doing this because I have to build a bigger pipeline in which I read two cameras and publish their frames (unmodified), plus one frame that is the result of some processing on both. I am not using two gscam instances because I don't want the overhead of publishing the frames with gscam, subscribing to them, and then processing them. I need the best FPS available, with minimum delay between reading the two cameras and publishing the frames.
Thank you!
Asked by davide.cremona on 2022-05-04 11:53:02 UTC
Comments
The fact that the failure is in cvtColor suggests that there's a problem with your GStreamer pipeline. Have you gotten the pipeline working with gscam? I know that this rather simple pipeline works in ROS 2; it should also work in ROS 1: https://github.com/ros-drivers/gscam/blob/ros2/examples/v4ljpeg.launch.xml
Asked by clyde on 2022-06-22 11:04:58 UTC