Understanding image_transport

asked 2022-05-18 11:38:00 -0500


Hello,

I'm a newbie and trying to wrap my head around this: on my node (a Raspberry Pi 3B, Ubuntu 20.04, Noetic) I want to capture a video stream from the video0 device and send it to my laptop in real time. For this purpose I'm using the package video_stream_opencv, which generally works. But the bandwidth is really high. Since my nodes are only connected via 2.4 GHz WiFi, my maximum bandwidth is about 10 Mbit/s. This results in a video stream of 1-2 fps, which is far too low.

Now I read that the image_transport package should be used for image transport, and that theora_image_transport should be used for compressed image (video) transport. Fine, but how do I do this? To me these look like C++ libraries. Do I really have to build my own C++ package? (I'm more of a Python guy.)
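For what it's worth, my understanding so far (and I may be wrong) is that publishers written with image_transport, which video_stream_opencv is, advertise the compressed variants automatically once the plugin packages are installed, so no C++ would be needed. A sketch of what I think the workflow would look like, with topic names assumed from the usual video_stream_opencv defaults:

```shell
# On the Pi: install the transport plugins (Noetic package names)
sudo apt install ros-noetic-compressed-image-transport ros-noetic-theora-image-transport

# After restarting the camera node, the plugins should advertise
# subtopics next to the raw one, something like:
#   /camera/image_raw
#   /camera/image_raw/compressed
#   /camera/image_raw/theora
rostopic list | grep image_raw

# If a publisher does NOT use image_transport, it can still be wrapped
# with the republish node from image_transport:
rosrun image_transport republish raw in:=/camera/image_raw compressed out:=/camera/republished
```

Is that roughly right?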

If not, how do I use those packages to transport the images created by video_stream_opencv to my laptop?
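On the receiving side, this is the kind of plain rospy subscriber I imagine for the compressed (JPEG) variant; the topic name is assumed, and I gather theora would need a proper decoder, so compressed seems simpler from Python:

```python
# Sketch of a laptop-side viewer for the compressed stream.
# Assumes the camera publishes /camera/image_raw, so the compression
# plugin offers /camera/image_raw/compressed (sensor_msgs/CompressedImage).

def compressed_topic(base_topic):
    """image_transport advertises the compressed variant on a subtopic."""
    return base_topic.rstrip("/") + "/compressed"

def main():
    import rospy
    import numpy as np
    import cv2
    from sensor_msgs.msg import CompressedImage

    def on_image(msg):
        # msg.data is a JPEG/PNG byte buffer; decode it with OpenCV
        buf = np.frombuffer(msg.data, dtype=np.uint8)
        frame = cv2.imdecode(buf, cv2.IMREAD_COLOR)
        cv2.imshow("stream", frame)
        cv2.waitKey(1)

    rospy.init_node("compressed_viewer")
    rospy.Subscriber(compressed_topic("/camera/image_raw"),
                     CompressedImage, on_image, queue_size=1)
    rospy.spin()

if __name__ == "__main__":
    main()
```

Would something like this work, or am I missing a step?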

I'm sorry if the solution is too obvious, but I have really spent hours on this and can't find an example or a clear guide on how to use these packages.

Thanks in advance, any help is appreciated! Markus
