RGB and Depth sensor_msgs::Image vs XYZRGB sensor_msgs::PointCloud2
Hello,
So I have ROS running across multiple machines on a limited network, and I have two options: transfer an XYZRGB sensor_msgs::PointCloud2 directly, or transfer two sensor_msgs::Image messages (one RGB and one depth) and assemble the cloud remotely from the RGBD data and the camera intrinsics. The assembled cloud is identical to the original sensor_msgs::PointCloud2 (I checked). From the data I collected, transferring the two sensor_msgs::Image messages was 85% faster even in the worst case, and I don't know why.
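For context, here is my back-of-envelope size comparison (a sketch only; it assumes a 640x480 organized cloud, PCL's 32-byte padded XYZRGB point layout, an rgb8 color image, and a 16UC1 depth image, which may not match every setup):

```python
# Rough per-message wire-size estimate (sketch; assumes 640x480 resolution,
# a 32-byte point_step for XYZRGB, rgb8 color, and 16UC1 depth encodings).

WIDTH, HEIGHT = 640, 480
PIXELS = WIDTH * HEIGHT

# pcl::PointXYZRGB is padded to 32 bytes (x, y, z floats, padding, and the
# packed rgb float), so the PointCloud2 point_step is typically 32.
POINT_STEP = 32
cloud_bytes = PIXELS * POINT_STEP

# rgb8 image: 3 bytes/pixel; 16UC1 depth image: 2 bytes/pixel.
rgb_bytes = PIXELS * 3
depth_bytes = PIXELS * 2
images_bytes = rgb_bytes + depth_bytes

print(cloud_bytes)                  # cloud payload in bytes
print(images_bytes)                 # combined image payload in bytes
print(cloud_bytes / images_bytes)   # size ratio, cloud vs. images
```

Under these assumptions the raw cloud payload is several times larger than the two images combined, which is roughly in line with the timing difference I measured.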
Is there any reason why it would be faster?
Thanks in advance.