Simulating real camera data
Hi guys,
Is there a way I can simulate camera data, similar to what rosbag does? I tried rosbag, but the bag was missing tf data, so that didn't work.
Is it as simple as adding another topic to record? Looking through the available topics, I don't see anything that looks like what I need.
In essence, what I want to do is record the data stream of my cameras and pass it to my perception pipeline, so I can start working on making the robot move to a pick location. I'd like to work at my desk if possible, not at the robot, since there isn't much room for my laptop there.
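For reference, the kind of recording I mean would look roughly like this. The topic names are just guesses based on a typical RealSense-style setup; the actual ones would come from `rostopic list`:

```shell
# Record the camera streams plus TF so playback can reconstruct the frames.
# Topic names below are assumptions -- check `rostopic list` on the robot.
rosbag record -O cameras.bag \
    /tf /tf_static \
    /camera1/color/image_raw \
    /camera1/color/camera_info \
    /camera1/aligned_depth_to_color/image_raw
```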
Thanks
Did you record the tf topic in the bag file?
Yes, the error that I'm getting in RVIZ is: "Transform [sender=unknown_publisher] for frame [camera1_color_optical_frame]: Frame [camera1_color_optical_frame] does not exist."
Or does this mean something else, maybe?
Have you tried recording every topic and playing that back to see if that works?
I have not tried that; let me give it a try real quick and get back to you.
When I record everything with rosbag's -a flag, it crashes my software and it won't run. I went back and re-recorded the cameras and the tf, and increased the buffer so it stopped filling up, but that still didn't work either.
That message means the TF (http://wiki.ros.org/tf) between those camera frames isn't being published. The transform between two cameras is most likely a static one, so I think you need to record both /tf and /tf_static. Also run rostopic echo /tf_static and check that the transform between those cameras shows up in the output.
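A quick sketch of how to check and play that back (the bag filename is just an example):

```shell
# Verify the tf topics actually made it into the bag before playing it back.
rosbag info cameras.bag

# Confirm the static transforms between the camera frames are published:
rostopic echo -n 1 /tf_static

# Play back with simulated time so downstream nodes use the bag's timestamps
# (remember to set the /use_sim_time parameter for those nodes).
rosbag play --clock cameras.bag
```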
Sorry it took so long to reply; your suggestion was correct. Thanks!