The relation between point cloud and RGB data in Gazebo

asked 2022-02-18 09:54:31 -0500

kankanzheli

updated 2022-02-18 20:03:40 -0500

Hello everyone, I designed a robot in Gazebo. The robot is equipped with a Kinect depth camera; that is, the plugin "libgazebo_ros_openni_kinect.so" is loaded.

Does the output RGB image correspond to the point cloud? (For example, does the top-left corner of the point cloud correspond to the top-left pixel of the RGB image?)

Please help me


1 Answer


answered 2022-02-18 14:44:32 -0500

shonigmann

I believe the same frame is used as the origin of both the depth and RGB sensors.

If you need a more realistic model that includes the baseline between the different sensors (e.g., the IR and RGB cameras), consider looking at these models for RealSense and Kinect cameras. They define individual frames/plugins for each sensor.
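One quick way to check this on your own setup is to compare the frame_id stamped in the headers of the image and point cloud messages. A minimal rospy sketch, assuming topic names like the ones below (adjust them to whatever your plugin actually publishes):

```python
#!/usr/bin/env python
# Minimal check: print the frame_id carried by the RGB image and by the
# point cloud. If the Gazebo plugin uses a single frame for both sensors,
# the two frame_ids should match. Topic names below are assumptions.
import rospy
from sensor_msgs.msg import Image, PointCloud2

def frame_of(topic, msg_type):
    msg = rospy.wait_for_message(topic, msg_type, timeout=10.0)
    rospy.loginfo("%s -> frame_id: %s", topic, msg.header.frame_id)
    return msg.header.frame_id

if __name__ == "__main__":
    rospy.init_node("kinect_frame_check")
    rgb_frame = frame_of("/rgb/image_raw", Image)
    cloud_frame = frame_of("/depth_registered/points", PointCloud2)
    if rgb_frame == cloud_frame:
        rospy.loginfo("Image and cloud are stamped with the same frame.")
    else:
        rospy.loginfo("Frames differ; check the transform between them in tf.")
```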


Comments

About the Kinect camera I'm using: I want to confirm whether the data from "/rgb/image_raw" and "/depth_registered/points" correspond. That is, is "/depth_registered/points" equivalent to the image from "/rgb/image_raw" with x, y and z coordinates added to the color information?

kankanzheli ( 2022-02-18 19:55:49 -0500 )
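As far as I know, the cloud published by the Gazebo Kinect plugin on "/depth_registered/points" should be organized (its width and height match the image), so the point at row v, column u should carry the same color as pixel (u, v) of "/rgb/image_raw", with x, y, z added. A small sketch to confirm this on a single test pixel; the chosen ROW/COL, the rgb8 image encoding and the topic names are assumptions taken from the comment above:

```python
#!/usr/bin/env python
# Compare the color of one pixel in /rgb/image_raw with the packed rgb value
# of the corresponding point in /depth_registered/points. ROW/COL, the rgb8
# encoding and the topic names are assumptions; adapt them to your setup.
import struct
import rospy
from sensor_msgs.msg import Image, PointCloud2
import sensor_msgs.point_cloud2 as pc2

ROW, COL = 240, 320  # arbitrary test pixel (v, u)

def main():
    rospy.init_node("rgb_cloud_correspondence_check")
    img = rospy.wait_for_message("/rgb/image_raw", Image)
    cloud = rospy.wait_for_message("/depth_registered/points", PointCloud2)

    # An organized cloud should have the same dimensions as the image.
    rospy.loginfo("image: %dx%d   cloud: %dx%d",
                  img.width, img.height, cloud.width, cloud.height)

    # Color of the test pixel in the image (assuming rgb8 encoding).
    offset = ROW * img.step + COL * 3
    r_i, g_i, b_i = struct.unpack("BBB", img.data[offset:offset + 3])

    # x, y, z and packed color of the same (u, v) location in the cloud.
    x, y, z, rgb_f = next(pc2.read_points(
        cloud, field_names=("x", "y", "z", "rgb"), uvs=[(COL, ROW)]))
    packed = struct.unpack("I", struct.pack("f", rgb_f))[0]
    r_c, g_c, b_c = (packed >> 16) & 0xFF, (packed >> 8) & 0xFF, packed & 0xFF

    rospy.loginfo("image pixel (r,g,b): (%d, %d, %d)", r_i, g_i, b_i)
    rospy.loginfo("cloud point (r,g,b): (%d, %d, %d) at xyz (%.3f, %.3f, %.3f)",
                  r_c, g_c, b_c, x, y, z)

if __name__ == "__main__":
    main()
```

If the two color triplets match (for a pixel with valid depth), the point cloud is indeed the RGB image augmented with per-pixel 3D coordinates.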
