Why can't I visualise the point cloud when I run example.cpp from the ros_pcl tutorial?
I just tried the first ros_pcl tutorial. Everything goes well up to catkin_make, but after launching the camera node and running the command `rosrun my_pcl_tutorial example input:=/camera/depth/points`, it shows nothing. Running `rosrun rviz rviz` also shows nothing. I am using a Kinect for Xbox with ROS Indigo on Ubuntu 14.04.
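In case it matters, my example.cpp is basically the subscriber/publisher skeleton from the tutorial. This is a rough sketch from memory, so the exact includes and processing in my file may differ slightly; here the callback just republishes the incoming cloud untouched:

    #include <ros/ros.h>
    #include <sensor_msgs/PointCloud2.h>

    ros::Publisher pub;

    // Callback: republish the incoming cloud unchanged on the output topic
    void cloud_cb (const sensor_msgs::PointCloud2ConstPtr& cloud_msg)
    {
      pub.publish (*cloud_msg);
    }

    int main (int argc, char** argv)
    {
      ros::init (argc, argv, "my_pcl_tutorial");
      ros::NodeHandle nh;

      // "input" is remapped on the command line to /camera/depth/points
      ros::Subscriber sub = nh.subscribe ("input", 1, cloud_cb);

      // Publish the (unmodified) cloud on "output"
      pub = nh.advertise<sensor_msgs::PointCloud2> ("output", 1);

      ros::spin ();
    }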
I get these messages after running the freenect launch file:
[ WARN] [1465146330.631433542]: Could not find any compatible depth output mode for 1. Falling back to default depth output mode 1.
[ INFO] [1465146330.648912116]: rgb_frame_id = 'camera_rgb_optical_frame'
[ INFO] [1465146330.648995078]: depth_frame_id = 'camera_depth_optical_frame'
[ WARN] [1465146330.663642540]: Camera calibration file /home/dinesh/.ros/camera_info/rgb_A00361A05094131A.yaml not found.
[ WARN] [1465146330.663784508]: Using default parameters for RGB camera calibration.
[ WARN] [1465146330.663881921]: Camera calibration file /home/dinesh/.ros/camera_info/depth_A00361A05094131A.yaml not found.
[ WARN] [1465146330.663961395]: Using default parameters for IR camera calibration.
[ INFO] [1465146333.516454011]: Stopping device RGB and Depth stream flush.
Although these are just warnings and the Kinect is working properly, why do I get nothing when I run example.cpp? It shows neither an error nor any other message, and even after running rviz I see nothing. I just got the adapter for my Kinect and tried to run this. I had built the package earlier and there was no error at that time, but now I'm stuck.