
Raspberry Pi openni USB interface not supported

asked 2013-05-15 21:16:39 -0600 by cyrus

updated 2014-04-20 14:09:20 -0600 by ngrennan

I have managed to successfully install ROS Groovy and OpenNI on my Raspberry Pi. Unfortunately, when I connect my Asus Xtion to the Raspberry Pi (through a powered hub), it does detect the sensor, but I receive the error "USB interface not supported". Is there any way I can fix this error?


3 Answers


answered 2013-05-16 04:24:42 -0600 by kalectro

updated 2013-05-19 07:31:14 -0600

I did not get it to work with OpenNI either, but OpenNI2 works with some minor changes. It should compile out of the box from my repo using make.

I also wrote an openni2_camera package which runs on the Raspberry Pi and can stream the depth image as a ROS message at 30 fps (when overclocked).

After OpenNI2 has built successfully and catkin_make has run through, navigate into OpenNI2/Bin/ReleaseX and call rosrun openni2_camera openni2_camera_node from there; otherwise the node will not find the OpenNI2 libraries.
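The steps above can be sketched as the following command sequence. This is a sketch only: the repository URL, the catkin workspace path, and the exact Bin subdirectory name are assumptions (on ARM the output directory may be named differently than "ReleaseX"), so adjust them to your setup.

```shell
# Sketch of the build-and-run steps above; repo URL, workspace path,
# and the Bin subdirectory name are assumptions.
git clone https://github.com/kalectro/OpenNI2.git
cd OpenNI2 && make              # build the patched OpenNI2 libraries

cd ~/catkin_ws
catkin_make                     # build the openni2_camera ROS package

# The node must be started from the OpenNI2 binary directory,
# otherwise it cannot locate the OpenNI2 driver libraries.
cd ~/OpenNI2/Bin/ReleaseX       # substitute your platform's release dir
rosrun openni2_camera openni2_camera_node
```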



I will definitely try this out soon. Thanks

cyrus ( 2013-05-16 10:44:58 -0600 )

Does it work only with the Asus Xtion, or also with the Kinect?

danielq ( 2013-05-17 09:14:07 -0600 )

I did not get it to work with the Kinect.

kalectro ( 2013-05-18 20:22:35 -0600 )

@kalectro: maybe your openni2 has a problem with calibration. I cannot set my own calibration; it still uses the default one. I am working on Linux ARM, and on ARM the depth resolution is only 160x120, while RGB is 320x240. Do you know how I can fix this?

domikilo ( 2014-03-14 22:25:13 -0600 )

answered 2013-10-14 15:21:28 -0600 by John Setzer

I have generated a remote point cloud. Piggybacking on Kalectro's work, I was able to stream the depth image to a laptop (acting as the ROS master): I copied the openni_launch files, reduced them to just depth processing in the processing include launch file, and changed the camera parameter to openni2_camera. My twist was transporting the stream off my Mike Ferguson Arduino-based robot: a second Raspberry Pi acts as a wireless bridge, connected by a cat5 crossover cable to the Xtion-connected Pi (because that Pi's USB is too bogged down to also handle an 802.11 USB adapter). Thanks Mike and Kalectro for a cool Raspberry Pi Turtlebot-style navigation-capable robot!


answered 2013-05-19 06:19:35 -0600 by cyrus

So kalectro, I tried to run your openni2_camera_node and I get the following errors. Where do you think I'm going wrong here?

[ INFO] [1368968043.171416263]: creating image_transport... this might take a while...
[ERROR] [1368968049.678261385]: Tried to advertise a service that is already advertised in this node [/openni2_camera/set_camera_info]
[ INFO] [1368968049.696679863]: using default calibration URL
[ INFO] [1368968049.716394302]: camera calibration URL: file:///home/pi/.ros/camera_info/rgb.yaml
[ERROR] [1368968049.727926974]: Unable to open camera calibration file [/home/pi/.ros/camera_info/rgb.yaml]
[ WARN] [1368968049.743592530]: Camera calibration file /home/pi/.ros/camera_info/rgb.yaml not found.
[ WARN] [1368968049.755016205]: Using default parameters for RGB camera calibration.
[ INFO] [1368968049.771172746]: using default calibration URL
[ INFO] [1368968049.786680305]: camera calibration URL: file:///home/pi/.ros/camera_info/depth.yaml
[ERROR] [1368968049.790892185]: Unable to open camera calibration file [/home/pi/.ros/camera_info/depth.yaml]
[ WARN] [1368968049.802418858]: Camera calibration file /home/pi/.ros/camera_info/depth.yaml not found.
[ WARN] [1368968049.825350207]: Using default parameters for IR camera calibration.
[ERROR] [1368968049.832703998]: Device could not be initialized because     Found no files matching './OpenNI2/Drivers/lib*.so'
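The last error in the log is the important one: the driver lookup uses a path relative to the current working directory, so the node only finds ./OpenNI2/Drivers/lib*.so when started from the OpenNI2 binary directory. A minimal illustration of that relative-path behaviour (using a dummy file under /tmp, not the real driver):

```shell
# Illustration only: relative globs resolve against $PWD, which is why
# openni2_camera_node must be run from OpenNI2/Bin/<release dir>.
mkdir -p /tmp/openni2_demo/OpenNI2/Drivers
touch /tmp/openni2_demo/OpenNI2/Drivers/libDummy.so

cd /tmp                        # wrong working directory: glob matches nothing
ls ./OpenNI2/Drivers/lib*.so 2>/dev/null || echo "no drivers found"

cd /tmp/openni2_demo           # correct working directory: glob matches
ls ./OpenNI2/Drivers/lib*.so
```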


I updated my answer, but this information, with a little more explanation, should also be in the README file in the openni2_camera repo.

kalectro ( 2013-05-19 07:33:32 -0600 )

I did read it; it was a bit confusing, but now I understand and it works! :) Thank you kalectro. I've been spending weeks trying to fix this!

cyrus ( 2013-05-19 14:11:15 -0600 )

I was confused as well, which might be the reason the documentation does not make that much sense. I am glad it now works for you. Would you be so kind as to edit the README to make it clearer? I would then merge the changes.

kalectro ( 2013-05-19 14:17:58 -0600 )

Alright, I did that; I hope it's alright. So kalectro, can you tell me how to produce and view a point cloud in rviz from the depth images your openni2_camera_node streams? I am not very familiar with what it takes in the background to achieve that. With openni.launch, getting point clouds was simple.

cyrus ( 2013-05-20 08:57:36 -0600 )

I have not produced point clouds yet, but this should be possible using PCL. Feel free to adapt the code from openni_camera to also output a point cloud; I think this could be useful for other people as well.

kalectro ( 2013-05-20 09:01:21 -0600 )
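For reference, stock ROS also ships a depth_image_proc nodelet that can do the depth-to-point-cloud conversion on the receiving machine, so the camera node itself does not need to change. A hedged sketch; the topic names are assumptions and must be remapped to whatever openni2_camera actually publishes:

```shell
# Sketch: convert a streamed depth image to a point cloud with the
# standard depth_image_proc nodelet. Topic names are assumptions.
rosrun nodelet nodelet standalone depth_image_proc/point_cloud_xyz \
  image_rect:=/camera/depth/image_raw \
  camera_info:=/camera/depth/camera_info \
  points:=/camera/depth/points
# Then add a PointCloud2 display in rviz on /camera/depth/points.
```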

I managed to run openni_launch successfully on the Raspberry Pi now. Follow the steps at this link to install the Xtion drivers on the Pi.

cyrus ( 2013-06-15 23:34:09 -0600 )

@cyrus are you able to help me install openni_launch on the RPi please?

boog ( 2013-08-08 13:32:23 -0600 )

Hi boog. Unfortunately, I installed openni_launch months ago on ROS Groovy, so I barely remember what I did. What I can tell you about openni_launch is that I managed to get depth images from the Xtion, but at a lower frame rate; openni_launch crashes when I try to get RGB images from the camera. So I suggest you maybe follow what Kalectro has done and use that. Also, generating point clouds on the Pi itself is extremely slow; I let that processing happen on my computer rather than the Pi.

cyrus ( 2013-08-25 00:56:30 -0600 )
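Offloading the processing as described can be set up with standard ROS networking; a sketch, assuming roscore runs on a machine reachable as `laptop` (hostname and topic layout are assumptions):

```shell
# On the Pi: run only the lightweight camera driver, pointed at the
# laptop's ROS master. 'laptop' is an assumed hostname.
export ROS_MASTER_URI=http://laptop:11311
export ROS_IP=$(hostname -I | awk '{print $1}')   # advertise the Pi's own IP
rosrun openni2_camera openni2_camera_node

# On the laptop: roscore plus the heavy depth processing and rviz run
# here, so the Pi only has to ship depth images over the network.
```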


Asked: 2013-05-15 21:16:39 -0600

Seen: 3,560 times

Last updated: Oct 14 '13