Ultrasound and IR sensors vs Kinect for Robot navigation
Hi all,
I have a mobile robot and I would like it to navigate around a building. I already have a 2D map of the building, rotational encoders for odometry, and an IMU/UWB setup for localization. For navigation, the only sensors available to me are ultrasound sensors, IR sensors, and a Kinect.

I want to know which option is better for navigation (ultrasound and IR sensors, or the Kinect), given that I am aiming for reasonably good accuracy and the solution should not be very computationally expensive. In my opinion, the Kinect will do a better job, but my concern is that it might be computationally much more expensive than proximity sensors, since everything has to run on an NVIDIA Jetson TK1 board ( https://developer.nvidia.com/jetson-tk1 ). Then again, if I go with proximity sensors, I have to use a bunch of them, and I don't know how effective and efficient that will be. I am also a little worried about the dead band of the Kinect, which is around 50 cm, much larger than the dead band of the ultrasound sensors (~10-15 cm).
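To make the comparison concrete, here is roughly the kind of fusion I have in mind if I end up using the Kinect together with a few ultrasound sensors: collapse the depth image into a coarse scan and let the ultrasound readings fill in anything closer than the Kinect's ~50 cm dead band. This is only a rough Python sketch with made-up numbers, sector assignments, and fake data, not code running on the robot:

```python
import numpy as np

# Assumed sensor characteristics (taken from the question, not measured on my robot).
KINECT_MIN_RANGE_M = 0.5       # Kinect dead band (~50 cm)
ULTRASOUND_MIN_RANGE_M = 0.15  # ultrasound dead band (~10-15 cm)

def depth_image_to_scan(depth_m, num_beams=64):
    """Collapse a Kinect depth image (HxW, metres) into a 1D 'pseudo-scan':
    for each horizontal sector, keep the closest valid depth. Readings inside
    the dead band come back as 0 from the sensor and are ignored here."""
    h, w = depth_m.shape
    band = depth_m[h // 3: 2 * h // 3, :]   # middle rows ~ obstacle height
    scan = np.full(num_beams, np.inf)
    for i in range(num_beams):
        sector = band[:, i * w // num_beams:(i + 1) * w // num_beams]
        valid = sector[sector >= KINECT_MIN_RANGE_M]
        if valid.size:
            scan[i] = valid.min()
    return scan

def fuse_with_ultrasound(scan, ultrasound_ranges, sectors):
    """Overwrite pseudo-scan sectors with ultrasound readings when the ultrasound
    sees something closer than the Kinect can (inside its dead band)."""
    fused = scan.copy()
    for rng, (lo, hi) in zip(ultrasound_ranges, sectors):
        if ULTRASOUND_MIN_RANGE_M <= rng < fused[lo:hi].min():
            fused[lo:hi] = rng
    return fused

if __name__ == "__main__":
    fake_depth = np.full((480, 640), 3.0)   # stand-in for a Kinect frame (metres)
    fake_depth[200:280, 300:340] = 0.0      # obstacle inside the dead band reads as 0
    scan = depth_image_to_scan(fake_depth)
    fused = fuse_with_ultrasound(scan, ultrasound_ranges=[0.3], sectors=[(28, 36)])
    print(fused[28:36])                     # ultrasound fills in the 0.3 m obstacle
```

The depth downsampling step is also my main computational worry on the TK1: I am not sure whether processing a full 640x480 depth frame every control cycle will be fast enough.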
Any guidance regarding this will be appreciated.
Update 1: Can the Kinect sensor be used for mobile robot navigation when there is a glass wall? I think it cannot, but I am not sure.
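If the Kinect really cannot see glass, the workaround I am imagining is to treat a missing depth return as unknown rather than free space and let the ultrasound echo decide; again, just an illustrative sketch with assumed values, not tested code:

```python
import numpy as np

NO_RETURN = 0.0  # assume the Kinect reports 0 where it gets no depth (glass, out of range)

def classify_sector(kinect_sector_depths, ultrasound_range_m, ultrasound_max_range_m=4.0):
    """Very rough per-sector decision (illustrative only):
    - Kinect sees a surface        -> trust the Kinect range.
    - Kinect sees nothing (glass?) -> fall back on the ultrasound echo.
    - Neither sees anything        -> treat as free out to the ultrasound max range."""
    valid = kinect_sector_depths[kinect_sector_depths > NO_RETURN]
    if valid.size:
        return ("obstacle", float(valid.min()))
    if ultrasound_range_m < ultrasound_max_range_m:
        return ("obstacle (kinect-blind, e.g. glass)", ultrasound_range_m)
    return ("free", ultrasound_max_range_m)

# e.g. a glass wall 1.2 m ahead: Kinect returns nothing, ultrasound echoes at 1.2 m
print(classify_sector(np.zeros(50), ultrasound_range_m=1.2))
```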
Thanks in advance.
Naman