Navigation of wall-climbing robot

asked 2021-03-20 14:44:32 -0500


Hello everybody,

I have a question regarding the navigation of my mobile robot and would be really grateful if you could steer me in the right direction. First, I'd like to describe my robot and situation. I am building a wall-climbing robot for bridge inspection. The robot will be equipped with GPR (georadar) and will have to follow grid-like trajectories. Below you can find examples of such trajectories; they are simple straight-line movements.

[Figure: example grid-like inspection trajectories (straight-line passes) on a bridge pillar]

The robot is equipped with odometry from wheel encoders, an IMU, and optical flow sensors. The plan is to improve odometry by fusing these sensors with an EKF (robot_localization package). Regarding SLAM and the use of LiDAR, unfortunately I see no point in that: once the robot is at some height above the bottom of the pillar (e.g. 15 m), it is not surrounded by any features. So, for the preliminary measurements, the plan is to position the robot at an arbitrary location on the bridge pillar, set that as the robot's starting point, and then have it move in the grid-like pattern described above, based only on odometry from that starting point. My question is: can I use the navigation stack if I only have odometry and no LiDAR/SLAM/map? I don't need obstacle avoidance for these preliminary tests. Or is the navigation stack unnecessary and do I instead need some kind of trajectory generator; if so, is there a ROS package that already implements one? The idea is to provide the trajectory planner with an array of setpoints and have it execute the straight-line movements while respecting velocity limits (see the sketch below).
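As a rough illustration of the second option, here is a minimal sketch of a standalone setpoint follower that drives straight lines between waypoints using only the fused odometry. It assumes ROS1 with rospy, that the EKF output is published on /odom, and that the base accepts /cmd_vel; the topic names, gains, tolerances, and velocity limits are placeholders to adapt, not a definitive implementation.

    #!/usr/bin/env python
    # Minimal straight-line setpoint follower sketch.
    # Assumptions: ROS1, /odom carries the fused pose, /cmd_vel drives the base.
    import math
    import rospy
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import Twist
    from tf.transformations import euler_from_quaternion

    MAX_LIN = 0.10    # m/s, assumed velocity limit
    MAX_ANG = 0.30    # rad/s, assumed turn rate limit
    TOLERANCE = 0.05  # m, distance at which a setpoint counts as reached

    class SetpointFollower(object):
        def __init__(self, waypoints):
            self.waypoints = waypoints  # list of (x, y) in the odom frame
            self.pose = None
            rospy.Subscriber('/odom', Odometry, self.odom_cb)
            self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)

        def odom_cb(self, msg):
            q = msg.pose.pose.orientation
            _, _, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])
            p = msg.pose.pose.position
            self.pose = (p.x, p.y, yaw)

        def run(self):
            rate = rospy.Rate(20)
            for gx, gy in self.waypoints:
                while not rospy.is_shutdown():
                    if self.pose is None:
                        rate.sleep()
                        continue
                    x, y, yaw = self.pose
                    dist = math.hypot(gx - x, gy - y)
                    if dist < TOLERANCE:
                        break
                    err = math.atan2(gy - y, gx - x) - yaw
                    err = math.atan2(math.sin(err), math.cos(err))  # normalize
                    cmd = Twist()
                    # turn in place first, then drive straight toward the setpoint
                    if abs(err) > 0.1:
                        cmd.angular.z = max(-MAX_ANG, min(MAX_ANG, 1.5 * err))
                    else:
                        cmd.linear.x = min(MAX_LIN, 0.5 * dist + 0.02)
                        cmd.angular.z = max(-MAX_ANG, min(MAX_ANG, 1.0 * err))
                    self.cmd_pub.publish(cmd)
                    rate.sleep()
            self.cmd_pub.publish(Twist())  # stop when the last setpoint is reached

    if __name__ == '__main__':
        rospy.init_node('grid_setpoint_follower')
        # example grid pass: one vertical sweep, a short lateral move, sweep back
        pts = [(0.0, 2.0), (0.3, 2.0), (0.3, 0.0)]
        SetpointFollower(pts).run()

For straight grid passes without obstacle avoidance, a dedicated follower like this is often simpler than configuring the full navigation stack, but either approach can run on odometry alone for the preliminary tests.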

And if you have any suggestions regarding which other sensors to use, please let me know, as this is a specific case. For now we only see odometry (from encoders, IMU, and optical flow) as the main source of localization, but as we know, this is prone to drift over time, especially on a wall-climbing robot where there will be a lot of vibration (bad for the IMU) and wheel slippage (bad for the encoders). For now we have concluded: LiDAR can't be used; the GPS signal will be weak under the bridge; visual odometry can't be used (not many features); and marker-based localization defeats the purpose of using the robot in the first place. A sketch of how the optical-flow velocity could be fed into the EKF follows below.
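To make the optical flow usable by robot_localization, its velocity reading can be republished as a geometry_msgs/TwistWithCovarianceStamped and listed as a twistN input in the EKF configuration. The sketch below assumes a hypothetical driver topic /optical_flow/velocity carrying body-frame velocities as a Vector3Stamped; the output topic /optical_flow/twist, the base_link frame, and the covariance values are all placeholders to adapt.

    #!/usr/bin/env python
    # Sketch: republish optical-flow body-frame velocity for robot_localization.
    import rospy
    from geometry_msgs.msg import Vector3Stamped, TwistWithCovarianceStamped

    def flow_cb(msg):
        out = TwistWithCovarianceStamped()
        out.header.stamp = msg.header.stamp
        out.header.frame_id = 'base_link'         # body frame of the robot
        out.twist.twist.linear.x = msg.vector.x   # forward velocity from flow
        out.twist.twist.linear.y = msg.vector.y   # lateral velocity from flow
        # rough static covariance; tune against slippage/vibration in practice
        cov = [0.0] * 36
        cov[0] = cov[7] = 0.02
        out.twist.covariance = cov
        pub.publish(out)

    if __name__ == '__main__':
        rospy.init_node('flow_to_twist')
        pub = rospy.Publisher('/optical_flow/twist',
                              TwistWithCovarianceStamped, queue_size=10)
        rospy.Subscriber('/optical_flow/velocity', Vector3Stamped, flow_cb)
        rospy.spin()

Because the flow velocity is independent of wheel slippage, giving it a tighter covariance than the wheel odometry lets the EKF lean on it when the wheels slip on the pillar surface.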

Thank you, everybody!
