How to use ROS Navigation stack with a legged robot?
Does the ROS Navigation stack only work for wheeled robots? Is there a different navigation stack for legged robots?
If I wanted to use the ROS Navigation stack with a legged robot, how exactly would I do it? Would I need a controller that converts the incoming Twist messages into something my legged robot understands (joint control commands)?
I'd be glad to see suggestions for existing projects or implementations.
Thanks in advance.
Asked by electrophod on 2021-07-05 05:07:55 UTC
Comments
I have no experience with legs, but I would guess: yes. Input cmd_vel, output odometry. The Nao might be an example; it seems to use some nodes from humanoid_navigation. For examples, I would search robots.ros.org. (A minimal sketch of such a cmd_vel bridge follows this comment.)
Asked by Humpelstilzchen on 2021-07-05 05:59:10 UTC
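The idea above is that Navigation keeps publishing geometry_msgs/Twist on cmd_vel, and a robot-specific node turns that into whatever the leg controller expects. Below is a minimal sketch of such a bridge in ROS 1 / rospy; the gait interface (set_body_velocity) and the 50 Hz control rate are assumptions, not part of any standard API, so replace them with whatever your leg controller actually provides.

```python
#!/usr/bin/env python
# Minimal sketch of a cmd_vel bridge for a legged robot (ROS 1 / rospy).
# The gait-generator interface (set_body_velocity) is hypothetical; swap it
# for whatever your leg controller actually exposes.
import rospy
from geometry_msgs.msg import Twist

class LeggedBaseController(object):
    def __init__(self):
        # move_base publishes velocity commands on cmd_vel by default
        self.sub = rospy.Subscriber('cmd_vel', Twist, self.cmd_cb, queue_size=1)
        self.last_cmd = Twist()

    def cmd_cb(self, msg):
        # Keep the most recent commanded body velocity; the gait generator
        # samples it at its own control rate.
        self.last_cmd = msg

    def set_body_velocity(self, vx, vy, wz):
        # Placeholder: translate the desired body twist into footstep/joint
        # targets for your specific robot.
        pass

    def spin(self):
        rate = rospy.Rate(50)  # example gait control loop rate
        while not rospy.is_shutdown():
            self.set_body_velocity(self.last_cmd.linear.x,
                                   self.last_cmd.linear.y,
                                   self.last_cmd.angular.z)
            rate.sleep()

if __name__ == '__main__':
    rospy.init_node('legged_base_controller')
    LeggedBaseController().spin()
```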
For traversal on flat ground, I think Navigation would serve you well, and I agree that you'd only need to write your own controller that subscribes to cmd_vel, plus a custom odom publisher (a sketch of one follows this comment). For more complex terrain (e.g. stairs, ramps, rocks), where legged robots are more interesting than wheeled ones, you would likely want a different environment representation than a 2D costmap; at minimum, you would probably need a custom costmap plugin and planner... but that depends on your application, I suppose.
Asked by shonigmann on 2021-07-05 11:05:07 UTC
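To complement the comments above, here is a minimal sketch of the custom odom publisher side in ROS 1 / rospy. The estimate_body_velocity() function is hypothetical; on a real legged robot it would come from leg kinematics, foot-contact detection, and/or an IMU. The frame names ('odom', 'base_link') and the 50 Hz rate are just common defaults.

```python
#!/usr/bin/env python
# Minimal sketch of an odometry publisher for a legged base (ROS 1 / rospy).
import math
import rospy
import tf
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Quaternion

def estimate_body_velocity():
    # Placeholder for state estimation from joint encoders / foot contacts / IMU.
    return 0.0, 0.0, 0.0  # vx, vy, wz in the body frame

if __name__ == '__main__':
    rospy.init_node('legged_odom_publisher')
    odom_pub = rospy.Publisher('odom', Odometry, queue_size=10)
    tf_broadcaster = tf.TransformBroadcaster()

    x = y = yaw = 0.0
    last_time = rospy.Time.now()
    rate = rospy.Rate(50)
    while not rospy.is_shutdown():
        now = rospy.Time.now()
        dt = (now - last_time).to_sec()
        last_time = now

        vx, vy, wz = estimate_body_velocity()
        # Integrate the body-frame velocity into the odom frame
        x += (vx * math.cos(yaw) - vy * math.sin(yaw)) * dt
        y += (vx * math.sin(yaw) + vy * math.cos(yaw)) * dt
        yaw += wz * dt

        q = tf.transformations.quaternion_from_euler(0.0, 0.0, yaw)

        odom = Odometry()
        odom.header.stamp = now
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        odom.pose.pose.orientation = Quaternion(*q)
        odom.twist.twist.linear.x = vx
        odom.twist.twist.linear.y = vy
        odom.twist.twist.angular.z = wz
        odom_pub.publish(odom)

        # Navigation also expects the odom -> base_link transform on /tf
        tf_broadcaster.sendTransform((x, y, 0.0), q, now, 'base_link', 'odom')
        rate.sleep()
```

With these two nodes in place, the rest of the Navigation stack (costmaps, global and local planners) can treat the legged robot like any other mobile base on flat ground, as the comments above suggest.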