How to use ROS Navigation stack with a legged robot?

asked 2021-07-05 05:07:55 -0600

electrophod

Does the ROS Navigation stack only work for wheeled robots? Is there a different navigation stack for legged robots?

If I wanted to use the ROS Navigation stack with a legged robot, how exactly would I do it? Would I need a controller that converts the incoming Twist messages into something my legged robot understands (joint control commands)?

I'd be glad to see pointers to already implemented projects / implementations.





I have no experience with legs, but I would guess: yes. Input cmd_vel, output odometry. The Nao might be an example; it seems to use some nodes from humanoid_navigation. For examples, I would search there.

Humpelstilzchen  ( 2021-07-05 05:59:10 -0600 )
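To illustrate the "output odometry" half of the comment above: a legged robot has no wheel encoders, so a common fallback is dead-reckoning odometry integrated from the commanded (or estimated) body velocity, which a node would then publish as nav_msgs/Odometry and broadcast on TF. Below is a minimal, ROS-free sketch of that integration step; the `Pose2D` type and function names are my own, not from any ROS package.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose2D:
    """Planar pose: position (x, y) in metres, heading theta in radians."""
    x: float = 0.0
    y: float = 0.0
    theta: float = 0.0

def integrate_odometry(pose: Pose2D, vx: float, wz: float, dt: float) -> Pose2D:
    """Dead-reckon one timestep from body-frame forward velocity vx (m/s)
    and yaw rate wz (rad/s). Uses the midpoint heading for better accuracy
    on curved paths than naive Euler integration."""
    theta_mid = pose.theta + 0.5 * wz * dt
    return Pose2D(
        x=pose.x + vx * math.cos(theta_mid) * dt,
        y=pose.y + vx * math.sin(theta_mid) * dt,
        theta=pose.theta + wz * dt,
    )

# Example: walk straight ahead at 0.5 m/s for 2 s in 0.1 s steps.
pose = Pose2D()
for _ in range(20):
    pose = integrate_odometry(pose, vx=0.5, wz=0.0, dt=0.1)
```

In a real node this pose would be stamped and published on `/odom`; drift from foot slip is expected, which is why legged platforms usually fuse this with IMU or visual odometry.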

For traversal on flat ground, I think Navigation would serve you well, and I agree that you'd only need to write your own controller that subscribes to cmd_vel, plus a custom odom publisher. For more complex terrain (e.g. stairs, ramps, rocks), where legged robots are more interesting than wheeled ones, you would likely want a different environment representation than a 2D costmap; at minimum, you would probably need a custom costmap plugin and planner... but that depends on your application, I suppose.

shonigmann  ( 2021-07-05 11:05:07 -0600 )
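The controller the comments describe essentially maps each incoming Twist (vx, wz) to per-step gait parameters, which the leg controller then turns into joint trajectories. A hedged, ROS-free sketch of that mapping, assuming a gait with a fixed step period and hypothetical stride/turn limits (all names and numbers here are illustrative, not from any existing package):

```python
from dataclasses import dataclass

@dataclass
class StepCommand:
    """Per-step gait parameters handed to the leg controller."""
    stride: float  # forward foot displacement per step (m)
    turn: float    # body rotation per step (rad)

def twist_to_step(vx: float, wz: float, step_period: float,
                  max_stride: float = 0.10, max_turn: float = 0.30) -> StepCommand:
    """Convert a cmd_vel-style command (vx in m/s, wz in rad/s) into one
    step's displacement, clamped to the gait's kinematic limits."""
    stride = max(-max_stride, min(max_stride, vx * step_period))
    turn = max(-max_turn, min(max_turn, wz * step_period))
    return StepCommand(stride=stride, turn=turn)

# A cmd_vel callback would call this each cycle, e.g.:
cmd = twist_to_step(vx=0.2, wz=0.0, step_period=0.4)  # 8 cm stride
```

Note the clamping: because the gait saturates, the velocity the robot actually achieves can differ from what move_base commanded, which is one more reason the odometry should report measured rather than commanded motion.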