
Recommendations for visual-inertial-lidar odometry pipeline for legged robots

asked 2021-03-14 21:58:32 -0600 by buzzport93

updated 2021-03-14 22:04:39 -0600


I am working on a robotics project, mainly with legged robots.

Currently, my work is mostly focused on Unitree's A1 quadruped. I have explored the A1's built-in SLAM functionality, but most of it comes out of a black-box state estimator and lidar odometry. The default SLAM package uses SLAMTEC's 2D RPLIDAR, which is good enough to map a 2D environment but didn't quite suit my purpose, and I also did not like how it relies on legged odometry.

Therefore, I started a new project: a modular SLAM box. This way, state estimation no longer has to rely on leg odometry / kinematics, and since the box is modular I can attach it to any robot I want.

For the components, I decided on the following:

  1. PC: Intel NUC 10i7FNH Barebone System, paired with
     1.1. RAM: Samsung 32GB DDR4 2666MHz
     1.2. SSD: Samsung 970 PRO SSD 512GB - M.2

  2. LIDAR: Ouster OS0-32 Channel - a large vertical FOV is what I needed for legged robots, so I chose the OS0 instead of the OS1

  3. IMU: VectorNav VN-100 Rugged Dev Kit

  4. Camera: Intel D435i - I plan to use the camera mainly for object classification & semantic mapping, which is why I picked a consumer-grade device rather than an expensive one. I also remember seeing another camera lying around in MRDC, though I can't remember which model it was. We may be able to use that one if no one else is utilizing it, to save on overall cost.
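As a sanity check on the PC/SSD sizing, here is a back-of-envelope bandwidth estimate for the lidar in that list. This assumes the OS0-32 runs in a typical 1024x10 mode (1024 azimuth steps at 10 Hz) and that each return costs roughly 12 bytes; both numbers are assumptions for illustration, not quotes from the datasheet:

```python
def lidar_points_per_second(channels=32, h_res=1024, rate_hz=10):
    """Points per second for a spinning multi-beam lidar."""
    return channels * h_res * rate_hz


def lidar_bandwidth_mb_s(bytes_per_point=12, **kwargs):
    """Approximate raw point-cloud bandwidth in MB/s."""
    return lidar_points_per_second(**kwargs) * bytes_per_point / 1e6


if __name__ == "__main__":
    print(f"{lidar_points_per_second()} pts/s, "
          f"~{lidar_bandwidth_mb_s():.1f} MB/s raw")
```

At these assumed settings that is on the order of a few MB/s of raw points, so the NUC and NVMe SSD have plenty of headroom even after ROS message overhead.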

Please refrain from commenting on why Velodyne wasn't picked, why a Jetson wasn't picked, etc. The main focus is deciding which state-of-the-art pipeline to use for legged robots. I would like to use the lidar & IMU for SLAM, and the camera mostly as a tool for object classification & semantic segmentation. The camera can still be used to improve SLAM when features are sparse for the lidar, but I think lidar sensing will be fairly reliable.
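For context on what a lidar+IMU pipeline does with these two sensors: tightly-coupled LIO front ends (e.g. LIO-SAM, FAST-LIO2) buffer the high-rate IMU stream and, when each lidar scan arrives, extract exactly the IMU samples spanning that scan interval for de-skewing and preintegration. A minimal, ROS-free sketch of that buffering pattern (timestamps are plain floats here; in a real node they would come from message headers):

```python
from collections import deque


class ImuScanBuffer:
    """Accumulates IMU samples and hands out the slice spanning one
    lidar scan interval (for de-skewing / IMU preintegration)."""

    def __init__(self):
        self._buf = deque()  # (timestamp, imu_sample), time-ordered

    def push_imu(self, stamp, sample):
        """Called once per IMU message (hundreds of Hz)."""
        self._buf.append((stamp, sample))

    def slice_for_scan(self, scan_start, scan_end):
        """Return IMU samples with scan_start <= t <= scan_end,
        discarding everything older than scan_start."""
        # Samples from before this scan began are no longer needed.
        while self._buf and self._buf[0][0] < scan_start:
            self._buf.popleft()
        return [s for t, s in self._buf if t <= scan_end]
```

Any of the pipelines you evaluate will contain some variant of this; the differences are mostly in what they do with the slice afterwards (preintegration factors vs. iterated Kalman updates).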

Also, I have used MIT's Kimera-VIO with legged robots before, and the results were far from accurate. Legged robots typically have more IMU noise, since vibration is constant and large compared to the drones / aerial vehicles Kimera was designed for.
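Given that vibration issue, one cheap improvement over any pipeline's default config is to measure the actual IMU noise on the robot (motors on, robot standing still) and feed that into the estimator's noise parameters instead of the datasheet values. A rough sketch, assuming a stationary segment logged as an (N, 3) NumPy array at a known sample rate; a full Allan-variance analysis is more rigorous, but this gets the order of magnitude:

```python
import numpy as np


def white_noise_density(samples, rate_hz):
    """Per-axis white-noise density (units / sqrt(Hz)) from a
    stationary recording sampled at rate_hz.

    samples: (N, 3) array of accelerometer or gyro readings.
    """
    # Remove the constant offset per axis (gravity / turn-on bias),
    # then scale the sample std-dev by sqrt of the sample interval.
    centered = samples - samples.mean(axis=0)
    return centered.std(axis=0) / np.sqrt(rate_hz)
```

If the on-robot number comes out several times larger than the VN-100 datasheet figure, that gap is roughly how badly a default-tuned VIO pipeline will mistrust its own propagation, which is consistent with what you saw from Kimera.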

Your inputs would be appreciated, and please consider that this project will be for legged robots, not drones / wheeled robots / autonomous cars.


1 Answer


answered 2022-11-25 00:18:23 -0600

Join us, we have a small community of Quadruped users, including Unitree owners. Stop on by!

