JetBot AI navigation problem (realsense, imu, lidar)

asked 2021-03-22 06:37:22 -0600

balint.tahi

Hi,

I am working on a small project with a JetBot AI kit, including:

- Intel RealSense D435 camera (no IMU)
- MPU-9250 IMU
- YDLidar X4

I am using this package for the navigation/mapping/robot description: https://github.com/issaiass/jetbot_di...

But since that package is Gazebo-only and includes no real robot, I am using the following to control the motors:

- the diff-drive motor speed calculation from https://github.com/masato-ka/ros_jetbot
- the JetBot motor controller code to communicate with the motors
- a gamepad controller using two joysticks (one for linear speed, one for angular speed)
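For reference, the diff-drive part boils down to something like the sketch below (a minimal illustration, not the actual ros_jetbot code; the function name and the wheel separation value are my own placeholders):

```python
# Minimal differential-drive kinematics sketch: convert a Twist-style
# command (v in m/s, w in rad/s) into left/right wheel linear speeds.

WHEEL_SEPARATION = 0.1  # m, placeholder track width, not a measured JetBot value

def diff_drive(v, w, wheel_separation=WHEEL_SEPARATION):
    """Return (left, right) wheel linear speeds in m/s for a diff-drive base."""
    left = v - w * wheel_separation / 2.0
    right = v + w * wheel_separation / 2.0
    return left, right
```

Without encoders, the final mapping from these wheel speeds to PWM duty cycles is exactly the part I have to tune experimentally.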

Gamepad -> turtlesim works fine: I can control the turtle. Gamepad -> robot also works fine (like the turtle, my robot moves, turns, etc.). Some fine-tuning is probably needed so that a commanded 1 m/s linear / 1 rad/s angular matches the real speeds, but since I currently have no wheel encoders, this has to stay "experimental".

I am using the jetbot_diff_drive package for navigation (AMCL, DWA local planner). For odometry, I am using realsense2_ros together with rtabmap, plus the rtimulib_ros package for the IMU; the odometries are fused with the robot_localization package.
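Conceptually, the fusion is configured along these lines (a simplified sketch only: the parameter names follow the robot_localization documentation, but the topic names and the choice of fused fields here are illustrative, not my exact config):

```yaml
# ekf_localization_node parameters (sketch).
# Each *_config is a 15-element boolean vector in the order:
# x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az
frequency: 30
two_d_mode: true            # planar robot: ignore z, roll, pitch
odom_frame: odom
base_link_frame: base_link
world_frame: odom

odom0: /rtabmap/odom        # visual odometry (topic name is an assumption)
odom0_config: [true,  true,  false,
               false, false, true,
               false, false, false,
               false, false, false,
               false, false, false]

imu0: /imu/data             # from rtimulib_ros (topic name is an assumption)
imu0_config: [false, false, false,
              false, false, true,
              false, false, true,
              false, false, false,
              false, false, false]
```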

Basically I can create a map with Hector SLAM, and I use that map for navigation.

I have two ROS machines in the system (both running ROS Melodic):

- JetBot: RealSense, IMU, motor control, ROS master
- PC: navigation, RViz, etc.
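The two machines are connected in the usual ROS multi-master-URI way, roughly like this on the PC side (hostnames are placeholders for my actual ones):

```shell
# On the PC: point at the master running on the JetBot,
# and advertise this machine's own reachable address.
export ROS_MASTER_URI=http://jetbot:11311
export ROS_IP=$(hostname -I | awk '{print $1}')
echo "$ROS_MASTER_URI"
```

Both machines can resolve and ping each other, so I don't think basic connectivity is the problem.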

The robot description seems to be fine and tf seems to be fine, but of course I have tons of problems:

1) After I set a pose estimate, the laser scan / map matching fails.
2) Under navigation the robot does not really turn: it starts to move forward but nothing else (driving it manually works, both with the gamepad and with raw cmd_vel commands).
3) The position jumps around the map, so the planner keeps replanning, and after some time it loses tracking, I guess.
4) .....

I am a bit of a noob in robotics and autonomous robots, but I do understand ROS. Currently I am rather lost about what to check and how to proceed.

I had performance issues on the JetBot, which is why I moved everything to a separate PC and kept only the essentials on the robot (reading the sensors and controlling the motors).

The hardware itself works fine, the drivers are feeding the system with data (IMU, RealSense, lidar), and I think everything is in place in the robot description, so the transforms should be OK.

Is there anyone who could help me out?

Thanks!

Launch file on the JetBot:

<launch>
  <node pkg="jetbot_ros_camera" type="jetbot_camera" name="jetbot_camera"/>
  <node pkg="jetbot_motors" type="jetbot_robot_controller.py" name="jetbot_motors" output="screen" />
  <node pkg="jetbot_gamepad" type="gamepad.py" name="gamepad"/>
  <include file="$(find rtimulib_ros)/launch/rtimulib_ros.launch" />

  <include file="$(find realsense2_camera)/launch/rs_camera.launch">
        <arg name="align_depth"         value="true"/>
        <arg name="enable_infra"        value="false"/>
        <arg name="enable_sync"         value="true"/>
        <arg name="camera"              value="realsense_d435"/> ...