JetBot AI navigation problem (realsense, imu, lidar)
Hi,
I am working on a small project with a JetBot AI kit, including:
- Intel RealSense D435 camera (no IMU)
- MPU-9250 IMU
- YDLidar X4
I am using this package for the navigation/mapping/robot description: https://github.com/issaiass/jetbot_diff_drive
But since that package only covers Gazebo and no real robot is included, I am using the following to control the motors:
- the diff-drive motor speed calculation from https://github.com/masato-ka/ros_jetbot
- the JetBot motor controller code to communicate with the motors
- a gamepad controller using two joy axes (one for linear, one for angular velocity)
Gamepad -> turtlesim works fine; I can control the turtle. Gamepad -> robot also works fine (like the turtle, my robot moves, turns, etc.). Some fine-tuning may still be needed to match 1 m/s linear and 1 rad/s angular speed, but since I currently have no encoders on the wheels, this has to remain "experimental".
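Since there are no encoders, matching 1 m/s / 1 rad/s comes down to how cmd_vel is scaled into motor commands. A minimal open-loop sketch of the usual diff-drive mixing (the wheel separation, top speed, and function name here are placeholders, not the real JetBot or masato-ka values):

```python
# Open-loop diff-drive mixer: Twist (v, w) -> normalized motor commands.
# WHEEL_SEPARATION and MAX_WHEEL_SPEED are assumed placeholder values;
# on a real robot without encoders they must be tuned empirically.
WHEEL_SEPARATION = 0.12  # meters (assumed)
MAX_WHEEL_SPEED = 0.5    # m/s at full motor command (assumed)

def cmd_vel_to_motors(v, w):
    """Map linear v (m/s) and angular w (rad/s) to motor commands in [-1, 1]."""
    left = v - w * WHEEL_SEPARATION / 2.0   # left wheel linear speed
    right = v + w * WHEEL_SEPARATION / 2.0  # right wheel linear speed
    # Normalize by the assumed top speed and clamp to the command range.
    clamp = lambda x: max(-1.0, min(1.0, x))
    return clamp(left / MAX_WHEEL_SPEED), clamp(right / MAX_WHEEL_SPEED)
```

Without encoders, the two constants can only be calibrated experimentally, e.g. by timing a full turn in place or a measured straight run.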
I am using the jetbot_diff_drive package for navigation (AMCL, DWA local planner). For odometry I am using realsense2_camera together with the rtabmap_ros and rtimulib_ros packages; the fusion of the odometries is done with the robot_localization package.
Basically I can create a map with Hector SLAM, and I am using this map for navigation.
I have two ROS machines in the system (both running ROS Melodic):
- JetBot (RealSense, IMU, motor control, ROS master)
- PC (navigation, RViz, etc.)
The robot description seems to be fine and TF seems to be fine, but of course I have a ton of problems:
1) After I make a pose estimate, the laser scan to map matching fails.
2) The robot does not really turn; it starts to move forward but nothing else (even though I can drive it with the gamepad, i.e. via cmd_vel commands).
3) The position jumps around the map, so the planner keeps replanning, and after some time it loses tracking, I guess.
4) ...
I am a bit of a noob in robotics and autonomous robots, but I do understand ROS. Currently I am a bit lost about what to check and how to proceed.
I had performance issues on the JetBot, which is why I moved everything to a separate PC and kept only the essentials on the robot (reading the sensors and controlling the motors).
The hardware itself works fine and the drivers are feeding the system with data (IMU, RealSense, lidar). I think everything is in place in the robot description, so the transforms should be OK.
Is there anyone who could help me out?
Thanks!
Launch file on the JetBot:
<launch>
  <node pkg="jetbot_ros_camera" type="jetbot_camera" name="jetbot_camera"/>
  <node pkg="jetbot_motors" type="jetbot_robot_controller.py" name="jetbot_motors" output="screen"/>
  <node pkg="jetbot_gamepad" type="gamepad.py" name="gamepad"/>

  <include file="$(find rtimulib_ros)/launch/rtimulib_ros.launch"/>

  <include file="$(find realsense2_camera)/launch/rs_camera.launch">
    <arg name="align_depth" value="true"/>
    <arg name="enable_infra" value="false"/>
    <arg name="enable_sync" value="true"/>
    <arg name="camera" value="realsense_d435"/>
    <arg name="depth_width" value="424"/>
    <arg name="depth_height" value="240"/>
    <arg name="depth_fps" value="6"/>
    <arg name="color_width" value="424"/>
    <arg name="color_height" value="240"/>
    <arg name="color_fps" value="6"/>
    <arg name="initial_reset" value="true"/>
  </include>

  <include file="$(find ydlidar_ros)/launch/X4.launch"/>
</launch>
Launch file for the PC:
<launch>
  <arg name="map" value="/home/ubuntu/workspace/maps/map.yaml"/>

  <include file="$(find jetbot_navigation)/launch/jetbot_navigation.launch">
    <arg name="gps_enable" value="false"/>
    <arg name="ultrasonic_enable" value="false"/>
    <arg name="map_file" value="$(arg map)"/>
  </include>

  <include file="$(find rtabmap_ros)/launch/rtabmap.launch">
    <arg name="args" value="--delete_db_on_start"/>
    <arg name="frame_id" value="base_footprint"/>
    <arg name="rgb_topic" value="/realsense_d435/color/image_raw"/>
    <arg name="depth_topic" value="/realsense_d435/aligned_depth_to_color/image_raw"/>
    <arg name="camera_info_topic" value="/realsense_d435/color/camera_info"/>
    <arg name="depth_camera_info_topic" value="/realsense_d435/depth/camera_info"/>
    <arg name="rtabmapviz" value="false"/>
    <arg name="rviz" value="false"/>
  </include>

  <include file="$(find robot_localization)/launch/ukf_template.launch"/>
  <param name="/ukf_se/frequency" value="30"/>
  <param name="/ukf_se/use_control" value="false"/>
  <param name="/ukf_se/base_link_frame" value="base_footprint"/>
  <param name="/ukf_se/odom0" value="rtabmap/odom"/>
  <rosparam param="/ukf_se/odom0_config">[true, true, true,
                                          true, true, true,
                                          true, true, true,
                                          true, true, true,
                                          true, true, true]
  </rosparam>
  <param name="/ukf_se/imu0" value="imu/data"/>
  <rosparam param="/ukf_se/imu0_config">[true, true, true,
                                         true, true, true,
                                         true, true, true,
                                         true, true, true,
                                         true, true, true]
  </rosparam>
  <remap from="odometry/filtered" to="/odom"/>
</launch>
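For reference, robot_localization orders each *_config as [x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az], and its documentation advises fusing only the fields a sensor actually measures rather than all fifteen. A sketch of a more selective setup with the same parameter names (which fields to enable depends on what rtabmap and the IMU reliably report):

```xml
<!-- Sketch only: x, y, yaw (and their velocities) from visual odometry;
     orientation, angular velocity, and planar acceleration from the IMU. -->
<rosparam param="/ukf_se/odom0_config">[true,  true,  false,
                                        false, false, true,
                                        true,  true,  false,
                                        false, false, true,
                                        false, false, false]</rosparam>
<rosparam param="/ukf_se/imu0_config">[false, false, false,
                                       true,  true,  true,
                                       false, false, false,
                                       true,  true,  true,
                                       true,  true,  false]</rosparam>
```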
Asked by balint.tahi on 2021-03-22 06:37:22 UTC