
RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

asked 2019-03-04 06:26:33 -0500

eMrazSVK

updated 2019-03-05 12:24:55 -0500

Hello everyone,

I am trying to fuse visual odometry with the Pixhawk IMU. For those who don't know, Pixhawk is an autopilot, used on a drone in this case. ROS integration is provided by the mavros package, so I can get standard IMU messages from the Pixhawk. I am using only the IMU data from the Pixhawk, and I am on ROS Kinetic.

Intel already offers a camera with a built-in IMU, the D435i, and they provide an example launch file with exactly the configuration I want to use (realsense, rtabmap, robot_localization). The only difference is the IMU used. The launch file is here: opensource_tracking.launch

Since I am using a different IMU, I changed the code a little bit: look here. The node for gathering data from the Pixhawk is launched elsewhere, so it cannot be found in the launch file above.
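To give an idea of the kind of change, here is a minimal sketch based on the Madgwick filter node from opensource_tracking.launch; the node and topic names may differ in your copy, the only point is that the filter input is remapped from the camera IMU to the mavros topic:

<!-- Sketch only: feed the Pixhawk IMU (published by mavros) into the orientation
     filter instead of the camera IMU. Node and topic names are assumed from
     opensource_tracking.launch and may differ in your copy. -->
<node pkg="imu_filter_madgwick" type="imu_filter_node" name="ImuFilter">
  <param name="use_mag"     type="bool"   value="false"/>
  <param name="publish_tf"  type="bool"   value="false"/>
  <param name="world_frame" type="string" value="enu"/>
  <!-- original: <remap from="/imu/data_raw" to="/camera/imu"/> -->
  <remap from="/imu/data_raw" to="/mavros/imu/data_raw"/>
</node>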

So, to the problem. I am able to run rtabmap, create a 3D map of the environment, and localize with the visual odometry alone. I can also get IMU data from the Pixhawk when I echo its topic, and everything runs without errors. However, I am not sure the configuration is correct. I attached the Pixhawk and the RealSense camera to a plastic plate. I thought it was operating correctly, but when I detach the Pixhawk and move it around, the odometry does not change (I visualize odometry in RViz). The topic I visualize is /odometry/filtered, the output of the robot_localization package, so this odometry should be the result of fusing the IMU and visual odometry. To sum up: I am not sure the IMU affects the odometry at all. I attach rqt_graph and tf view_frames for a better understanding of what is happening.

P.S. I may have forgotten to mention something about this issue, so please ask for additional info if needed. I really appreciate any help! Thanks.

RQT_GRAPH

TF VIEW_FRAMES


Comments

@eMrazSVK: please attach your images directly to your question. I've given you sufficient karma to do that.

gvdhoorn  ( 2019-03-05 08:23:48 -0500 )

Thanks, done.

eMrazSVK  ( 2019-03-05 12:25:28 -0500 )

3 Answers


answered 2019-03-08 07:36:46 -0500

eMrazSVK

I am answering my own question.

So far, I would say it works in principle. It does not yet work the way I (or anyone) would like it to, which means it still needs tuning, but that is outside the scope of this question. So I will post the steps for running mapping and localization with the following tools:

  • Use Pixhawk's IMU (mavros package)
  • Intel RealSense D435 for Visual Odometry and RGBD data (realsense2_camera package)
  • robot_localization package for fusing IMU and visual odometry with a UKF
  • rtabmap package for creating the map and running visual odometry

Below you can find the steps needed to run all of this. I write them as single commands, each to be run in a separate terminal. I know launch files are made for this, but I just ended up with one big mess that way, so I needed to break the steps down. Here you are:

  1. Launch node for gathering data from Pixhawk: roslaunch mavros px4.launch
  2. Launch realsense node, gather data from D435 camera: roslaunch realsense2_camera rs_camera.launch align_depth:=true
  3. Launch rtabmap's rgbd_odometry node: rosrun rtabmap_ros rgbd_odometry depth/image:=/camera/aligned_depth_to_color/image_raw rgb/image:=/camera/color/image_raw rgb/camera_info:=/camera/color/camera_info _frame_id:=camera_link. This publishes the /odom topic.
  4. Launch the robot_localization node for fusing IMU and visual odometry data: roslaunch robot_localization ukf_template.launch. You need to edit ukf_template.yaml located in your_catkin_ws/src/robot_localization/params (see the sketch right after this list). I hope you all understand the parameters there; if not, please refer to the robot_localization docs.
  5. Finally, launch rtabmap itself. Here you can make various choices, e.g. whether to use rviz or rtabmapviz. Please look at the rtabmap.launch file if you need to configure the final interface. roslaunch rtabmap_ros rtabmap.launch rtabmap_args:="--delete_db_on_start" depth_topic:=/camera/aligned_depth_to_color/image_raw rgb_topic:=/camera/color/image_raw camera_info_topic:=/camera/color/camera_info odom_topic:=/odometry/filtered visual_odometry:=false. The odom_topic argument specifies which odometry to use: the fused output of robot_localization, plain visual odometry, or whatever other odometry you want. We set visual_odometry to false because we already launched it in step 3.
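
As promised in step 4, here is a minimal sketch of the relevant ukf_template.yaml entries, assuming the visual odometry arrives on /odom and the Pixhawk IMU on /mavros/imu/data. The boolean vectors are only illustrative and need tuning for your sensors:

# Sketch of ukf_template.yaml (robot_localization); topic names assumed as above.
frequency: 30
odom_frame: odom
base_link_frame: camera_link   # the camera acts as the base, see the final note below
world_frame: odom

odom0: /odom                   # visual odometry from rgbd_odometry (step 3)
odom0_config: [true,  true,  true,
               false, false, false,
               false, false, false,
               false, false, false,
               false, false, false]

imu0: /mavros/imu/data         # Pixhawk IMU via mavros (step 1)
imu0_config: [false, false, false,
              true,  true,  true,
              false, false, false,
              true,  true,  true,
              true,  true,  true]
imu0_remove_gravitational_acceleration: true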

One final piece of advice: look through all the launch and config files and set base_link to camera_link, so that your base frame is the camera frame. If you have any questions regarding this solution, just ask me.


Comments

I am going through this method to generate a 3D map with rtabmap, hoping to get better accuracy with an IMU. I am using a Raspberry Pi 4, a RealSense D435, and a 9DoF IMU. The issue is that when I use the filtered odometry from the UKF, the map gets distorted with repeated copies of the environment. When I run rtabmap without robot_localization, the map is generated without any distortion. The rtabmap visual odometry node publishes odometry at a rate of around 1 Hz. The IMU data is well calibrated and fused using a Madgwick filter for orientation estimation. So, as I understand it, this issue is due to the performance of the RPi. Can I solve it by tuning some parameters?

SahanGura  ( 2022-02-23 04:36:26 -0500 )

answered 2019-10-17 12:06:36 -0500

Hello, I am trying to do the same: D435, Pixhawk, rtabmap, using opensource_tracking.launch, just changing /camera/imu to /mavros/imu/data. There is no error, but it is as if there were no IMU. How did you resolve the problem? I did not understand your answer: what changes if you run the steps as separate commands?


answered 2019-03-04 15:22:27 -0500

matlabbe

updated 2019-03-04 15:25:14 -0500

In this launch file, the rtabmap node seems to be connected directly to the visual odometry output, ignoring the odometry output of robot_localization. I would break rtabmap.launch down so that the TF published by visual odometry can be disabled and rtabmap can be connected to the robot_localization output. So instead of:

<include file="$(find rtabmap_ros)/launch/rtabmap.launch">
    <arg name="args" value="--delete_db_on_start"/>
    <arg name="rgb_topic" value="/camera/color/image_raw"/>
    <arg name="depth_topic" value="/camera/aligned_depth_to_color/image_raw"/>
    <arg name="camera_info_topic" value="/camera/color/camera_info"/>
    <arg name="depth_camera_info_topic" value="/camera/depth/camera_info"/>
    <arg name="rtabmapviz" value="false"/>
    <arg name="rviz" value="true"/>
</include>

This would be:

<arg name="use_odom_topic" default="true"/>
<group ns="rtabmap">
  <node pkg="rtabmap_ros" type="rgbd_odometry" name="visual_odometry" output="screen">
    <remap from="rgb/image"       to="/camera/color/image_raw"/>
    <remap from="depth/image"     to="/camera/aligned_depth_to_color/image_raw"/>
    <remap from="rgb/camera_info" to="/camera/color/camera_info"/>

    <param name="frame_id"               type="string" value="camera_link"/>
    <param name="publish_tf"             type="bool"   value="false"/>
    <param name="publish_null_when_lost" type="bool"   value="false"/>
    <param name="approx_sync"            type="bool"   value="false"/> 
    <param name="Odom/ResetCountdown"    type="string" value="1"/>
  </node>

  <node name="rtabmap" pkg="rtabmap_ros" type="rtabmap" output="screen" args="--delete_db_on_start">
    <param name="frame_id"        type="string" value="camera_link"/>

    <remap from="rgb/image"       to="/camera/color/image_raw"/>
    <remap from="depth/image"     to="/camera/aligned_depth_to_color/image_raw"/>
    <remap from="rgb/camera_info" to="/camera/color/camera_info"/>

    <!-- robot_localization output --> 
    <remap if="$(arg use_odom_topic)" from="odom"  to="/odometry/filtered"/>   
    <!-- odom frame of robot_localization -->
    <param unless="$(arg use_odom_topic)" name="odom_frame_id" type="string" value="odom_combined"/> 
    <param unless="$(arg use_odom_topic)" name="odom_tf_angular_variance" type="double" value="0.005"/>
    <param unless="$(arg use_odom_topic)" name="odom_tf_linear_variance"  type="double" value="0.005"/> 
    <param unless="$(arg use_odom_topic)" name="approx_sync" type="bool"  value="false"/>    
  </node>
</group>

I added the use_odom_topic argument to switch between topic and TF for odometry. When using TF, we can manually fix the odometry covariance for convenience, in case the covariance generated by robot_localization doesn't work well with rtabmap. We can also do exact synchronization on the image topics.
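For example, if the snippet above is saved as part of your own launch file (the file name below is made up), you can switch between the two modes from the command line:

roslaunch my_mapping_setup.launch use_odom_topic:=false   # use the odom_combined TF instead of /odometry/filtered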

For the RVIZ config of rtabmap.launch:

   <node pkg="rviz" type="rviz" name="rviz" args="-d $(find rtabmap_ros)/launch/config/rgbd.rviz"/>
   <node pkg="nodelet" type="nodelet" name="points_xyzrgb" args="standalone rtabmap_ros/point_cloud_xyzrgb">
     <remap from="rgb/image"         to="/camera/color/image_raw/>
     <remap from="depth/image"       to="/camera/aligned_depth_to_color/image_raw"/>
     <remap from="rgb/camera_info"   to="/camera/color/camera_info"/>
     <remap from="cloud"             to="voxel_cloud" />

     <param name="decimation"  type="double" value="4"/>
     <param name="voxel_size"  type="double" value="0.0"/>
     <param name="approx_sync" type="bool"   value="false"/>
   </node>

cheers,
Mathieu


Comments

Hey,

It seems you are right. I had suspected that rtabmap was somehow being fed by the visual odometry only.

Now RQT_GRAPH looks like this

But still: if I visualize the /odometry/filtered and /rtabmap/odom topics as arrows in RViz, they are the same.

eMrazSVK  ( 2019-03-05 06:42:38 -0500 )

Look here. You can see they're kind of overlapping. When I move the camera, they both change in the same way. When I move the Pixhawk, nothing changes.

eMrazSVK  ( 2019-03-05 06:44:56 -0500 )

Is the ukf receiving IMU data? They may start out overlapping, but they would drift apart after some distance traveled.

matlabbe  ( 2019-03-05 11:13:22 -0500 )

How can I test whether the ukf is receiving data?

I launched the nodes and created a map of my room, then disconnected the camera and tried to move the IMU. The odometry changed a bit, but with a delay. I will do more testing and post it here. Is the odometry expected to change when I move the IMU?

eMrazSVK  ( 2019-03-05 12:22:01 -0500 )

If the ukf uses the IMU for orientation, the odometry orientation should follow the IMU almost exactly. Check with rqt_graph that the topics are correctly connected, and run rostopic hz on your IMU topic to confirm it is being published.
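For example (the topic and node names here are assumed, adjust them to your setup):

rostopic hz /mavros/imu/data          # is the IMU actually being published?
rostopic echo -n 1 /mavros/imu/data   # does the data look sane?
rosnode info /ukf_se                  # does the ukf node subscribe to it? (node name from ukf_template.launch)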

matlabbe  ( 2019-03-08 17:47:17 -0500 )

I am a newbie and am looking to do a very similar thing, using a different IMU and the Madgwick filter from the original launch file. Did this method work? Also, why couldn't you use the original launch file, just telling the RealSense not to use the camera IMU and feeding your IMU data into the filter?

Skrantzy  ( 2019-03-25 18:17:01 -0500 )

Thanks for posting this. Just curious if you ever got this working in Gazebo simulation? Also, why use robot_localization when the FCU has its own estimator? Thanks!

kiprock  ( 2021-09-30 20:46:51 -0500 )

@kiprock: I never used Gazebo; I am currently using AirSim. To your second question: because I wanted to learn about and adjust the estimator itself. I had a Pixhawk on my desk and just wanted to use ANY IMU, so I used the IMU from the Pixhawk. That's all.

eMrazSVK  ( 2021-10-04 02:40:42 -0500 )
