
eMrazSVK's profile - activity

2021-10-04 02:40:42 -0600 commented answer RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

@kiprock - I never used Gazebo. I am currently using AirSim. To your second question - because I wanted to learn and adj

2020-12-27 08:53:21 -0600 received badge  Favorite Question (source)
2020-09-25 04:04:13 -0600 received badge  Famous Question (source)
2020-09-02 02:15:25 -0600 received badge  Notable Question (source)
2020-08-29 02:27:41 -0600 received badge  Popular Question (source)
2020-08-27 05:51:57 -0600 asked a question RTAB-Map remote mapping with RealSense T265 + D435

RTAB-Map remote mapping with RealSense T265 + D435 Hello, I think the title of my question is self-explanatory. Of course

2020-04-27 14:33:16 -0600 received badge  Nice Question (source)
2019-11-15 09:49:24 -0600 received badge  Famous Question (source)
2019-10-22 21:31:07 -0600 received badge  Notable Question (source)
2019-10-22 09:02:34 -0600 commented question Octomap_server + RealSense D435 + RealSense T265

No, it's not available anywhere for now.

2019-10-22 03:12:36 -0600 marked best answer Octomap_server + RealSense D435 + RealSense T265

Hello everyone,

I would like to create a simple octomap using both the RealSense D435 and T265 cameras. I am using my own RealSense wrapper (please do not ask why; it is needed in this case). I can publish a PointCloud from the D435 (depth stream) and the pose (odometry) from the T265 as topics.

What I am currently struggling with is publishing tf. I know that octomap_server can incrementally build an occupancy grid based on the odometry it receives from tf; I just cannot figure out how to set up the frame_ids.

This is how I broadcast transform:

tf_br.sendTransform(tf::StampedTransform(transform, ros::Time::now(), "tf_t265", "map"));

The transform object is filled with exactly the same data I use when publishing the T265 pose as a topic (which works).
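For completeness, here is a minimal sketch of that broadcast in context. Note that in tf::StampedTransform the third argument is the parent frame_id and the fourth is the child frame_id, so as written above tf_t265 is the parent of map; I am not sure this is the hierarchy octomap_server expects, so the parent/child choice below is only an assumption:

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

// Sketch only: broadcast the T265 pose as the transform map -> tf_t265.
// 'transform' holds the same pose data I publish on the T265 pose topic.
void broadcastT265Pose(tf::TransformBroadcaster& tf_br, const tf::Transform& transform)
{
    // tf::StampedTransform(transform, stamp, parent frame_id, child frame_id)
    tf_br.sendTransform(
        tf::StampedTransform(transform, ros::Time::now(), "map", "tf_t265"));
}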

This is what the octomap_server part of my launch file looks like:

<node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
        <param name="resolution" value="0.05" />

        <!-- fixed map frame (set to 'map' if SLAM or localization running!) -->
        <param name="frame_id" type="string" value="tf_t265" />

        <!-- maximum range to integrate (speedup!) -->
        <param name="sensor_model/max_range" value="5.0" />

        <!-- data source to integrate (PointCloud2) -->
        <remap from="cloud_in" to="/d435/depth" />      
</node>

I suspect that the frame_id param is the key here, but I tried setting it to map and that does not work either. I may well be publishing the transform incorrectly, which is why I am asking for help. There is nothing else in my code related to tf broadcasting.

If any additional info is needed, please let me know. Thank you for any help!

Have a nice day.

2019-10-22 03:12:29 -0600 answered a question Octomap_server + RealSense D435 + RealSense T265

I'm answering my own question. The solution was to add a transform between the pointcloud frame_id and the map frame (this fact is
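For anyone hitting the same issue, a minimal sketch of the kind of transform that was missing (the cloud frame name "d435_depth" is a placeholder for whatever frame_id the /d435/depth cloud is actually stamped with, and the identity offset stands in for the real D435-to-T265 mounting offset):

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

// Sketch only: also publish a transform connecting the point cloud's own frame
// to the octomap frame (tf_t265 in my setup), so octomap_server can resolve
// cloud frame -> map frame through tf.
void broadcastCloudFrame(tf::TransformBroadcaster& tf_br)
{
    tf::Transform cam_offset;
    cam_offset.setIdentity();  // replace with the real mounting offset between the cameras
    tf_br.sendTransform(
        tf::StampedTransform(cam_offset, ros::Time::now(), "tf_t265", "d435_depth"));
}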

2019-10-22 03:12:29 -0600 received badge  Rapid Responder (source)
2019-10-22 02:54:27 -0600 commented answer Octomap_server + RealSense D435 + RealSense T265

Thanks, maybe that's the case too, but what was missing was this: the PointCloud also has its own frame and transform from p

2019-10-22 02:51:19 -0600 commented question Octomap_server + RealSense D435 + RealSense T265

Because I only need basic functionality from librealsense inside ROS. Furthermore, I am developing the application on Nvidia

2019-10-22 02:44:28 -0600 received badge  Popular Question (source)
2019-10-21 11:55:28 -0600 received badge  Student (source)
2019-10-21 03:21:15 -0600 asked a question Octomap_server + RealSense D435 + RealSense T265

Octomap_server + RealSense D435 + RealSense T265 Hello everyone, I would like to create simple octomap with use of bot

2019-06-07 20:06:22 -0600 received badge  Famous Question (source)
2019-06-07 20:06:22 -0600 received badge  Notable Question (source)
2019-05-29 04:11:23 -0600 received badge  Popular Question (source)
2019-05-28 09:24:35 -0600 edited question Cannot import custom python module in custom python pkg

Cannot import custom python module in custom python pkg Hello everyone, I followed documentation for creating ROS pack

2019-05-28 09:24:35 -0600 received badge  Editor (source)
2019-05-28 05:51:01 -0600 asked a question Cannot import custom python module in custom python pkg

Cannot import custom python module in custom python pkg Hello everyone, I followed documentation for creating ROS pack

2019-04-12 20:53:07 -0600 received badge  Famous Question (source)
2019-04-02 10:39:03 -0600 received badge  Notable Question (source)
2019-03-24 23:59:26 -0600 received badge  Popular Question (source)
2019-03-18 07:17:03 -0600 asked a question Use RTAB-Map with ORB_SLAM2 in ROS

Use RTAB-Map with ORB_SLAM2 in ROS Hello everyone, I successfully compiled RTAB-Map with Orb-Slam2, so I can now choos

2019-03-15 08:40:15 -0600 marked best answer RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

Hello everyone,

I am trying to fuse visual odometry with a Pixhawk IMU. For those who don't know, Pixhawk is an autopilot, used on a drone in this case. ROS integration is provided by the mavros package, so I can get the standard IMU message from the Pixhawk. I am only using the IMU data from the Pixhawk, and I am on ROS Kinetic.

Intel already has a camera with a built-in IMU, the D435i, and they provide an example launch file with exactly the config I want to use (realsense, rtabmap, robot_localization); the only difference is the IMU used. The launch file is here: opensource_tracking.launch

Since I am using a different IMU, I changed the code a little bit: look here. I launch the node that gathers data from the Pixhawk elsewhere, so it cannot be found in the launch file above.

Now to the problem. I am able to run rtabmap, create the 3D map of the environment, and localize with the visual odometry alone. I can get the IMU data from the Pixhawk when I echo its IMU topic, and everything runs without errors. However, I am not sure whether the config is correct. I attached the Pixhawk and the RealSense camera to a plastic plate. I thought it was operating correctly, but when I detach the Pixhawk and move it around, the odometry does not change (I visualize the odometry in RViz). The odom topic I visualize is /odometry/filtered, which is the output of the robot_localization package, so this odometry should be the result of merging the IMU and the visual odometry. To sum up: I am not sure whether the IMU affects the odometry at all. I attach the rqt_graph and tf view_frames output for a better understanding of what is happening.
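To check whether the filter is even consuming both inputs, I put together a small diagnostic sketch (the IMU topic name /mavros/imu/data is an assumption for my remapping; /odometry/filtered is the robot_localization output) that counts IMU messages and prints the filtered pose, so I can see whether it changes while only the IMU is moving:

#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <nav_msgs/Odometry.h>

// Sketch only: count IMU messages and watch the filtered pose.
int imu_count = 0;

void imuCb(const sensor_msgs::Imu&) { ++imu_count; }

void odomCb(const nav_msgs::Odometry& msg)
{
    ROS_INFO("imu msgs so far: %d | filtered pose x=%.3f y=%.3f z=%.3f",
             imu_count,
             msg.pose.pose.position.x,
             msg.pose.pose.position.y,
             msg.pose.pose.position.z);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "fusion_check");
    ros::NodeHandle nh;
    ros::Subscriber imu_sub  = nh.subscribe("/mavros/imu/data", 50, imuCb);
    ros::Subscriber odom_sub = nh.subscribe("/odometry/filtered", 10, odomCb);
    ros::spin();
    return 0;
}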

P.S. I may have forgotten to mention something about this issue, so please ask for additional info if needed. I really appreciate any help! Thanks. RQT_GRAPH

TF VIEW_FRAMES

2019-03-15 08:40:15 -0600 received badge  Scholar (source)
2019-03-12 02:13:57 -0600 received badge  Enthusiast
2019-03-11 03:35:20 -0600 received badge  Famous Question (source)
2019-03-08 07:36:46 -0600 answered a question RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

I am answering my own question. So far, I guess it is working, let's say in principle. It does not work how I, or anybo

2019-03-05 12:28:32 -0600 received badge  Notable Question (source)
2019-03-05 12:25:28 -0600 commented question RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

Thanks, done.

2019-03-05 12:24:55 -0600 edited question RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization Hello everyone, I am trying to fuse a visual odometry wit

2019-03-05 12:22:01 -0600 commented answer RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

How do I test whether the UKF is receiving data? I launched the nodes and created a map of my room, then I disconnected the camer

2019-03-05 08:05:02 -0600 commented answer RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

Look here. You can see they're kind of overlapping. When I move the camera, they both change in the same way. When I move

2019-03-05 08:05:02 -0600 commented answer RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

Hey, it seems you are right. I had it in mind that rtabmap is somehow fed by the visual odometry only. Now RQT_GRAPH

2019-03-05 06:29:37 -0600 received badge  Supporter (source)
2019-03-05 06:06:18 -0600 received badge  Popular Question (source)
2019-03-04 06:31:07 -0600 asked a question RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization

RealSense D435 + rtabmap + Pixhawk (IMU) + robot_localization Hello everyone, I am trying to fuse a visual odometry wit