
psammut's profile - activity

2022-10-25 18:58:36 -0500 edited answer declare nested parameter

I have a similar problem and opted to import these from a settings json defined by a thrift file. I can iterate over the

2022-10-25 18:58:13 -0500 answered a question declare nested parameter

I have a similar problem and opted to import these from a settings json defined by a thrift file. I can iterate over the
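
Roughly the idea, as a minimal rclpy sketch (the settings.json path and the flatten helper here are placeholders, not my actual thrift-generated file):

    import json

    import rclpy
    from rclpy.node import Node


    def flatten(prefix, obj, out):
        """Recursively turn {'a': {'b': 1}} into {'a.b': 1}."""
        for key, value in obj.items():
            name = f"{prefix}.{key}" if prefix else key
            if isinstance(value, dict):
                flatten(name, value, out)
            else:
                out[name] = value


    class SettingsNode(Node):
        def __init__(self):
            super().__init__('settings_node')
            # Hypothetical settings file; in my case it is generated from a thrift definition.
            with open('settings.json') as f:
                settings = json.load(f)
            params = {}
            flatten('', settings, params)
            # Declare every leaf as a parameter with a dotted ("nested") name.
            for name, value in params.items():
                self.declare_parameter(name, value)


    def main():
        rclpy.init()
        node = SettingsNode()
        rclpy.spin(node)
        rclpy.shutdown()


    if __name__ == '__main__':
        main()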

2022-08-09 16:55:23 -0500 received badge  Critic (source)
2022-06-28 18:41:46 -0500 commented answer Colcon test not running gtests

I just lost 2 hours of my life because of this. How did you learn about this?

2022-06-01 19:39:08 -0500 commented answer Linking OpenCV library (ROS2, C++)

How did you know to add OpenCV? Are there docs on this?

2022-05-12 18:53:43 -0500 commented question ROS2 static compile

Did the proposed answer by Dirk Thomas work for you?

2022-05-12 18:53:10 -0500 answered a question ROS2 static compile

Hi! Were you able to do this?

2021-10-12 17:58:21 -0500 marked best answer rtabmap issue with loop closure

Hello, I have an rtabmap setup with:

  • Stereo camera for mapping with rtab
  • Wheel odometry only

In simulation and also in real life I am getting incorrect loop closures that seem to mess up an otherwise good map. Is there anything obvious I'm doing wrong here?

Here is my rtabmap launch file:

    <node name="rtabmap" pkg="rtabmap_ros" type="rtabmap" output="screen"
          args="--delete_db_on_start">

        <remap from="left/image_rect"   to="/stereo1/left/image_rect_color"/>
        <remap from="right/image_rect"  to="/stereo1/right/image_rect"/>
        <remap from="left/camera_info"  to="/stereo1/left/camera_info"/>
        <remap from="right/camera_info" to="/stereo1/right/camera_info"/>

        <param name="subscribe_stereo"      type="bool"     value="true"/>
        <param name="subscribe_depth"       type="bool"     value="false"/>
        <param name="approx_sync"           type="bool"     value="true"/>
        <param name="frame_id"              type="string"   value="base_link"/>
        <param name="grid_size"             type="double"   value="100"/>

        <param name="queue_size"            type="int"      value="30"/>

        <!-- all points below 20 cm are ground -->
        <param name="Grid/MaxGroundHeight"  type="string"   value="0.5"/>

        <!-- all points above 20 cm and below 2 m are obstacles -->
        <!-- <param name="Grid/MaxObstacleHeight" type="string" value="3"/> -->

        <!-- Use simple passthrough to filter the ground instead of normals segmentation -->
        <!-- <param name="Grid/NormalsSegmentation" type="string" value="false"/> -->

        <!-- RTAB-Map's parameters -->
        <param name="Rtabmap/TimeThr"                   type="string"   value="700"/>
        <param name="Rtabmap/DetectionRate"             type="string"   value="1"/>
        <param name="Kp/MaxFeatures"                    type="string"   value="200"/>
        <param name="Kp/RoiRatios"                      type="string"   value="0.03 0.03 0.04 0.04"/>
        <param name="Kp/DetectorStrategy"               type="string"   value="0"/>    <!-- use SURF -->
        <param name="Kp/NNStrategy"                     type="string"   value="1"/>    <!-- kdTree -->
        <param name="SURF/HessianThreshold"             type="string"   value="1000"/>
        <param name="RGBD/OptimizeFromGraphEnd"         type="string"   value="true"/>
        <param name="RGBD/LoopClosureReextractFeatures" type="string"   value="true"/>
        <param name="Grid/MaxGroundHeight"              type="double"   value="0.18"/>

        <param name="Vis/MinInliers"                    type="string"   value="10"/>
        <param name="Vis/InlierDistance"                type="string"   value="0.1"/>
        <param name="Vis/EstimationType"                type="int"      value="1"/>    <!-- 3D->2D (PnP) -->
        <param name="Vis/MaxFeatures"                   type="int"      value="1000"/>

        <!-- http://bit.ly/2wdXVtj -->

        <!-- Adding a despeckler -->
        <param name="Stereo/WinWidth"                   type="int"      value="15"/>
        <param name="Stereo/WinHeight"                  type="int"      value="3"/>
        <param name="Stereo/Iterations"                 type="int"      value="30"/>
        <param name="Stereo/MaxLevel"                   type="int"      value="3"/>
        <param name="Stereo/MinDisparity"               type="int"      value="1"/>
        <param name="Stereo/MaxDisparity"               type="int"      value="128"/>
        <param name="Stereo/OpticalFlow"                type="bool"     value="false"/>
        <param name="Stereo/SSD"                        type="bool"     value="true"/>
        <param name="Stereo/Eps"                        type="double"   value="0.01"/>

        <param name="StereoBM/BlockSize"                type="int"      value="15"/>
        <param name="StereoBM/MinDisparity"             type="int"      value="0"/>
        <param name="StereoBM/NumDisparities"           type="int"      value="128"/>
        <param name="StereoBM/PreFilterSize"            type="int"      value="9"/>
        <param name="StereoBM/PreFilterCap"             type="int"      value="31"/>
        <param name="StereoBM/UniquenessRatio"          type="int"      value="15"/>
        <param name="StereoBM/TextureThreshold"         type="int"      value="2500"/>
        <param name="StereoBM/SpeckleWindowSize"        type="int"      value="1000"/>
        <param name="StereoBM/SpeckleRange"             type="int"      value="4"/>
    </node>

A video of the problem: https://youtu.be/tjO-dYugKbw

PS: This might be a simple issue having to do with my simulation, because the ground is very ... (more)

2021-01-13 05:59:25 -0500 received badge  Nice Answer (source)
2020-05-21 04:04:07 -0500 received badge  Enlightened (source)
2020-05-21 04:04:07 -0500 received badge  Good Answer (source)
2019-08-07 03:44:09 -0500 received badge  Famous Question (source)
2019-05-20 01:09:16 -0500 marked best answer Reverse recovery behavior for stuck non circular robots

Hello, I have a robot that keeps getting stuck. In a lot of situations, a simple reverse maneuver would unstick it. The rotate recovery behavior doesn't work well because my robot's turning center is offset from its body. Before I get to writing one from scratch, is there any reversing-type recovery behavior I could get move_base to use?

image description

Thanks!
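For reference, this is roughly the motion I have in mind, as a standalone sketch (a simple timed reverse on cmd_vel, not an actual move_base recovery plugin; topic name, speed, and duration are made up):

    #!/usr/bin/env python
    import rospy
    from geometry_msgs.msg import Twist


    def reverse(duration=2.0, speed=-0.2):
        """Drive straight backwards for a fixed time, then stop."""
        pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
        rospy.sleep(0.5)  # give the publisher time to connect
        cmd = Twist()
        cmd.linear.x = speed
        rate = rospy.Rate(10)
        end = rospy.Time.now() + rospy.Duration(duration)
        while not rospy.is_shutdown() and rospy.Time.now() < end:
            pub.publish(cmd)
            rate.sleep()
        pub.publish(Twist())  # stop


    if __name__ == '__main__':
        rospy.init_node('reverse_recovery_sketch')
        reverse()

Wiring something like this into move_base would still require wrapping it as a nav_core recovery plugin, which is the part I was hoping to avoid writing from scratch.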

2019-04-12 14:13:24 -0500 received badge  Nice Question (source)
2019-03-13 08:24:48 -0500 marked best answer RTABMAP with two stereo cameras

Hello, first of all thanks to Mathieu Labbe for all his hard work on rtabmap and for bringing it to ROS!

I would like to use two stereo cameras to aid in SLAM and also in path planning. I have been experimenting with one camera and I found that if I point it forward, SLAM works well but the path planner doesn't, because it is guessing about obstacles at the feet of the robot. If I point the camera down, path planning works well but mapping suffers.

I'd like to use two stereo cameras to increase the path planner's obstacle detection accuracy and maintain good map-building ability.

image description

The question is: what is the best way of doing this? rtabmap seems to be written to subscribe to only one stereo camera. Should I combine the point clouds generated by stereo_image_proc?

Also, I'm unsure how I should use my wheel odometry in this setup. It looks like rtab can only use one odometry source. Should I just do visual odometry, if it works well enough, and pipe that into rtab? I'm experimenting right now with wheel odometry data and one stereo camera with rtab, and I'm slightly unsure how I should connect these systems together.

Thanks!

2019-03-01 12:45:35 -0500 marked best answer robot_localization IMU coordinate frame.

My question is: when using the robot_localization package, should the IMU chirality obey the standard right-hand rule or the left-hand rule? I think there is a discrepancy in the docs.

From the docs of the robot_localization package a section about IMU data states the following:

  • Adherence to specifications: As with odometry, be sure your data adheres to REP-103 and the sensor_msgs/Imu specification.
  • Acceleration: Be careful with acceleration data. The state estimation nodes in robot_localization assume that an IMU that is placed in its neutral right-side-up position on a flat surface will:

    • Measure +9.81 meters per second squared for the Z axis.
    • If the sensor is rolled +90 degrees (left side up), the acceleration should be +9.81 meters per second squared for the Y axis.
    • If the sensor is pitched +90 degrees (front side down), it should read -9.81 meters per second squared for the X axis.

Problem: this documentation says two things. It says the data should adhere to REP-103, which says "All systems are right handed. This means they comply with the right hand rule." Then it describes an example that could only work in a left-handed system. Am I missing something?

image description
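For concreteness, this is the flat, at-rest case I read the docs as describing, as a small rospy sketch of the sensor_msgs/Imu message (the frame name and topic are made up):

    #!/usr/bin/env python
    import rospy
    from sensor_msgs.msg import Imu


    def make_flat_at_rest_imu():
        """IMU sitting flat and right-side-up, as described in the docs quoted above."""
        msg = Imu()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'imu_link'  # made-up frame name
        # At rest on a flat surface the accelerometer reports the reaction to
        # gravity, i.e. +9.81 m/s^2 along +Z.
        msg.linear_acceleration.x = 0.0
        msg.linear_acceleration.y = 0.0
        msg.linear_acceleration.z = 9.81
        return msg


    if __name__ == '__main__':
        rospy.init_node('imu_convention_example')
        pub = rospy.Publisher('imu/data', Imu, queue_size=1)
        rospy.sleep(0.5)
        pub.publish(make_flat_at_rest_imu())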

2018-10-25 04:43:43 -0500 received badge  Nice Answer (source)
2018-08-29 04:50:40 -0500 received badge  Nice Answer (source)
2018-06-26 16:39:18 -0500 received badge  Famous Question (source)
2018-05-31 11:37:09 -0500 answered a question white balance problem for USB camera

This is a camera problem and not a ROS driver issue. You can confirm this by using a dedicated viewer such as guvcview o

2018-05-30 15:45:01 -0500 commented question white balance problem for USB camera

Hi! The first step when debugging this is to make sure that the issue lies with the ROS/libuvc driver and not the camera. To do this,

2018-04-16 19:42:59 -0500 commented question How to install opencv-3.3.1 instead of opencv-3.3.1-dev kinetic

I'm also running into this issue. All my source that uses opencv3 is no longer compiling!

2018-04-07 15:09:48 -0500 received badge  Famous Question (source)
2018-04-05 07:45:03 -0500 received badge  Notable Question (source)
2018-01-24 10:19:32 -0500 commented answer Obstacles remain in costmap despite observation_persistence being set

@Felix Widmaier did you ever get this to work for you? Mine still leaves phantom obstacles. :(

2018-01-24 04:06:47 -0500 received badge  Famous Question (source)
2018-01-23 19:49:01 -0500 commented answer Obstacles remain in costmap despite observation_persistence being set

Oh man, this is awesome. Thanks David Lu!!

2018-01-22 17:06:18 -0500 commented question Meaning of costmap_2d observation_persistence parameter?

Yup. I also have the same exact question. This parameter is broken. Setting it to 0 does not only keep the most rec

2017-11-25 00:46:31 -0500 received badge  Notable Question (source)
2017-11-07 04:21:08 -0500 received badge  Popular Question (source)
2017-11-06 16:14:32 -0500 edited question robot_localization IMU coordinate frame.

robot_localization IMU coordinate frame. My question is: When using the robot_localization package, should the IMU chira

2017-11-06 16:13:59 -0500 asked a question robot_localization IMU coordinate frame.

robot_localization IMU coordinate frame. My question is: Should the IMU chirality obey the standard right hand rule, or

2017-11-02 09:55:55 -0500 received badge  Notable Question (source)
2017-11-01 06:14:10 -0500 received badge  Nice Answer (source)
2017-10-31 20:59:33 -0500 commented question Is this an accurate explanation of a covariance matrix?

I found it helpful 5 years later :)

2017-10-31 09:54:18 -0500 commented question Linux configuration required to run ros extensively

Are you going to be using a virtual machine or real install? A real install is recommended because Rviz and Gazebo will

2017-10-30 08:29:25 -0500 commented answer Export matlab monocular camera calibration parameters to ROS

How do you calculate the projection_matrix from the rectification matrix? I don't know :( You should really look into doing

2017-10-30 08:28:37 -0500 commented answer Export matlab monocular camera calibration parameters to ROS

camera_name is simply the frame_id of your camera if you defined it in the URDF file of your robot. This allows you to a

2017-10-28 15:31:40 -0500 edited answer Export matlab monocular camera calibration parameters to ROS

A MATLAB monocular camera calibration is stored in the format of a cameraParameters object. In ROS, the calibration pa

2017-10-28 12:50:39 -0500 commented question Steps-How to Start with Navigation Stack

I highly recommend these tutorials on youtube: https://www.youtube.com/watch?v=8ckSl4MbZLg&list=PLTEmcYHVE7dPWixFnzk

2017-10-28 12:47:41 -0500 answered a question Debugging ROS nodes in c++

What you can do is download the source of the packages in question, put them in your catkin workspace and compile them,

2017-10-28 12:39:07 -0500 answered a question Export matlab monocular camera calibration parameters to ROS

A MATLAB monocular camera calibration is stored in the format of a cameraParameters object. In ROS, the calibration pa
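For example, something along these lines maps the main cameraParameters fields into a ROS-style calibration YAML (a sketch only: I'm assuming the plumb_bob model, no skew, and values copied out of MATLAB by hand, so the numbers below are made up):

    import numpy as np
    import yaml

    # Values copied by hand from a MATLAB cameraParameters object (made-up numbers).
    # Note: MATLAB's IntrinsicMatrix is the transpose of the usual K.
    K = np.array([[700.0,   0.0, 320.0],
                  [  0.0, 700.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    k1, k2 = -0.20, 0.05        # RadialDistortion
    p1, p2 = 0.001, -0.0005     # TangentialDistortion
    width, height = 640, 480

    calib = {
        'image_width': width,
        'image_height': height,
        'camera_name': 'my_camera',
        'camera_matrix': {'rows': 3, 'cols': 3,
                          'data': K.flatten().tolist()},
        'distortion_model': 'plumb_bob',
        'distortion_coefficients': {'rows': 1, 'cols': 5,
                                    'data': [k1, k2, p1, p2, 0.0]},
        'rectification_matrix': {'rows': 3, 'cols': 3,
                                 'data': np.eye(3).flatten().tolist()},
        # For a monocular camera, P is just K with a zero fourth column.
        'projection_matrix': {'rows': 3, 'cols': 4,
                              'data': np.hstack([K, np.zeros((3, 1))]).flatten().tolist()},
    }

    with open('my_camera.yaml', 'w') as f:
        yaml.safe_dump(calib, f)

A file in this form can then be pointed to by the driver's camera_info_url parameter so it gets published alongside the images.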

2017-10-27 13:17:51 -0500 commented question Export matlab monocular camera calibration parameters to ROS

Can you post a sample .mat cal file? A .yaml camera calibration file is nothing special, and simply contains the same in

2017-10-27 13:17:36 -0500 commented question Export matlab monocular camera calibration parameters to ROS

Can you post a sample .mat cal file? A .yaml camera calibration file is nothing special, and simply contains the same in

2017-10-27 13:17:25 -0500 commented question Export matlab monocular camera calibration parameters to ROS

Can you post a sample .mat file? A .yaml camera calibration file is nothing special, and simply contains the same inform