jorgemia's profile - activity

2023-06-20 13:15:45 -0500 marked best answer How to structure project - Cmd vel Mux vs State machine?

I'm making an autonomous mobile robot that will have various features that can be activated with different buttons. For example, one button should trigger wall following whilst another button will trigger person following.

What is the standard approach used in ROS to achieve something like this? Should I use a cmd_vel mux node which receives messages from a wall-following node and a person-following node and outputs commands according to the button pressed? Or should I use a state machine that tells the robot, when this button is pressed, to go into the "wall following" state?
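
For reference, one concrete version of the mux option is the existing twist_mux package, which arbitrates several geometry_msgs/Twist sources by priority and optional lock topics. A hedged config sketch (the topic names and priorities below are placeholders, not a recommendation):

topics:
- name    : wall_following
  topic   : cmd_vel/wall_follow
  timeout : 0.5
  priority: 10
- name    : person_following
  topic   : cmd_vel/person_follow
  timeout : 0.5
  priority: 20
locks:
- name    : stop_button
  topic   : e_stop
  timeout : 0.0
  priority: 255

A small state machine (even just the button handler) can then enable or disable the behaviour nodes, or toggle the lock topics, so the two approaches are not mutually exclusive.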

2022-11-11 08:44:12 -0500 commented question Is it possible to use multiple inflation layers in a local costmap, that is, one inflation per sensor?

Were you able to get this working in the end?

2022-07-19 06:22:52 -0500 commented answer ros2 launch using IncludeLaunchDescription and remapping topics

Thanks @shonigmann I found an exact example of what I was trying to do here: https://github.com/gnaur/simbot/blob/4d9bb0

2022-07-18 05:00:14 -0500 commented answer ros2 launch using IncludeLaunchDescription and remapping topics

Is there any more guidance around this? Some more documentation would be great. Struggling to remap a topic from another

2021-09-14 10:04:28 -0500 received badge  Famous Question (source)
2021-08-17 17:03:40 -0500 received badge  Great Question (source)
2021-05-19 10:41:07 -0500 received badge  Notable Question (source)
2021-05-13 04:30:33 -0500 received badge  Famous Question (source)
2021-04-23 11:57:21 -0500 received badge  Notable Question (source)
2021-04-15 07:33:10 -0500 received badge  Popular Question (source)
2021-04-15 04:54:45 -0500 commented answer Change subscribed topic/subscriber based on condition during runtime

I am going to try and pass the nodehandle as a parameter to the service using boost::bind and see if that works: This i
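
For the record, a hedged sketch of that idea (the std_srvs/SetBool service, its name, and the camera topics are placeholders; the explicit template arguments are needed because advertiseService cannot deduce the request/response types from a boost::bind expression):

#include <ros/ros.h>
#include <boost/bind.hpp>
#include <boost/ref.hpp>
#include <sensor_msgs/Image.h>
#include <std_srvs/SetBool.h>

void imageCb(const sensor_msgs::Image::ConstPtr& msg)
{
  // process whichever camera is currently subscribed
}

// Extra arguments (the NodeHandle and the Subscriber to swap) are bound in at advertise time.
bool switchCameraCb(std_srvs::SetBool::Request& req, std_srvs::SetBool::Response& res,
                    ros::NodeHandle& nh, ros::Subscriber& sub)
{
  sub.shutdown();  // drop the old subscription
  const std::string topic = req.data ? "/camera_front/image_raw" : "/camera_back/image_raw";
  sub = nh.subscribe(topic, 1, imageCb);  // re-subscribe at runtime
  res.success = true;
  return true;
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "camera_switcher");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("/camera_front/image_raw", 1, imageCb);

  ros::ServiceServer srv =
      nh.advertiseService<std_srvs::SetBool::Request, std_srvs::SetBool::Response>(
          "switch_camera", boost::bind(&switchCameraCb, _1, _2, boost::ref(nh), boost::ref(sub)));

  ros::spin();
  return 0;
}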

2021-04-14 14:08:51 -0500 commented answer Change subscribed topic/subscriber based on condition during runtime

Thanks for the answer. I already have a node handle that initialises some publishers/subscribers and gets parameters in

2021-04-13 10:33:47 -0500 edited question Change subscribed topic/subscriber based on condition during runtime

Change subscribed topic/subscriber based on condition during runtime I have a robot with a front and back camera. I've g

2021-04-13 10:30:57 -0500 asked a question Change subscribed topic/subscriber based on condition during runtime

Change subscribed topic/subscriber based on condition during runtime I have a robot with a front and back camera. I've g

2021-04-06 03:22:02 -0500 received badge  Famous Question (source)
2021-04-06 03:22:02 -0500 received badge  Notable Question (source)
2021-03-11 05:55:38 -0500 commented question How should I add semantic segmentation data to my costmap?

Did you end up implementing the semantic point clouds into your costmaps somehow? I saw on your gudrun repo it seems you

2021-03-05 09:22:49 -0500 received badge  Famous Question (source)
2021-02-19 07:23:36 -0500 received badge  Famous Question (source)
2021-01-10 15:53:35 -0500 received badge  Good Question (source)
2021-01-10 15:53:00 -0500 received badge  Famous Question (source)
2020-12-24 23:51:23 -0500 received badge  Favorite Question (source)
2020-12-24 23:44:58 -0500 received badge  Nice Question (source)
2020-12-09 07:35:14 -0500 received badge  Notable Question (source)
2020-12-04 09:26:39 -0500 commented answer What is ConstPtr&?

I wrote this short program to try and understand how it all works: // Example program #include <iostream> struct

2020-12-03 14:36:52 -0500 commented answer What is ConstPtr&?

Thanks for the detailed answer! Very useful! Yes, I just spent the last hour learning about pointers and references but

2020-12-03 13:38:46 -0500 commented answer What is ConstPtr&?

In someone's code I've seen: void odomCallback(const nav_msgs::Odometry::ConstPtr &msg) Does it matter if the &am

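Both signatures come up in example code; a hedged, self-contained sketch (the "odom" topic name is just a placeholder) showing that the & only changes how the shared_ptr itself is passed, while the underlying message is never copied either way:

#include <ros/ros.h>
#include <nav_msgs/Odometry.h>

// nav_msgs::Odometry::ConstPtr is a typedef for boost::shared_ptr<const nav_msgs::Odometry>.

// By const reference: neither the pointer nor the message is copied.
void odomCallbackRef(const nav_msgs::Odometry::ConstPtr& msg)
{
  ROS_INFO("x = %f", msg->pose.pose.position.x);
}

// By value: the shared_ptr (not the message) is copied, costing one atomic
// reference-count increment. Functionally identical to the version above.
void odomCallbackByValue(nav_msgs::Odometry::ConstPtr msg)
{
  ROS_INFO("x = %f", msg->pose.pose.position.x);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "odom_listener");
  ros::NodeHandle nh;
  ros::Subscriber a = nh.subscribe("odom", 10, odomCallbackRef);
  ros::Subscriber b = nh.subscribe("odom", 10, odomCallbackByValue);
  ros::spin();
  return 0;
}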

2020-12-03 08:09:01 -0500 commented answer How to run 2 instances of robot_localization to compare them in Rviz?

But how would I be able to visually compare the performance of the two robot localization instances? Don't they have to

2020-12-03 08:00:25 -0500 received badge  Notable Question (source)
2020-12-02 17:28:33 -0500 commented answer How to run 2 instances of robot_localization to compare them in Rviz?

Thanks for the answer but by doing that my first instance is going to provide the base_link to odom transform whilst the

2020-12-02 17:23:07 -0500 received badge  Popular Question (source)
2020-12-02 03:14:17 -0500 asked a question How to run 2 instances of robot_localization to compare them in Rviz?

How to run 2 instances of robot_localization to compare them in Rviz? I've got two instances of robot_localization with
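
A hedged sketch of one way to set this up (publish_tf is a real robot_localization parameter; the node names and input topics below are placeholders): run both EKF instances on the same inputs, let only one of them broadcast the odom -> base_link transform, and remap each node's odometry/filtered output to a distinct topic so both can be added as Odometry displays in RViz.

ekf_primary:
  frequency: 30
  publish_tf: true        # this instance owns the odom -> base_link transform
  odom0: /wheel/odom
  imu0: /imu/data
  # ... per-sensor _config matrices as usual ...

ekf_compare:
  frequency: 30
  publish_tf: false       # the comparison instance must not also broadcast the TF
  odom0: /wheel/odom
  imu0: /imu/data
  # ... the alternative settings being evaluated ...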

2020-11-23 09:27:52 -0500 commented question help with Rtabmap_ros Odometry

I think you might be able to choose the algorithm that is being used eg. GFTT, SURF, SIFT, FAST, ORB... to achieve bette

2020-11-23 09:22:20 -0500 received badge  Popular Question (source)
2020-11-20 03:29:13 -0500 asked a question Any good navigation package/path planner for outdoor navigation on 3D/2d-manifold uneven terrain with a skid steer robot?

Any good navigation package/path planner for outdoor navigation on 3D/2d-manifold uneven terrain with a skid steer robot

2020-11-17 03:14:09 -0500 commented question move_base in uneven terrain (slopes)

@stfn @Kiwa21 did you find any path planners/navigation solutions for outdoor rough terrain?

2020-11-16 06:27:59 -0500 commented answer Navigation2 support for 3D navigation (Drones)

@stevemacenski does navigation 2 already support navigation/path planning in outdoor/3D terrain for wheeled ground robot

2020-11-05 13:31:22 -0500 received badge  Famous Question (source)
2020-11-03 12:12:36 -0500 received badge  Popular Question (source)
2020-11-03 11:08:45 -0500 commented answer How to set up robot_localization config when using differential: true

Ok, I think that makes sense because you're basically telling r_l which values to use for the differential parameter rig

2020-11-02 10:35:22 -0500 received badge  Notable Question (source)
2020-11-02 09:03:39 -0500 asked a question How to set up robot_localization config when using differential: true

How to set up robot_localization config when using differential: true I was reading the robot_localization docs and it s
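
A hedged sketch of where the per-sensor flag lives (odom0_differential and imu0_differential are the actual robot_localization parameter names; the topics and fused variables below are placeholders):

ekf_se:
  frequency: 30
  odom0: /wheel/odom
  # Keep the absolute pose variables you want fused set to true here; with
  # differential enabled, r_l differences successive measurements and fuses
  # them as velocities instead of absolute poses.
  odom0_config: [true,  true,  false,
                 false, false, true,
                 false, false, false,
                 false, false, false,
                 false, false, false]
  odom0_differential: true

  imu0: /imu/data
  imu0_differential: false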

2020-10-23 05:00:12 -0500 commented question Robot gets problem when the goal is set behind

This question might be useful #210914. From one of the answers, "DWA cannot perform a rotate in spot motion. It will alw

2020-10-23 04:47:18 -0500 marked best answer Jetson Nano comes with OpenCV 4.1.1., do I need to downgrade to 3.2. for melodic?

I just got a Jetson Nano running Ubuntu 18.04 and it comes with OpenCV 4.1.1 pre-installed. I've read that ROS Melodic is meant to work with OpenCV 3.2, and I'm getting some catkin_make errors due to a conflict between the versions, for example:

usr/bin/ld: warning: libopencv_imgcodecs.so.3.2, needed by /opt/ros/melodic/lib/libcv_bridge.so, may conflict with libopencv_imgcodecs.so.4.1

Should I downgrade my system OpenCV to 3.2?

2020-10-23 04:46:29 -0500 marked best answer diff drive controller giving wrong odometry data (radius and separation multipliers)

I'm working on a 4WD skid-steer robot which uses ros_control and diff_drive_controller.

If I set my wheel radius multiplier to 1.0, the robot seems to go extremely slowly (it doesn't seem to move at the speed I command) and the odometry data seems off. When checking the odometry topic published by the diff drive controller, if I move the robot forward 60 cm (checked with a ruler), the controller thinks the robot has moved over 2m...

If I change my radius multiplier to 0.195, the position data of the odometry seems more accurate, reflecting my real-life measurements. With this multiplier, however, when I drive at very high speed and look at the odom TF in Rviz, the robot suddenly starts going backwards (in Rviz and in the odom topic as well, even though the actual robot is moving forward). This doesn't happen when the multiplier is set to 1.0, though.

What could possibly be happening? I understand that I have to play with the wheel separation multiplier because of the skid-steer nature of the robot versus a true differential drive, but shouldn't the radius multiplier stay at 1? I have checked my URDF and hardware interface and don't think they're the issue, but is there anything I should look for?
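
As a rough sanity check (assuming the integrated odometry scales linearly with wheel_radius * wheel_radius_multiplier, which is how diff_drive_controller computes displacement): reporting roughly 2 m for a measured 0.6 m at a multiplier of 1.0 would suggest an effective multiplier somewhere around 0.6 / 2.0 ≈ 0.3, i.e. the encoder feedback or the configured wheel_radius being off by a similar factor, rather than the multiplier genuinely needing to differ from 1.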

Here's my diff drive config file:

robot_joint_publisher:
  type: "joint_state_controller/JointStateController"
  publish_rate: 100

robot_velocity_controller:
  type: "diff_drive_controller/DiffDriveController"
  left_wheel: ['front_left_wheel', 'rear_left_wheel']
  right_wheel: ['front_right_wheel', 'rear_right_wheel']
  publish_rate: 100
  pose_covariance_diagonal: [0.001, 0.001, 0.001, 0.001, 0.001, 0.03] #Need to define?
  twist_covariance_diagonal: [0.001, 0.001, 0.001, 0.001, 0.001, 0.03] #Need to define?
  cmd_vel_timeout: 0.25
  velocity_rolling_window_size: 2

  wheel_separation : 0.42 #Distance between left and right
  wheel_radius : 0.1651

  # Base frame_id
  base_frame_id: base_link

  # Odometry fused with IMU is published by robot_localization, so
  # no need to publish a TF based on encoders alone.
  enable_odom_tf: true

  # Navvy hardware provides wheel velocities
  estimate_velocity_from_position: false

  # Wheel separation and radius multipliers
  wheel_separation_multiplier: 2.0 # husky 1.875 - Will need to adjust based on odom readings
  wheel_radius_multiplier    : 0.195 # default: 1.0

  # Velocity and acceleration limits - To define
  # Whenever a min_* is unspecified, default to -max_*
  linear:
    x:
      has_velocity_limits    : true
      max_velocity           : 3.0   # m/s
      has_acceleration_limits: true
      max_acceleration       : 3.0   # m/s^2
  angular:
    z:
      has_velocity_limits    : true
      max_velocity           : 9.0   # rad/s
      has_acceleration_limits: true
      max_acceleration       : 6.0   # rad/s^2

2020-10-23 04:46:29 -0500 received badge  Scholar (source)