triantatwo's profile - activity

2022-08-04 06:58:09 -0500 received badge  Stellar Question (source)
2021-10-04 13:38:18 -0500 marked best answer Can I localize with imprecise/noisy sonar sensor?

I have a small differential drive robot with a single sonar sensor mounted on the front of it. I also have an IMU and GPS on the vehicle. My goal is to navigate from one point to another in an unknown environment and avoid obstacles along the way.

At the moment we're testing indoors, so the GPS is out of the picture. I want to be able to send a waypoint via rviz (simulating a GPS coordinate waypoint) and have the vehicle navigate there.

I don't believe mapping is necessary for my application, but I think localization is - otherwise, how would the robot know it has arrived at its destination? I plan on using robot_localization to fuse my IMU and odometry data into a pose estimate. But won't that estimate drift over time?

Does ROS have any packages that will enable my robot to correct for odometry drift and adjust itself in the map frame, using sonar data? What I've seen used so far is support for LIDAR and LaserScan messages along with gmapping or hector_slam, but I don't have that hardware.
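To show where I am so far, here's the kind of ekf_localization_node configuration I had in mind for the fusion step. The topic names (/wheel_odom, /imu/data) and the choice of which fields to fuse are just my guesses for illustration, not a setup I know to be correct:

```yaml
# Sketch of an ekf_localization_node config; topic names are placeholders.
frequency: 30
two_d_mode: true

odom_frame: odom
base_link_frame: base_link
world_frame: odom          # publish odom -> base_link (continuous, but drifts)

odom0: /wheel_odom
odom0_config: [false, false, false,   # x, y, z positions: ignore (they drift)
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # x, y velocities: fuse
               false, false, true,    # yaw velocity: fuse
               false, false, false]   # accelerations

imu0: /imu/data
imu0_config: [false, false, false,
              false, false, true,     # yaw orientation: fuse
              false, false, false,
              false, false, true,     # yaw velocity: fuse
              true,  false, false]    # x acceleration: fuse
```

As I understand it, this only gives me the odom -> base_link estimate; my question is about what supplies the map-frame correction on top of it.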

2021-04-06 07:21:21 -0500 received badge  Good Question (source)
2019-07-05 10:10:07 -0500 received badge  Nice Question (source)
2019-06-27 09:06:13 -0500 marked best answer range_sensor_layer marks max sonar range as obstacle?

I'm using Gazebo to simulate a 4-wheeled differential drive robot. The robot has a forward sonar sensor, so I added a simulated sonar sensor.

The sonar sensor appears to work; it detects obstacles and the Range messages look correct (e.g. max range value when no obstacles). The visualization in Gazebo also looks correct.

I use the range_sensor_layer package as a plugin for costmap_2d. My issue is that when there is no obstacle and the sonar reads max range, the costmap registers an obstacle anyway.

Below is a screenshot of Gazebo (left), rviz (top right), and the echoed Range message (bottom right). I rotated the vehicle in a circle without any obstacles, yet the costmap shows the robot surrounded by obstacles.

[Screenshot: Gazebo world, rviz costmap, and rostopic echo of the Range message, showing a ring of phantom obstacles around the robot]

Now range_sensor_layer has a parameter called clear_on_max_reading, which clears obstacles along the ray when a reading is at max range. However, I've found that this does more harm than good, because it accidentally clears away real obstacles.

For example, while navigating the robot runs alongside a wall and builds up a line of obstacles in the costmap. Eventually it decides to turn, and as the range value maxes out, it clears a whole chunk of the real obstacle. Now there's a hole in the wall, so it tries to navigate toward the hole and relearns that it is indeed an obstacle. This repeats forever. It's both funny and infuriating.
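One workaround I've been considering (my own idea, so I may be missing something) is to pre-filter the Range messages before they reach the costmap layer: drop any reading at or just below max range, so "no echo" pings neither mark phantom obstacles nor clear real ones. The filter itself would be trivial:

```python
# Hypothetical pre-filter for sonar readings: discard readings at (or just
# below) the sensor's max range before the costmap layer sees them, so
# they neither mark phantom obstacles nor clear real ones. The limits
# mirror the simulated sonar in my URDF (0.15 m to 1.5 m).

def keep_reading(range_m, min_range, max_range, tolerance=0.02):
    """Return True if a Range reading is a trustworthy echo.

    Readings below min_range are unreliable, and readings within
    `tolerance` of max_range mean "no echo", so both are discarded
    rather than marked or cleared.
    """
    return min_range <= range_m < max_range - tolerance

print(keep_reading(0.5, 0.15, 1.5))   # True: real echo inside the band
print(keep_reading(1.5, 0.15, 1.5))   # False: max range, no echo
print(keep_reading(0.1, 0.15, 1.5))   # False: below sensor minimum
```

A small relay node would republish only the readings that pass, on a separate topic that the sonar_layer subscribes to. But that feels like papering over the problem, which is why I'm asking.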

Here are the YAML files I'm using for my costmap:

costmap_common_params.yaml

map_type: costmap
origin_z: 0.0
z_resolution: 1
z_voxels: 2

obstacle_range: 0.5
raytrace_range: 0.5

footprint: [[-0.21, -0.165], [-0.21, 0.165], [0.21, 0.165], [0.21, -0.165]]
footprint_padding: 0.1

plugins:
- {name: sonar_layer, type: "range_sensor_layer::RangeSensorLayer"}
- {name: inflater_layer, type: "costmap_2d::InflationLayer"}

sonar_layer:
  ns: ""
  topics: ["/sonar"]
  no_readings_timeout: 1.0
  clear_threshold: 0.2
  mark_threshold: 0.80
  clear_on_max_reading: false

inflater_layer:
  inflation_radius: 0.3

local_costmap_params.yaml

local_costmap:
   global_frame: odom
   robot_base_frame: base_link
   update_frequency: 20.0
   publish_frequency: 5.0
   width: 10.0
   height: 10.0
   resolution: 0.05
   static_map: false
   rolling_window: true

global_costmap_params.yaml

global_costmap:
  global_frame: odom
  robot_base_frame: base_link
  update_frequency: 20
  publish_frequency: 5
  width: 40.0
  height: 40.0
  resolution: 0.05
  origin_x: -20.0
  origin_y: -20.0
  static_map: true
  rolling_window: false

In my robot URDF I include the sonar_sensor macro and instantiate my sonar sensor like so:

<xacro:sonar_sensor name="sonar" parent="front_mount" ros_topic="sonar" update_rate="10" min_range="0.15" max_range="1.5" field_of_view="${10*PI/180}" ray_count="3" visualize="true">
  <origin xyz="0.0 0 0.05" rpy="0 0 0"/>
</xacro:sonar_sensor>

I'm not sure what's going on here. I'd appreciate any help.

2019-05-21 04:32:46 -0500 received badge  Great Question (source)
2019-05-10 20:28:20 -0500 received badge  Favorite Question (source)
2019-01-01 19:14:38 -0500 received badge  Favorite Question (source)
2018-12-30 00:52:47 -0500 marked best answer Confused about coordinate frames. Can someone please explain?

I've read:

However, I'm still quite confused about the different coordinate frames: map, odom, and base_link. Below I'll explain my current understanding of each, along with some questions.

map

  • I think this one makes the most sense conceptually.
  • I consider this the "ground truth" which is the real world.
  • REP-0105 calls this a "world fixed frame", which makes sense, because it's tied to the real world.

odom

  • REP-0105 calls this a "world fixed frame". What? How is this fixed? Isn't this supposed to represent the movement of a mobile robot?
  • This frame is computed from an odometry source like wheel encoders or an IMU. Makes sense.

base_link

  • Why is base_link a separate frame? Shouldn't odom and base_link be the same thing? Doesn't odom represent the robot base and its movement?

I feel like I partially understand frames and tf. For example, if you have sonar sensors mounted on a robot, you may want to express a sensor reading in the base_link frame instead of the sonar sensor's frame. I get that.

But the odom and base_link frames confuse me. Can someone please explain what the difference is?
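To make my current (possibly wrong) mental model concrete, here's a toy 2D sketch of how I picture the chain map -> odom -> base_link: odom -> base_link is the continuous dead-reckoned pose, and map -> odom is a correction that absorbs the accumulated drift. The numbers are made up:

```python
import math

# Toy 2D illustration of the frame chain map -> odom -> base_link.
# odom -> base_link: pose from dead reckoning (continuous, but drifts).
# map -> odom: correction from a localizer; it "absorbs" the accumulated
# drift so that the composed map -> base_link pose stays globally accurate.

def compose(a, b):
    """Compose two 2D poses (x, y, theta): apply b in a's frame."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

# Dead-reckoned pose in the odom frame (drifted 0.3 m too far in x).
odom_to_base = (5.3, 2.0, 0.0)
# Localizer's estimate of where the odom frame sits in the map frame.
map_to_odom = (-0.3, 0.0, 0.0)

# The robot's globally consistent pose is the composition of the two.
map_to_base = compose(map_to_odom, odom_to_base)
print(map_to_base)  # (5.0, 2.0, 0.0)
```

If that's right, it would explain why odom can be called "world fixed" (the frame itself doesn't move; only the base_link pose expressed in it does) - but I'd appreciate confirmation.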

2018-10-04 11:13:28 -0500 marked best answer octomap, slam, path planning: how does it all fit together?

My goal is to simulate the flight of a UAV in an outdoor environment. My preliminary steps are to see what ROS packages exist which I can reuse, and determine what I need to write myself.

I've looked for projects which have overlap with mine:

However I'm still trying to figure out how the following topics fit together:

  • SLAM
  • OctoMap
  • Path Planning

I'm not sure how they're intertwined, so I have several questions for experts out there:

OctoMap

  • Is it only good for generating 3D probabilistic maps? Does it offer solutions for localization? Or is that out of scope?
  • Is it capable of efficiently representing large geometric areas as obstacles? For example if I have a house I want to avoid, can I place a large rectangular prism in the map?
  • Does it have support for dynamic obstacles? For example other drones or moving vehicles?
  • Are octomaps even necessary for mobile drone applications (or outdoor applications in general)? Do people roll their own mapping solutions or does everyone use octomap?
  • Is there a certain data type (or sensor type) that the octomap supports or requires? What if I have sonar, RGBD, LIDAR, monocular camera, etc.?
  • What if an octomap gets too big? Is it possible to have a moving window filter around an area of interest (i.e. the drone) and "forget" the rest of the map (or dump to disk)?

Path Planning

  • Can I view path planning as a function which takes in an octomap and returns a 3D path according to the planner algorithm?

SLAM

  • Does a SLAM algorithm populate the octomap? Or does the SLAM algorithm work off the existing octomap and sensor data? What does this workflow look like?
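To make that last question concrete, here's the workflow as I currently picture it (which may be wrong): a SLAM front end populates an occupancy representation, and the planner is just a function of that map plus start and goal. In this toy sketch a set of blocked voxels stands in for an OctoMap, and breadth-first search stands in for whatever planner algorithm:

```python
from collections import deque

# Toy sketch of "path planning as a function of the map": the planner
# consumes an occupancy representation (a set of blocked voxels standing
# in for an OctoMap) plus start/goal, and returns a path. A SLAM front
# end would be what populates the map; the planner only reads it.

def plan(blocked, start, goal, size):
    """Breadth-first search over a size x size x size voxel grid."""
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
             (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    prev = {start: None}          # doubles as the visited set
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:           # reconstruct path by walking prev links
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        for dx, dy, dz in moves:
            nxt = (cur[0] + dx, cur[1] + dy, cur[2] + dz)
            if (all(0 <= c < size for c in nxt)
                    and nxt not in blocked and nxt not in prev):
                prev[nxt] = cur
                queue.append(nxt)
    return None                   # no path exists

# A wall of blocked voxels with a single gap forces the planner through it.
blocked = {(1, y, z) for y in range(3) for z in range(3)} - {(1, 1, 1)}
path = plan(blocked, (0, 0, 0), (2, 2, 2), size=3)
```

The algorithm here is beside the point; what I want to confirm is the interface: map in, path out, with SLAM writing the map and the planner only reading it.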
2018-10-04 11:13:25 -0500 received badge  Good Question (source)