
nyquist09's profile - activity

2019-06-14 04:50:46 -0500 received badge  Nice Answer (source)
2017-02-17 21:12:47 -0500 received badge  Famous Question (source)
2016-06-23 11:59:45 -0500 received badge  Popular Question (source)
2016-06-23 11:59:45 -0500 received badge  Notable Question (source)
2016-02-06 09:21:30 -0500 received badge  Self-Learner (source)
2015-11-12 22:27:47 -0500 received badge  Good Question (source)
2015-11-05 17:47:08 -0500 asked a question consistency limits

I am using two UR5 arms with MoveIt. Right now I am using them in simulation only. I have set everything up with the MoveIt Setup Assistant and am trying to run it.

Even when the arms are standing still, I get an error message from the kinematics solver KDLKinematicsPlugin:

Consistency limits be empty or must have size 6 instead of size 8

Does anybody know what causes this problem?

2015-11-04 19:03:59 -0500 answered a question XBox Joystick - detecting signal loss

There is no such option as far as I know. If the connection is lost, the last valid command just keeps being sent.

A workaround would be a deadman switch that needs to be toggled. Basically, before processing a joystick command you check whether the deadman switch has been toggled within a certain interval (maybe 0.2 secs). If it has not been toggled, you don't send the joystick command to the robot. If you lose the connection, the effect is then the same as not toggling the deadman. The side effect is that you always have to keep toggling the deadman switch while driving the robot with your joystick.
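Something like this minimal rospy sketch could do the gating (topic names, deadman button index and axis mapping are assumptions you would adapt to your setup):

#!/usr/bin/env python
# Forward joystick commands only while the deadman button was pressed
# within the last 0.2 s; if the link drops, publishing simply stops.
import rospy
from sensor_msgs.msg import Joy
from geometry_msgs.msg import Twist

DEADMAN_BUTTON = 0  # assumed button index

class DeadmanRelay(object):
    def __init__(self):
        self.last_press = rospy.Time(0)
        self.timeout = rospy.Duration(0.2)
        self.pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('joy', Joy, self.joy_cb)

    def joy_cb(self, msg):
        if msg.buttons[DEADMAN_BUTTON]:
            self.last_press = rospy.Time.now()
        if rospy.Time.now() - self.last_press < self.timeout:
            cmd = Twist()
            cmd.linear.x = msg.axes[1]   # assumed axis mapping
            cmd.angular.z = msg.axes[0]
            self.pub.publish(cmd)
        # else: drop the command, which has the same effect as a lost link

if __name__ == '__main__':
    rospy.init_node('deadman_relay')
    DeadmanRelay()
    rospy.spin()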

2015-10-28 11:01:33 -0500 received badge  Enthusiast
2015-10-23 02:00:00 -0500 received badge  Nice Answer (source)
2015-10-21 20:16:22 -0500 received badge  Nice Question (source)
2015-10-21 20:05:59 -0500 received badge  Scholar (source)
2015-10-21 20:02:07 -0500 answered a question How to prevent costmap to clear obstacles that are too close for the kinect to detect?

I was able to fix the issue.

I am using the pointcloud_to_laserscan node and tell it to transform the point cloud into a frame where the point cloud actually begins (at the border of the deadzone). The next step is to tell the costmap that my newly created sensor sits at the border of the deadzone. As a result, the raytracing starts from the origin of this newly created laser, so obstacles in the deadzone don't get raytraced away, because the fake laser sits in front of the deadzone.
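For reference, the extra frame at the deadzone border can be published like this (frame names and the 0.5 m offset are assumptions for my setup; pointcloud_to_laserscan's target_frame and the costmap's sensor_frame then both point at this new frame):

#!/usr/bin/env python
# Publish a static "fake laser" frame at the border of the Kinect's deadzone,
# so costmap raytracing starts there instead of at the camera itself.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

if __name__ == '__main__':
    rospy.init_node('fake_laser_frame')
    br = tf2_ros.StaticTransformBroadcaster()
    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = 'camera_link'          # assumed Kinect frame
    t.child_frame_id = 'camera_deadzone_link'  # new frame at the deadzone border
    t.transform.translation.x = 0.5            # assumed deadzone depth in metres
    t.transform.rotation.w = 1.0               # identity rotation
    br.sendTransform(t)
    rospy.spin()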


2015-10-21 15:46:40 -0500 commented answer How to prevent costmap to clear obstacles that are too close for the kinect to detect?

Thanks! I'd like to have a specific area around the robot that is not touched by the costmap, so that obstacles observed in this area while the robot was somewhere else would not get raytraced out. I agree, it is a custom problem.

2015-10-21 15:02:42 -0500 commented answer How to prevent costmap to clear obstacles that are too close for the kinect to detect?

@David Lu would you know the easiest way to tell the costmap that a certain corridor of a voxel layer is restricted from clearing obstacles? (The area right in front of the robot, where it is blind.)

2015-10-21 11:45:05 -0500 answered a question Filtered odometry drifts when no measurements received

Based on your experience in your updated answer, I would intuitively say that the noise parameters of your odometry are off, so you would need to tune your Kalman filter. The tuning parameters of a Kalman filter are the process and observation noise levels (assuming your model is correct and can be simplified well enough as having Gaussian noise).

Could you try increasing the process noise level of the odometry? You don't want the filter to drift between measurement updates (of your visual pose); it should always yield the best possible estimate. I don't know the package you are using, so I can't tell you exactly where to change the parameters.

Remember: tuning a Kalman filter can be tedious.
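To illustrate what the process noise does, here is a tiny 1D Kalman filter sketch (made-up numbers, not tied to any particular ROS package): with a small q the filter stays confident in its drifting prediction during a measurement outage, while a large q lets the next measurement pull the estimate back much harder.

# 1-D Kalman filter with a random-walk model; None marks a missing measurement.
def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
    x, p = x0, p0
    out = []
    for z in measurements:
        p = p + q                # prediction: uncertainty grows by the process noise q
        if z is not None:
            k = p / (p + r)      # Kalman gain
            x = x + k * (z - x)  # correct the estimate toward the measurement
            p = (1.0 - k) * p
        out.append((round(x, 3), round(p, 3)))
    return out

zs = [1.0, 1.1, None, None, None, 1.2]  # measurements drop out in the middle
print(kalman_1d(zs, q=0.01, r=0.1))     # small process noise: slow covariance growth
print(kalman_1d(zs, q=1.0, r=0.1))      # large process noise: strong correction at the next fix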

2015-10-21 11:18:01 -0500 received badge  Supporter (source)
2015-10-21 11:13:23 -0500 received badge  Editor (source)
2015-10-21 04:27:34 -0500 received badge  Teacher (source)
2015-10-20 19:11:17 -0500 answered a question doubt regarding cmd_vel published by move_base

Your planner samples trajectories and scores them. With vx_samples and vtheta_samples you define how the linear and angular velocity spaces are sampled.

That means: if you have vx_samples: 3 and vtheta_samples: 20, the planner simulates trajectories for 3 different linear velocity inputs and 20 different angular velocity inputs. So in the end a trajectory built from one of the discrete vx_samples is scored, i.e. /cmd_vel is set to one of the 3 distinct vx_samples.

What you can do is increase vx_samples to something like 10-20 to allow for a finer discretization of your velocity sample space, i.e. the base controller can choose between 10-20 different velocities.
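Just to make the discretization concrete, here is a small sketch (the velocity limits are made up; in practice they come from your base_local_planner configuration):

import numpy as np

min_vel_x, max_vel_x = 0.1, 0.5           # assumed linear velocity limits [m/s]
min_vel_theta, max_vel_theta = -1.0, 1.0  # assumed angular velocity limits [rad/s]

vx_samples, vtheta_samples = 3, 20

# These are the only velocities the planner will score,
# so /cmd_vel only ever takes one of these discrete values.
vx_grid = np.linspace(min_vel_x, max_vel_x, vx_samples)
vtheta_grid = np.linspace(min_vel_theta, max_vel_theta, vtheta_samples)

print(vx_grid)      # [0.1 0.3 0.5] -> only 3 possible forward speeds
print(vtheta_grid)  # 20 possible angular speeds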

2015-10-20 18:51:34 -0500 answered a question Navigation stack hits obstacles

As far as I understand it, the problem is that your obstacles do not get inserted into the costmap.

Could it be that you need to tell the local costmap that it finds the obstacles in the 'obstacles' list from the costmap_common_params.yaml file? Or you could try changing the costmap_common_params.yaml file to:

obstacle_range: 2.5
raytrace_range: 3.0
footprint: [[0.5, -0.45], [-0.5, -0.45], [-0.5, 0.45], [0.5, 0.45], [0.6, 0.0]]
#robot_radius: ir_of_robot
inflation_radius: 0.55
observation_sources: laser_scan_sensor point_cloud_sensor
laser_scan_sensor: {sensor_frame: base_laser, data_type: LaserScan, topic: /scan, marking: true, clearing: true}
point_cloud_sensor: {sensor_frame: base_laser, data_type: PointCloud, topic: /my_cloud, marking: true, clearing: true}

So basically, get rid of the 'obstacles' list.

Update: Did you use the correct frame ID of the laser in costmap_common_params.yaml? As far as I know, the LMS1xx node publishes the LaserScan in the frame 'laser' by default, not 'base_laser'. You need to adjust the sensor_frame parameter if you haven't done so already. So something like this:

inflation_radius: 0.55
observation_sources: laser_scan_sensor point_cloud_sensor
laser_scan_sensor: {sensor_frame: laser, data_type: LaserScan, topic: /scan, marking: true, clearing: true}
2015-10-20 18:25:44 -0500 received badge  Famous Question (source)
2015-10-20 17:28:12 -0500 answered a question catkin_make package that uses mavros_msgs

Did you install the mavros_msgs package? The fact that catkin doesn't find it is suspicious.

You could try to find the package via the command line: $ catkin_find mavros_msgs. If it can't be found, you will need to install the package first.

2015-10-20 15:42:13 -0500 received badge  Notable Question (source)
2015-10-20 15:10:35 -0500 received badge  Popular Question (source)
2015-10-20 15:08:45 -0500 commented answer How to prevent costmap to clear obstacles that are too close for the kinect to detect?

The thing is that the Kinect is elevated and tilted, so if it gets close to a low obstacle on the floor it just "looks" over the object. The scan reading is still valid then, because it simply looks beyond the obstacle. Ideally, I'd find a way to prevent the costmap from clearing obstacles in the deadzone.

2015-10-20 13:41:45 -0500 received badge  Student (source)
2015-10-20 12:29:59 -0500 asked a question How to prevent costmap to clear obstacles that are too close for the kinect to detect?

Hi!

I am using the ROS Navigation stack to navigate through known space. The robot is equipped with a 2D laser scanner and a Kinect.

I want to use the Kinect for obstacles that are too low or too high for the laser to detect. So for the costmap I created a voxel_grid with 3 voxels: the bottom and top layers for the Kinect and the middle layer for the laser scan. I am converting the Kinect's point cloud into two laser scans before using them with the costmap.

Everything works fine except for one problem: when I approach an obstacle that only the Kinect can detect (e.g. a small box on the floor), it works fine until I'm too close for the Kinect to see it (I'm in the deadzone of the Kinect). What happens is that the obstacle gets raytraced out and erased from the costmap.

I tried different approaches, such as setting the Kinect's "clearing" parameter to false, but then dynamic obstacles (e.g. people) get stuck in the costmap.

Is there a straightforward way to prevent the costmap from raytracing out obstacles that are too close?

The robot/Kinect setup looks like this:

[image: robot/Kinect setup]

Thanks for your help!