
micmac's profile - activity

2017-03-23 10:28:26 -0500 received badge  Taxonomist
2015-10-24 20:12:35 -0500 received badge  Nice Question (source)
2014-01-28 17:29:38 -0500 marked best answer is amcl required together with laser_scan_matcher for navigation ?

Is it required to run amcl together with laser_scan_matcher to achieve navigation?

laser_scan_matcher already provides the pose, so would it be possible to launch move_base alone and set the initial pose and a 2D goal in RViz? I use the 2dnav_neato node, which loads the map server, amcl, and move_base, so I thought I could remove amcl and keep only laser_scan_matcher...
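For reference, a minimal launch sketch of the proposed setup, assuming laser_scan_matcher publishes the pose (the map file name is illustrative, not from an actual configuration). Note that move_base still needs a map→odom (or equivalent) transform from somewhere, which is what amcl normally provides:

```xml
<launch>
  <!-- scan matcher provides the pose estimate -->
  <node pkg="laser_scan_matcher" type="laser_scan_matcher_node" name="laser_scan_matcher"/>
  <!-- static map for the global costmap (map path is illustrative) -->
  <node pkg="map_server" type="map_server" name="map_server" args="map.yaml"/>
  <!-- move_base alone, with no amcl -->
  <node pkg="move_base" type="move_base" name="move_base"/>
</launch>
```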

2014-01-28 17:29:33 -0500 marked best answer scan_tools runtime error undefined symbol cblas_dtrsv ubuntu 12.04.2LTS

I have downloaded the scan_tools source code from git: it built successfully, but at runtime I get:

"laser_scan_matcher_node: symbol lookup error: /usr/lib/libgsl.so.0: undefined symbol: cblas_dtrsv".

When I run ldd on the laser_scan_matcher_node binary, I see that libgsl.so.0 is linked, but NOT libgslcblas.

I am on ubuntu 12.04.2LTS.

What can I do to make this work, please?
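If the node's link line is missing GSL's CBLAS implementation, one hedged fix (assuming the standard executable target name used by the package) is to link it explicitly in laser_scan_matcher's CMakeLists.txt:

```cmake
# link GSL's bundled CBLAS so cblas_dtrsv resolves at runtime
target_link_libraries(laser_scan_matcher_node gsl gslcblas)
```

Alternatively, preloading /usr/lib/libgslcblas.so.0 via LD_PRELOAD before running the node is a quick way to confirm the diagnosis.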

2014-01-28 17:29:31 -0500 marked best answer rviz segfault core dumped (tried all OGRE_RTT_MODEs)

Hi,

I am using ROS Fuerte on Ubuntu 12.04.2 LTS. I was running rviz happily until I did an apt-get update/upgrade. Many ROS packages were updated, and now when I run rviz I get the following error:

micmac@xubuntu:~$ rosrun rviz rviz -d /home/micmac/fuerte_workspace/neato_robot/XV11-Kinect.vcg
Segmentation fault (core dumped)

I tried setting OGRE_RTT_MODE (via export) to every supported mode, without success: still a segfault.

Could you please help me?

2014-01-28 17:29:31 -0500 marked best answer odometry precision required for a good navigation

I have been using a neato XV-11 base with an Asus Xtion Pro Live 3D sensor mounted on it for mapping and navigation.

Now I am planning to move to another platform: a cmRobot "the Element" board for motor control and sensor readings.

This board would allow me to use a wide range of motors and motor encoders.

But what motor encoder precision is required for good odometry and successful navigation/mapping?

I have found this wheel and encoder kit: http://www.active-robots.com/motors-wheels/wheel-kits/motor-mount-wheel-kit.html

But the specifications state: "36 encoder positions per revolution; approx 0.5 inch resolution with 6" diameter wheel".

That does not seem very precise (36 encoder positions per revolution). For example, this other wheel kit: http://www.active-robots.com/motors-wheels/wheel-kits/complete-robot-drive-system.html has "360 Encoder counts per output shaft turn" (according to the downloadable EMG30 documentation).

So how do I choose the right encoders and wheels for indoor navigation (a small office)?

Your advice will be much appreciated.
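The quoted resolution can be checked with a quick calculation: the distance travelled per encoder count is the wheel circumference divided by the counts per revolution (the 6-inch wheel is from the quote above; the 360-count variant is shown on the same wheel size purely for comparison):

```python
import math

def linear_resolution(wheel_diameter, counts_per_rev):
    """Distance travelled per encoder count, in the same unit as the diameter."""
    return math.pi * wheel_diameter / counts_per_rev

# 36 counts per revolution on a 6" wheel -> roughly the quoted 0.5" resolution
print(round(linear_resolution(6.0, 36), 3))   # ~0.524 inches per count

# 360 counts per revolution on the same wheel would be ten times finer
print(round(linear_resolution(6.0, 360), 4))  # ~0.0524 inches per count
```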

2014-01-28 17:28:30 -0500 marked best answer how to make the robot rotate towards the path direction

Hello, I have set up the Navigation stack on a Neato xv-11 with an Asus xtion pro live mounted on it (the neato lidar is dead, sadly).

Localization (amcl) is working correctly, and the planner (TrajectoryPlannerROS) computes a good path to follow, but I'm having trouble finding the right parameters to get the robot to start following a new plan.

Let me explain: when the robot is already headed in the path direction, it has no problem following the path and reaching the goal. But when the robot is facing away from the new plan, I wish it would rotate first so as to be aligned with the path direction before following it. Instead it starts rotating a bit to the left, then to the right, as if searching for a way to reach the path, and it also advances slowly in the wrong direction while doing so, until it eventually reaches an obstacle and gets stuck.

I tried to tune the path bias and the goal bias as well as I could. I understand that the goal bias must be high enough for the robot to head towards the goal, but if I set it too high the robot doesn't follow the path closely enough (so my path bias = 26 and goal bias = 20).

If you had any tips on how to solve this problem, that would be greatly appreciated.

Another question I have regarding navigation: how do I configure NavFn to issue a new plan whenever the robot gets stuck on obstacles? Sometimes it does so, but at other times it just keeps trying to follow the current plan and bumps into the same obstacles without replanning.
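The biases mentioned above correspond to TrajectoryPlannerROS cost-function weights, and periodic replanning is governed at the move_base level. A sketch of the relevant settings (the two weights are the values quoted above; everything else is illustrative, not a recommendation):

```yaml
TrajectoryPlannerROS:
  pdist_scale: 26.0            # "path bias": weight for staying close to the global path
  gdist_scale: 20.0            # "goal bias": weight for progressing towards the local goal
  min_in_place_vel_theta: 0.4  # illustrative: minimum in-place rotation speed (rad/s)

# move_base-level knobs that affect replanning when stuck (illustrative values):
# planner_frequency: 1.0       # replan periodically at 1 Hz instead of only on demand
# oscillation_timeout: 10.0    # run recovery behaviors after 10 s of oscillation
```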

2013-10-22 02:18:15 -0500 received badge  Famous Question (source)
2013-09-27 03:47:38 -0500 answered a question robot goes straight to the goal when it is at its left, but always rotates in the wrong direction when the goal is at its right

OK, it was a problem with the base local planner parameters. I could not pinpoint the exact ones yet, but it was probably the velocity/acceleration constraints. I copied the base local planner parameters from a previous configuration, and now it works.
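One constraint worth checking in such cases (an assumption on my part, not confirmed as the actual fix here): TrajectoryPlannerROS samples rotational velocities between min_vel_theta and max_vel_theta, so if min_vel_theta is not negative the planner can never choose a clockwise rotation, and the robot will always turn left. A sketch:

```yaml
TrajectoryPlannerROS:
  max_vel_theta: 1.0     # illustrative value (rad/s)
  min_vel_theta: -1.0    # must be negative so clockwise rotations can be sampled
```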

2013-09-27 03:45:23 -0500 received badge  Famous Question (source)
2013-09-25 00:37:52 -0500 received badge  Notable Question (source)
2013-09-24 20:22:35 -0500 received badge  Popular Question (source)
2013-09-24 04:11:52 -0500 asked a question robot goes straight to the goal when it is at its left, but always rotates in the wrong direction when the goal is at its right

Hi,

I have a non holonomic robot, a neato XV-11 base equipped with an asus xtion pro live (kinect equivalent).

I have set up the navigation stack, and am using TrajectoryPlannerROS as a base local planner, with DWA set to true.

Path planning and obstacle inflation work well (as can be seen in RViz), but when moving, my robot shows weird behavior:

Let's say the robot is pointing in some direction. If I set the goal in RViz somewhere further away but on the left side of the robot, it goes straight to the goal, with a tendency to overturn a bit too much to the left while doing so. Then, when the goal is reached, it rotates correctly to face the goal direction.

But when I set the goal somewhere further away but on the right side, the robot does not start advancing or turning towards the goal immediately; it slowly turns in the wrong direction (rotating to the left) until it has made a full turn, and only then starts to move and reaches the goal correctly.

So it seems that all movements tend to overturn to the left.

I have checked all my transforms (is the laser transform from the base positioned correctly? and so on), and I have also checked the odometry (sending simple goals like moving forward 1 meter and checking the rostopic echo output of /odom), but I did not find anything wrong.

AMCL also seems to work nicely, as the laser scan more or less always matches the contour of the given map.

I'd be glad and thankful for any suggestions.

2013-06-17 05:28:00 -0500 received badge  Famous Question (source)
2013-06-12 04:20:21 -0500 received badge  Popular Question (source)
2013-06-12 04:20:21 -0500 received badge  Notable Question (source)
2013-05-02 20:49:06 -0500 received badge  Notable Question (source)
2013-04-29 12:16:28 -0500 received badge  Famous Question (source)
2013-04-28 22:15:39 -0500 received badge  Popular Question (source)
2013-04-26 00:17:55 -0500 answered a question ccny_rgbd bad odometry with amcl

I found my problem: I am using a fake laser scan from an Asus Xtion, and it only worked when I used the original openni_launch instead of the ccny_openni_launch included in the package.

So I remapped the topics in the fake laser launch file so that the cloud points to /depth/points (as provided by ccny_openni_launch) instead of /camera/depth/points (as provided by the original openni_launch), and it worked fine with ccny_openni_launch.

Changing the calibration file to "calibration_asus_xtion_pro" in the ccny_openni_launch openni.launch file also helped, I think.
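The remap described above would look something like this inside the fake laser node's <node> element in the launch file (the surrounding node definition is omitted):

```xml
<!-- use the cloud topic published by ccny_openni_launch -->
<remap from="/camera/depth/points" to="/depth/points"/>
```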

2013-04-22 22:32:10 -0500 received badge  Famous Question (source)
2013-04-22 22:23:42 -0500 asked a question ccny_rgbd bad odometry with amcl

Hello,

I've been using the fovis_ros mono_odometer with an Asus Xtion Pro Live successfully; then I found ccny_rgbd, which is more lightweight.

The problem I'm having is that while fovis_ros works great, ccny_rgbd gives me terrible odometry with the same hardware: the amcl laser drifts very quickly out of the map, so it cannot localize.

The only parameter I have changed from the default visual_odometry.launch file is "base_frame", which I changed from camera_link to base_link.

I also provide a base_link to camera_link transform manually, so that this transform reflects the position of the camera relatively to the base:

rosrun tf static_transform_publisher -0.035 0 0.12 0 0 0 base_link camera_link 40

So the camera is not exactly centered on the robot base; it is a few centimeters away from the middle of the robot.

This transform works great when used with gmapping, amcl, fovis_ros... so it looks ok to me.

I am wondering whether ccny_rgbd correctly takes the relative position of the camera into account with this transform. Is it supposed to work if the camera is not centered on the robot base?

2013-03-27 06:37:40 -0500 received badge  Notable Question (source)
2013-03-25 11:49:34 -0500 marked best answer can not save map with octomap_server / octomap_saver from rgbdslam ROS Fuerte

Hi,

I have seen on this ROS Answers topic that someone already had trouble saving maps from rgbdslam using octomap. I have exactly the same problem: I click on "Graph" -> Send Model (Ctrl+M) inside the rgbdslam window, and the model and map seem to be sent correctly (30 out of 30 nodes, for example), but when I run "rosrun octomap_server octomap_saver mymap.bt" only 1 node is saved to the map file (so the map is empty in octovis).

Let me explain my setup: rgbdslam from the wiki SVN, adapted with much effort to build in Fuerte with rosmake. I can capture a scene (by pressing space), but no map can be saved with octomap_saver, as said above.

I have also set up the octomap_mapping.launch file accordingly (fixed frame "openni_camera" and "cloud_in" remapped to "/rgbdslam/batch_clouds").
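Based on that description, the octomap_server node configuration might look like this (a sketch, not the poster's actual file):

```xml
<node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
  <param name="frame_id" value="openni_camera"/>
  <remap from="cloud_in" to="/rgbdslam/batch_clouds"/>
</node>
```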

I am also using the experimental branch of octomap. Here are the steps I took to build it in ROS Fuerte:


- downloaded mapping_msgs from svn (ros wiki)
- downloaded point_cloud_perception from svn (ros wiki)
- downloaded geometric_shapes_msgs from svn (ros wiki)
- downloaded octomap_mapping-experimental from svn https://alufr-ros-pkg.googlecode.com/svn/branches/octomap_mapping-experimental/

Error: /home/micmac/fuerte_workspace/octomap_mapping-experimental/octovis/src/octovis/TrajectoryDrawer.cpp:63:5: error: 'GLUquadricObj' was not declared in this scope

> Edited /home/micmac/fuerte_workspace/octomap_mapping-experimental/octovis/src/octovis/TrajectoryDrawer.h to add #include <GL/glu.h>

> and added GL and GLU to target_link_libraries to octomap_mapping-experimental/octovis/CMakeLists.txt

Error: /home/micmac/fuerte_workspace/octomap_mapping-experimental/octomap_ros/src/conversions.cpp:45:114: error: explicit instantiation shall not use 'inline' specifier [-fpermissive]
Error: /home/micmac/fuerte_workspace/octomap_mapping-experimental/octomap_ros/src/conversions.cpp:46:125: error: explicit instantiation shall not use 'inline' specifier [-fpermissive]

> Edited /home/micmac/fuerte_workspace/octomap_mapping-experimental/octomap_ros/src/conversions.cpp to remove 'inline' specifier on both lines 45 and 46

Then rosmake octomap_mapping-experimental : Built 39 packages with 0 failures


So both RGBDSLAM and octomap_mapping-experimental seem to run fine, but I can't save any map using "rosrun octomap_server octomap_saver mymap.bt"

I have also tried publishing a static transform from /rgbdslam/batch_clouds to octomap_server by adding this line to octomap_mapping.launch:

<node pkg="tf" type="static_transform_publisher" respawn="false" name="octomap_tf_conversion" args="0 0 0 0 0 0 /rgbdslam/batch_clouds /octomap_server 10" output="screen"/>

I must also mention that I am using the Asus Xtion Pro Live camera, so the topics may be slightly different here and there (camera_rgb_optical_frame instead of openni_rgb_optical_frame), but I also tried all the required changes without success.

Would you have any advice on how to save a map from rgbdslam with octomap_mapping? I've been spending months on this without success so far...

2013-03-24 07:30:21 -0500 received badge  Famous Question (source)