2017-08-14 02:06:04 -0500 | marked best answer | openni2_tracker problems Hi! I am currently using ROS Groovy and I recently decided to move from OpenNI 1.5 to OpenNI 2. I tried openni2_tracker and managed to make it work, but I have two problems:
Does someone have any information? Thanks, Caroline |
2016-08-08 00:25:07 -0500 | marked best answer | Move_base and costmap_layers Hi. I would like to use a new layer in the costmap. I followed this tutorial to create a new layer: http://wiki.ros.org/costmap_2d/Tutorials/Creating%20a%20New%20Layer Now my layer is defined, but when I launch move_base the plugin is not loaded. I couldn't find where I should declare my layer... Here is my launch file: </launch> Here is the console output: I was expecting the plugin simple_layer to be used, but it isn't. What did I miss? Thanks, Caroline |
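A common cause of a custom layer not loading is that it is never listed in the costmap's `plugins` parameter; move_base only instantiates layers named there. Below is a minimal sketch of such a list, assuming the layer class is exported as `simple_layer_namespace::SimpleLayer` as in the linked tutorial — the layer names and the choice of file are illustrative, not taken from the question's actual configuration:

```yaml
# Sketch of a plugins list, e.g. in global_costmap_params.yaml (assumed file).
# Each entry needs a registered pluginlib class in its "type" field.
global_costmap:
  plugins:
    - {name: static_layer,    type: "costmap_2d::StaticLayer"}
    - {name: obstacle_layer,  type: "costmap_2d::ObstacleLayer"}
    - {name: simple_layer,    type: "simple_layer_namespace::SimpleLayer"}
    - {name: inflation_layer, type: "costmap_2d::InflationLayer"}
```

Note that once a `plugins` list is given, only the layers in it are loaded, so the standard static/obstacle/inflation layers must be listed explicitly alongside the custom one.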
2016-03-17 13:54:34 -0500 | marked best answer | Gmapping localization problem Turtlebot Hello, I am currently experimenting with the gmapping package on the Turtlebot 2. To create the map I am using a Hokuyo laser and not the Kinect. Everything works fine at the beginning, but at one point the Turtlebot has a localization problem: it is at the right place, and one second later it jumps several meters away in the map in rviz. It seems that at the same time I get this error message: [ERROR] [1363969960.064180503]: Transform from base_link to odom failed After that the map is not precise anymore. Do you know what I can do to solve this? How can this transform fail, and why? Thanks for the help, Caroline |
2016-03-02 18:21:58 -0500 | marked best answer | Error starting Turtlebot 2 Hey, I am currently using a Turtlebot 2 with ROS Groovy. When I start the script minimal.launch from the turtlebot_bringup package on the robot's laptop, I get the following error: It seems that the problem comes from the bumper2pc node, because if I comment out the include tag corresponding to bumper2pc I don't get the error... Moreover, if I connect the Kobuki base to my desktop computer, I can start the script and everything works perfectly. All packages are up to date on both computers. How can I solve this problem? Caroline |
2016-01-31 09:20:04 -0500 | received badge | ● Good Question (source) |
2015-11-28 16:17:13 -0500 | marked best answer | OpenNi/NITE : Segmentation fault with openni_tracker Hello, I am currently using the openni_tracker package to track a person with a Kinect. I am working with Ubuntu 12.04 and ROS Groovy. I managed to install the package, and I needed to install NiTE v1.5.2.21 to make it work (see this ticket: http://answers.ros.org/question/42654/openninite-incompatible-in-fuertegroovy-on-precise/). But the program crashes with a segmentation fault when:
I ran it with gdb and got this information about the error: I found the same error reported last year in this thread (https://groups.google.com/forum/#!msg/openni-dev/259RT7dVSy4/tz1mwiFgejEJ), but nobody seemed to have a solution at the time, except removing the NITE library libXnVFeatures_1_5_2.so, which I would like to avoid... Does someone have any news or solutions for this problem? Caroline |
2015-11-28 16:16:06 -0500 | marked best answer | Kobuki : eigen unaligned array Hi everyone, I am currently working with a Turtlebot 2 with a Kobuki base. I have a problem when testing the auto docking, following the tutorial here: http://ros.org/wiki/kobuki/Tutorials/Testing%20Automatic%20Docking When I try to launch the script compact.launch I get the following error: In the same way, when I launch the file minimal.launch from the turtlebot_bringup package I get the same error: |
2015-05-28 22:26:17 -0500 | marked best answer | Dynamixel problem : No motors found Hi! I just got a new computer on which I installed ROS Groovy. I am trying to use Dynamixel motors, but I get an error. The error doesn't appear every time I launch the script, which is why I don't understand what's happening. Below is the error: Here is my launch file: The strange thing is that I can launch the script several times and get the error, and a second later (without changing anything) I launch it again and it works perfectly well. So the motors are connected and recognized by the computer. Does someone have an idea why this happens? Thanks, Caroline |
2015-03-30 04:43:22 -0500 | received badge | ● Favorite Question (source) |
2014-04-20 06:53:47 -0500 | marked best answer | Bumper in the gmapping costmap (Turtlebot) Hello, I am trying to use the bumper scans to avoid small obstacles with gmapping, so I added this bump scan to the costmap_common_params.yaml file in the turtlebot_navigation package. Here is what happens: when the robot bumps into an obstacle, the obstacle is put in the costmap; when the robot goes backward, the bumper is no longer pressed and the laser sensor does not see the obstacle, so the obstacle is instantly removed from the costmap. I would like the obstacle to stay longer in the costmap so that the navigation stack would change the path to avoid it. Is it possible to modify this in the costmap? Thanks, Caroline UPDATE: yaml file Where would you add the min_height parameter? I tried with "scan: {data_type: LaserScan, topic: /scan, marking: true, clearing: true, min_obstacle_height:0.30}" but the costmap no longer appears in rviz. |
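Two details worth noting for a configuration like the one quoted above: YAML requires a space after the colon (`min_obstacle_height:0.30` is parsed as one opaque string rather than a key/value pair, which can silently invalidate the whole costmap config), and costmap_2d observation sources accept an `observation_persistence` parameter that keeps readings marked for a while after the sensor stops reporting them — which is exactly the "obstacle should stay longer" behavior asked about. A hedged sketch (the `bump` topic name and its data type are assumptions, not taken from the question's files):

```yaml
# Sketch for costmap_common_params.yaml; source names/topics are assumptions.
scan:
  data_type: LaserScan
  topic: /scan
  marking: true
  clearing: true
  min_obstacle_height: 0.30   # note the space after the colon
bump:
  data_type: PointCloud2      # assumed message type for the bumper "scan"
  topic: /bump                # hypothetical topic name
  marking: true
  clearing: false             # don't let this source clear its own marks
  observation_persistence: 5.0   # keep bump readings in the costmap for 5 s
```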
2014-04-20 06:53:43 -0500 | marked best answer | Bumpers with Turtlebot 2 I would like to use a safety controller with my Turtlebot 2, so that I can take the bumpers' state into account when navigating. Is there a safety controller that has been created for the Turtlebot 2, or should I use the same one as for the Kobuki base? Is there a nodelet manager started in turtlebot_bringup/minimal.launch? UPDATE: When I tried the tutorial http://www.ros.org/wiki/kobuki/Tutorials/Write%20your%20own%20controller%20for%20Kobuki using only kobuki_node/minimal.launch, it works. But I would like to use turtlebot_bringup/minimal.launch, and no nodelet manager seems to be started... Can someone help me? |
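If no nodelet manager is running, one can be started explicitly and the controller nodelet loaded into it. The sketch below is illustrative only: the manager name is made up, and the `kobuki_safety_controller/SafetyControllerNodelet` plugin name should be verified against the installed package (e.g. with `rospack plugins --attrib=plugin nodelet`):

```xml
<launch>
  <!-- explicit nodelet manager; the name is a placeholder, not from turtlebot_bringup -->
  <node pkg="nodelet" type="nodelet" name="mobile_base_nodelet_manager" args="manager"/>

  <!-- load the safety controller into that manager (plugin name assumed) -->
  <node pkg="nodelet" type="nodelet" name="kobuki_safety_controller"
        args="load kobuki_safety_controller/SafetyControllerNodelet mobile_base_nodelet_manager"/>
</launch>
```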
2014-04-20 06:53:39 -0500 | marked best answer | No depth data with kinect on Turtlebot 2 Hello, I am using a Turtlebot 2 with Groovy. I am trying to follow the 3D visualization tutorial in the Turtlebot tutorials, but when I want to visualize things in rviz I see the image from the Kinect but no depth data and no laser scan. I don't get any error message in rviz; nothing just appears on the screen. The strange part is that when I connect the Kinect to my desktop computer instead of the Turtlebot's laptop (and use the same launch files), everything works fine. Do you have any clues? Caroline |
2014-04-20 06:53:29 -0500 | marked best answer | Hokuyo_node with groovy Hello, I am trying to use a Hokuyo laser with Groovy. I followed the tutorial http://www.ros.org/wiki/hokuyo_node/Tutorial/UsingTheHokuyoNode but I have a problem when I want to see the laser scan in rviz. The config file /hokuyo_test.vcg was created for Fuerte and is no longer used with Groovy, so when I launch rviz nothing happens. Any advice? Caroline |
2014-04-20 06:53:09 -0500 | marked best answer | Use a known map to do navigation Hi! Is it possible to use a known map (saved as .pgm and .yaml) to do localization and navigation in rviz? I am currently using the Turtlebot in the Gazebo simulator. I created a map using gmapping and the keyboard teleop, and I saved it using the map_server package, which gave me two files (one .pgm and one .yaml). I tried to follow the tutorial about the PR2 (pr2_simulator/Tutorials/UsingCustomMapForSimulation) and adapt it to my case. When I start rviz I can see my map, but the robot is not able to move and navigation is not working. Furthermore, I can see errors about the fixed frame /map. Which parameters should I change? Below is my launch script: Thanks for the help. Update: The global error is: "Fixed frame /map doesn't exist." And in the RobotModel display I also have errors: "No transform from [base_footprint] to [/map] No transform from [base_link] to [/map] No transform from [camera_link] to [/map] ..." for each frame. |
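"Fixed frame /map doesn't exist" usually means nothing is publishing the map frame or the map→odom transform; map_server provides the map itself, and a localization node such as amcl provides the transform. A minimal sketch of wiring the two together (the package name and map path are placeholders, and the amcl example launch file should be checked against the installed version):

```xml
<launch>
  <!-- serve the saved map; the path is a placeholder for your own .yaml file -->
  <node pkg="map_server" type="map_server" name="map_server"
        args="$(find my_pkg)/maps/my_map.yaml"/>

  <!-- amcl publishes map -> odom, which makes the /map fixed frame exist in rviz -->
  <include file="$(find amcl)/examples/amcl_diff.launch"/>
</launch>
```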
2014-03-11 08:11:06 -0500 | received badge | ● Famous Question (source) |
2014-02-03 05:19:01 -0500 | received badge | ● Famous Question (source) |
2014-01-28 17:32:06 -0500 | marked best answer | OpenNI 2 and Groovy Hi! I am currently using ROS Groovy with my robot. I was using OpenNI and NITE 1.5, but I had some bugs, so I would like to change to OpenNI 2. I found the openni2_camera drivers for Groovy. But I was also using openni_tracker with Groovy, and the new version, openni2_tracker, is only available for Hydro. Does someone know what I can do? Is the Hydro version compatible with Groovy? Same question for using the Kinect: I saw that openni2_camera does not work with the Kinect, so I will try freenect_stack, but will openni2_tracker work with the Kinect and freenect_stack? Thanks for the information, Caroline |
2014-01-28 17:31:56 -0500 | marked best answer | Layered costmap problem Hi. I am currently using the layered costmap found on the "costmap-plugins-renamed-layers" branch of the navigation stack repository. I would like to use the layered costmap and add a new layer to it. If I keep the default parameters I can see a costmap with the obstacles, but as soon as I put a list of plugins in the yaml file I can't see the obstacles anymore, and no error message appears. Here is my yaml file: I tried the simple layer tutorial and it works; when I add it to the list I can see the result in the costmap, so the problem seems to come from the obstacle layer. Did anyone have this problem before? EDIT 1: Other yaml files: costmap_common_params.yaml global_costmap_params.yaml Caroline |
2014-01-28 17:30:48 -0500 | marked best answer | Multi threading Hi, I am trying to understand the different options for multi-threading with ROS. I found several: use the MT nodelet interface (with getMTNodeHandle()), use MultiThreadedSpinner or AsyncSpinner, or use boost::thread. I have a few questions about that.
Thanks for the help; I hope someone will be able to help me clarify this. Caroline |
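The three options listed in the question differ mainly in who services the callback queue. The core pattern behind MultiThreadedSpinner and AsyncSpinner — several threads draining one callback queue concurrently, with shared state protected by a lock — can be sketched in plain Python. This is an illustration of the concept only, not the rospy/roscpp API (the function name is made up):

```python
import queue
import threading

def spin_with_threads(callback_queue, num_threads=4):
    """Drain (callback, message) pairs with a pool of threads,
    roughly what a multi-threaded spinner does with its callback queue."""
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            item = callback_queue.get()
            if item is None:          # sentinel: shut this worker down
                break
            cb, msg = item
            out = cb(msg)
            with lock:                # callbacks may run concurrently
                results.append(out)

    threads = [threading.Thread(target=worker) for _ in range(num_threads)]
    for t in threads:
        t.start()
    for _ in threads:
        callback_queue.put(None)      # one sentinel per worker, after the work
    for t in threads:
        t.join()
    return results

q = queue.Queue()
for i in range(5):
    q.put((lambda msg: msg * 2, i))
print(sorted(spin_with_threads(q)))   # [0, 2, 4, 6, 8]
```

The practical difference among the roscpp options is the same as here: with one worker thread, callbacks are serialized (like the default single-threaded spinner); with several, callbacks can run in parallel and must guard shared data.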
2014-01-28 17:30:23 -0500 | marked best answer | Kobuki auto-docking error Hi, I am currently working with a Turtlebot 2 with a Kobuki base. I tried to follow the tutorial http://ros.org/wiki/kobuki/Tutorials/TestingAutomaticDocking I can start the Kobuki node with no error, but as soon as I start the auto-docking nodelet I get this error: Did someone experience the same problem? Or does someone have an idea why I get this? Caroline |
2014-01-28 17:30:08 -0500 | marked best answer | Kinect not recognized with openni.launch Hello, I am currently using two different computers to work with my robot; both are on ROS Groovy. I tried to use the Kinect with "roslaunch openni_launch openni.launch", and it works on one of the computers but not the other. The only difference between the two computers is that I did some updates on the one that is not working. The error message I get is: "No device connected" I read some other questions about this problem, so my Kinect is connected to a USB 2.0 port, and I tried "rmmod -f gspca_kinect", but it's not working. I read these questions: http://answers.ros.org/question/60562/ubuntu-12042-and-openni_launch-not-detecting-kinect-after-update/ http://answers.ros.org/question/60576/microsoft-kinect-not-connected/ But it seems that nobody has found a solution to the problem yet. Caroline |
2014-01-28 17:30:07 -0500 | marked best answer | Turtlebot2 power for laptop Hello, I would like to use the battery of the Kobuki base of my Turtlebot to power the laptop, but only when the Kobuki base is on the docking station. For that I need to enable or disable the external power output of the Kobuki. I found the function "Command Command::SetExternalPower(const DigitalOutput &digital_output, Command::Data &current_data)" in the file kobuki_driver/src/driver/command.cpp, and if I am right, this function allows us to enable or disable the external power of the Kobuki. But I don't know how to use it. Is there any documentation somewhere? Or examples? Or should I use another function? Thanks for the help. Caroline |
2014-01-28 17:30:05 -0500 | marked best answer | Move a robot head toward a 3D point Hello, I am working with a Kinect and two Dynamixel motors (to have a pan/tilt). I would like to send commands to the motors so that the Kinect looks at a point in 3D space. I managed to make the Kinect look in a specified direction using the pan motor, but that was only in 2D space, so I used Euler angles and trigonometry. Doing the same in 3D space would be much more complicated. I would like to know if there are tools to help with such things. I heard that tf might help me, but I don't know how to use it. Can someone help me? Thanks Caroline |
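For the geometry itself, once the target point is expressed in the pan/tilt unit's own frame (which is the step tf would handle, via a transform lookup), the two angles reduce to two atan2 calls. A sketch under the usual ROS body-frame convention (x forward, y left, z up); the function name is made up for illustration:

```python
import math

def pan_tilt_to_target(x, y, z):
    """Pan/tilt angles (radians) that point the optical axis at (x, y, z),
    with the target expressed in the pan/tilt unit's own frame
    (x forward, y left, z up -- the usual ROS body-frame convention)."""
    pan = math.atan2(y, x)                   # rotate about z toward the target
    tilt = math.atan2(z, math.hypot(x, y))   # then elevate toward it
    return pan, tilt

# A target straight ahead and level needs no motion:
print(pan_tilt_to_target(1.0, 0.0, 0.0))    # (0.0, 0.0)
# A target 45 degrees to the left:
pan, tilt = pan_tilt_to_target(1.0, 1.0, 0.0)
print(round(math.degrees(pan), 1))          # 45.0
```

In a real setup the missing piece is transforming the target point from whatever frame it is observed in (e.g. the camera frame) into the pan/tilt base frame before applying this math; that is exactly the lookup tf provides.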
2014-01-28 17:29:50 -0500 | marked best answer | Hokuyo URG-04LX-UG01 Measuring area I am trying to use the hokuyo node to get scan data from the Hokuyo URG-04LX-UG01 laser. When I visualize the laser scan in rviz, I only get a measuring area from -90° to +90°, while the measuring area of the laser should be 240°. So I tried to change the min_angle and max_angle parameters of the hokuyo node, but the maximum measuring area I can get is from -120° to +90°. Do you have any idea why I can't get data between +90° and +120°? Thanks for the help, Caroline |
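One thing worth checking in a case like this: hokuyo_node's angle parameters are given in radians, not degrees, and on at least some versions they are named `~min_ang`/`~max_ang` rather than `min_angle`/`max_angle` (verify against your installed version's wiki page). The full 240° field corresponds to roughly ±2.0944 rad around the front of the device:

```python
import math

# The 240 deg field of the URG-04LX-UG01 spans +/-120 deg around the front;
# hokuyo_node expects these limits in radians.
min_ang = math.radians(-120)
max_ang = math.radians(120)
print(round(min_ang, 4), round(max_ang, 4))   # -2.0944 2.0944
```

If the requested range still gets clipped, the driver may be limiting it to what the device reports as its capable range, so comparing against the diagnostics output is a reasonable next step.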