2018-06-13 11:13:02 -0500 | received badge | ● Good Answer (source) |
2018-06-13 11:13:02 -0500 | received badge | ● Enlightened (source) |
2016-03-07 23:15:12 -0500 | received badge | ● Nice Answer (source) |
2016-01-06 00:14:59 -0500 | received badge | ● Necromancer (source) |
2015-08-23 14:32:32 -0500 | commented answer | turtlebot apt-get update has generated an error on minimal.launch I had the same problem on a cubieboard2, and rebuilding robot_model from source did indeed fix it. Thanks! |
2015-02-15 12:02:25 -0500 | answered a question | pocketsphinx excessively verbose Try playing with the following. E.g. from http://stackoverflow.com/questions/17... : Or write to a file: In Python (which ros pocketsphinx uses), from https://mattze96.safe-ws.de/blog/?p=640 : |
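One common way to tame a chatty node is simply to redirect its stderr, since pocketsphinx prints its configuration dump there. This is a sketch: the `rosrun pocketsphinx recognizer.py` invocation in the comments is an assumption, and the stand-in function below only demonstrates the redirection idea.

```shell
# Stand-in for a noisy node: useful output on stdout, verbose spew on stderr.
noisy_node() {
    echo "recognized: hello world"            # useful output (stdout)
    echo "INFO: cmn.c(175): CMN update" >&2   # verbose spew (stderr)
}

noisy_node 2>/dev/null    # prints only the stdout line

# For the real node (names assumed), the same redirection applies:
#   rosrun pocketsphinx recognizer.py 2>/dev/null        # discard
#   rosrun pocketsphinx recognizer.py 2>recognizer.log   # or keep in a file
```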
2015-02-13 20:38:46 -0500 | received badge | ● Famous Question (source) |
2015-02-13 13:56:28 -0500 | received badge | ● Self-Learner (source) |
2015-02-13 13:26:39 -0500 | answered a question | How to power off Kinect from software I found that the command works, but it needs to be repeated twice in a well-timed way. What works reliably for me is the following: Note that the 2.5 second sleep is crucial - 2 seconds will not work (too fast), nor will 3 seconds (too slow). This was the case on both machines on which I tested it. The sweet spot seems to be when the Kinect USB hub has been detected, but none of the actual Kinect devices have yet: If you have trouble finding the sweet spot, you can simply "bruteforce" it, as follows: Finally, to re-enable: |
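The timed unbind/rebind sequence described above can be sketched as below. This is an assumption-laden sketch: the device name (`1-4`) and the sysfs driver path are machine-specific (find yours with `lsusb -t` or `dmesg`), and the sysfs root is a parameter so the sequence can be dry-run against a scratch directory instead of a live system.

```shell
# Power the Kinect down by unbinding it from the kernel USB driver, twice,
# with the timing-sensitive sleep in between. Run as root on the real path.
kinect_off() {
    local sysfs=${1:-/sys/bus/usb/drivers/usb} dev=${2:-1-4} wait=${3:-2.5}
    echo "$dev" > "$sysfs/unbind"   # first unbind: device gets redetected
    sleep "$wait"                   # 2.5 s sweet spot: hub seen, devices not yet
    echo "$dev" > "$sysfs/unbind"   # second unbind sticks
}

# Rebind the device to re-enable the Kinect.
kinect_on() {
    local sysfs=${1:-/sys/bus/usb/drivers/usb} dev=${2:-1-4}
    echo "$dev" > "$sysfs/bind"
}
```

For real use: `sudo sh -c 'kinect_off'` against the default path, after substituting your own device name.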
2015-02-09 12:23:35 -0500 | commented answer | How to power off Kinect from software ... which is in line with what @MichaelKorn reports. So it seems to make sense to try disabling the kinect completely - as that would save about 0.38A |
2015-02-09 12:23:35 -0500 | received badge | ● Commentator |
2015-02-09 12:22:29 -0500 | marked best answer | How to power off Kinect from software I have a Kinect connected to the onboard Linux-based PC on my robot. However, in many scenarios I am not using the Kinect, so I would like to be able to turn it off from command line to save power. I tried the following:
This works, in the sense that the power light of the Kinect goes off for a moment and the USB device is removed. However, the Kinect gets redetected within a second, re-enabling it. The following results in the same behaviour:
Note that I am using a 3.4 kernel, and the following does not work:
Is there a way to disable the Kinect from software? Thank you! |
2015-02-09 12:22:29 -0500 | commented answer | How to power off Kinect from software I just detected a major flaw in my current measurement setup, and retested: Kinect connected to power supply only: 0.00 A (baseline) - Kinect connected to power supply, and to USB: 0.50A stabilizing at 0.38A - Kinect driver running: 0.38A - Kinect app running: >0.45A |
2015-02-08 02:12:55 -0500 | answered a question | arduino robot The package ros_arduino_bridge is probably what you are looking for if you want to combine Arduino and ROS. See also the related blog post by the author here. I also suggest picking up ROS By Example - Volume 1 as a general introduction to ROS. While not focused on Arduino, it is the best ROS introduction I know of (and, incidentally, is written by the same author as ros_arduino_bridge). EDIT: I should also point out that ros_arduino_bridge is essentially a ROS driver for the Arduino - it allows you to control your Arduino board from within the ROS framework. You will still need an additional, more powerful system to run the actual ROS nodes (Arduino hardware is not sufficiently powerful for that). |
2015-02-07 22:45:38 -0500 | received badge | ● Self-Learner (source) |
2015-02-07 22:45:38 -0500 | received badge | ● Teacher (source) |
2015-02-07 03:34:41 -0500 | commented question | Good map with hector_mapping and with odometry-only - poor map with gmapping @jseal interesting, thanks! Did you ever try setting the minimumScore parameter? Setting this to a very high value (see below) works very well for me. |
2015-02-07 03:32:32 -0500 | commented answer | Good map with hector_mapping and with odometry-only - poor map with gmapping The documentation for minimumScore ("Can avoid jumping pose estimates [...] when using laser scanners with limited range (e.g. 5m).") suggests this might be a common problem with xv11 lidars (5m range). Do other xv11 lidar users have a similar experience with gmapping? |
2015-02-06 23:56:27 -0500 | received badge | ● Famous Question (source) |
2015-02-06 04:44:46 -0500 | answered a question | Good map with hector_mapping and with odometry-only - poor map with gmapping I did a lot of different gmapping experiments - results can be found here. Summary of what I tried that did NOT help:
Summary of what did help: I had most success with setting minimumScore to a very high value, as suggested here, and combining that with setting srr/srt/str to 0 and stt to 0.1 (to reflect rotational error, but near-perfect translational behaviour), i.e.: (Note that the maxUrange and particles values are the defaults; the xmin/ymin/xmax/ymax just make for a smaller map, by specifying much lower starting values than the default 100m.) The result is this (pgm): Note that results change across runs - sometimes they would be slightly worse. In particular, the "_srr:=0 _srt:=0 _str:=0 _stt:=0.1" part does not seem very significant - with all of these set to 0, and particles set to 1, that is with: I get this (pgm): Slightly different, but well within the variance I get by re-running the previous command. Lastly, by simply using the defaults for srr/srt/str/stt, i.e.: I get this (pgm): Again slightly different, but well in line with what I get from the first command. So in summary, changing minimumScore to a very large value solves my problem (I tried 500, 750, 1000, and 10 000, with 10 000 working best). I am still open to better solutions though, as this feels a bit like a hack, disabling a large part of gmapping's 'intelligence'. |
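The parameter combination described above would be passed to slam_gmapping roughly as follows. This is a sketch: the parameter values are the ones quoted in the answer, but the topic remapping and the 10 m map bounds are assumptions for illustration.

```shell
# slam_gmapping with a very high minimumScore (so odometry dominates pose
# updates), near-zero motion-model noise, and a smaller initial map.
rosrun gmapping slam_gmapping scan:=scan \
    _minimumScore:=10000 \
    _srr:=0 _srt:=0 _str:=0 _stt:=0.1 \
    _xmin:=-10 _ymin:=-10 _xmax:=10 _ymax:=10
```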
2015-02-06 03:01:55 -0500 | commented answer | Good map with hector_mapping and with odometry-only - poor map with gmapping @AlexR, thanks! I went through that answer, and in particular the reference to this answer seems promising - still testing, but I will report back shortly. |
2015-02-06 01:20:32 -0500 | commented answer | IMU ros orientation @asusrog It is probably best to convert to the ROS orientation conventions at the moment you put the information from your IMU into a ROS IMU msg. If you have full control over the firmware of your IMU, you could also change it there - but this might be harder to maintain. |
2015-02-05 07:15:01 -0500 | received badge | ● Notable Question (source) |
2015-02-03 11:45:37 -0500 | answered a question | IMU ros orientation |
2015-02-03 11:22:23 -0500 | received badge | ● Popular Question (source) |
2015-02-01 16:18:34 -0500 | received badge | ● Student (source) |
2015-02-01 12:40:25 -0500 | received badge | ● Editor (source) |
2015-02-01 12:39:58 -0500 | answered a question | speech recognition and ROS Have a look at this Pi Robot blog entry. However, it is rather outdated, assuming ROS Electric. I highly recommend obtaining the latest copy of ROS By Example - Volume 1, which has a chapter dedicated to this ("9. Speech Recognition and Synthesis") - see the Table of Contents. |
2015-02-01 10:33:09 -0500 | edited question | Good map with hector_mapping and with odometry-only - poor map with gmapping Using an XV11 lidar and calibrated odometry, I am able to build a good-quality map using both hector_mapping (scan results only) and pure odometry (odometry only). However, I am unable to get a good map using gmapping (which combines scan results and odometry). Odometry: My robot has odometry that passes the tests listed in the navigation tuning guide - linear translation is perfect, and rotation error seems reasonable as well. Lidar: It uses an XV11 lidar as laser scanner, with the xv_11_laser_driver package. Bagfile: I've recorded a short bag file with the /odom, /tf and /scan topics, using the following command:
I then compressed the bagfile. Map results: If I transform this bag file into a map using hector_mapping, the result looks perfect: EDIT: Here are links to the hector_mapping generated pgm and yaml files. Moreover, if I simply display the laser scan results in rviz in the /odom frame, with an infinite decay time, I get a very nice map as well: Now, when I try to run this through gmapping, the map looks very poor: EDIT: Here are links to the gmapping generated pgm and yaml files. Commands: To generate the gmapping map, I used: To generate the hector_mapping map, I used: The launch file is available here. A PDF graph of the tf tree is available here. Question: How come gmapping gives such poor results compared to the two other approaches? Any suggestions to improve the gmapping results would be much appreciated. Thank you, Kristof |
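The recording of the three topics mentioned in the question would look roughly like this (the output bag name is an assumption for illustration):

```shell
# Record odometry, tf, and laser scans into a single bag for offline mapping.
rosbag record -O mapping_run /odom /tf /scan
```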