
Kristof's profile - activity

2018-06-13 11:13:02 -0500 received badge  Good Answer (source)
2018-06-13 11:13:02 -0500 received badge  Enlightened (source)
2016-03-07 23:15:12 -0500 received badge  Nice Answer (source)
2016-01-06 00:14:59 -0500 received badge  Necromancer (source)
2015-08-23 14:32:32 -0500 commented answer turtlebot apt-get update has generated an error on minimal.launch

I had the same problem on a cubieboard2 - and rebuilding robot_model from source fixed it indeed. Thanks!

2015-02-15 12:02:25 -0500 answered a question pocketsphinx excessively verbose

Try playing with the logfn parameter:

E.g. from http://stackoverflow.com/questions/17... :

pocketsphinx_continuous -logfn /dev/null

Or write to file:

pocketsphinx_continuous -logfn [file]

In Python (which the ROS pocketsphinx package uses), from https://mattze96.safe-ws.de/blog/?p=640 :

self.asr.set_property('logfn', '/dev/null')
2015-02-13 20:38:46 -0500 received badge  Famous Question (source)
2015-02-13 13:56:28 -0500 received badge  Self-Learner (source)
2015-02-13 13:26:39 -0500 answered a question How to power off Kinect from software

I found that the command

echo '1-1' > /sys/bus/usb/drivers/usb/unbind

works, but it needs to be run twice, with careful timing.

What works reliably for me is the following:

echo '1-1' > /sys/bus/usb/drivers/usb/unbind; sleep 2.5; echo '1-1' > /sys/bus/usb/drivers/usb/unbind

Note that the 2.5 second sleep is crucial - 2 seconds does not work (too fast), nor does 3 seconds (too slow). This was the case on both machines on which I tested. It seems that the sweet spot is when the Kinect's USB hub has been detected, but not yet any of the actual Kinect devices:

[ 3451.492033] usb 1-1: new high-speed USB device number 109 using ehci-pci
[ 3451.624348] usb 1-1: New USB device found, idVendor=0409, idProduct=005a
[ 3451.624351] usb 1-1: New USB device strings: Mfr=0, Product=0, SerialNumber=0
[ 3451.624860] hub 1-1:1.0: USB hub found
[ 3451.624970] hub 1-1:1.0: 3 ports detected

If you have trouble finding the sweet spot, you can simply "brute-force" it, as follows:

for (( i=0 ; i < 10 ; i++)) ; do echo '1-1' > /sys/bus/usb/drivers/usb/unbind; sleep 0.5; done

Finally, to re-enable:

echo '1-1' > /sys/bus/usb/drivers/usb/bind
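
The commands above can be wrapped in small helpers so the timing trick is captured in one place. This is a minimal sketch, assuming the Kinect enumerates on USB port '1-1' (check dmesg or lsusb for your machine; you will typically need root to write to sysfs). The DEV and SYSFS variables and the function names are my own, not standard:

```shell
# Assumed port and sysfs path - adjust for your machine.
DEV="${DEV:-1-1}"
SYSFS="${SYSFS:-/sys/bus/usb/drivers/usb}"

kinect_off() {
    # Unbind twice; the 2.5 s gap targets the window where the USB hub
    # is detected but the actual Kinect devices are not yet re-enumerated.
    echo "$DEV" > "$SYSFS/unbind"
    sleep 2.5
    echo "$DEV" > "$SYSFS/unbind"
}

kinect_on() {
    echo "$DEV" > "$SYSFS/bind"
}
```

With these in your shell (or sourced from a script), `kinect_off` and `kinect_on` can be toggled from the command line or from a robot startup script.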
2015-02-09 12:23:35 -0500 commented answer How to power off Kinect from software

... which is in line with what @MichaelKorn reports.

So it seems to make sense to try disabling the Kinect completely, as that would save about 0.38 A.

2015-02-09 12:23:35 -0500 received badge  Commentator
2015-02-09 12:22:29 -0500 marked best answer How to power off Kinect from software

I have a Kinect connected to the onboard Linux-based PC on my robot. However, in many scenarios I am not using the Kinect, so I would like to be able to turn it off from command line to save power.

I tried the following:

echo '1-1' > /sys/bus/usb/drivers/usb/unbind

This works, in the sense that the power light of the Kinect goes off for a moment, and the USB device is removed. However, the Kinect gets redetected within a second - re-enabling it.

The following results in the same behaviour:

echo '1' > /sys/bus/usb/devices/1-1/remove

Note that I am using a 3.4 kernel, and the following do not work either:

echo '0' > /sys/bus/usb/devices/1-1/power/autosuspend_delay_ms
echo 'auto' > /sys/bus/usb/devices/1-1/power/control

Is there a way to disable the Kinect from software?

Thank you!

2015-02-09 12:22:29 -0500 commented answer How to power off Kinect from software

I just detected a major flaw in my current-measurement setup, and retested: Kinect connected to power supply only: 0.00 A (baseline); connected to power supply and USB: 0.50 A, stabilizing at 0.38 A; Kinect driver running: 0.38 A; Kinect app running: >0.45 A.

2015-02-08 02:12:55 -0500 answered a question arduino robot

The package ros_arduino_bridge is probably what you are looking for if you want to combine Arduino and ROS. See also related blogpost by the author here.

Also, I suggest picking up ROS By Example - Volume 1 as a general introduction to ROS. While not focused on Arduino, it is the best ROS introduction I know of (and, incidentally, it is written by the same author as ros_arduino_bridge).

EDIT: I should also point out that ros_arduino_bridge is basically a ROS driver for Arduino - it allows you to control your Arduino board from within the ROS framework. You will still need an additional, higher-end system to run the actual ROS nodes (Arduino hardware is not sufficiently powerful for that).

2015-02-07 22:45:38 -0500 received badge  Self-Learner (source)
2015-02-07 22:45:38 -0500 received badge  Teacher (source)
2015-02-07 03:34:41 -0500 commented question Good map with hector_mapping and with odometry-only - poor map with gmapping

@jseal interesting, thanks! Did you ever try setting the minimumScore parameter? Setting this to a very high value (see below) works very well for me.

2015-02-07 03:32:32 -0500 commented answer Good map with hector_mapping and with odometry-only - poor map with gmapping

The documentation for minimumScore ("Can avoid jumping pose estimates [...] when using laser scanners with limited range (e.g. 5m).") suggests this might be a common problem with xv11 lidars (5m range). Do other xv11 lidar users have a similar experience with gmapping?

2015-02-06 23:56:27 -0500 received badge  Famous Question (source)
2015-02-06 04:44:46 -0500 answered a question Good map with hector_mapping and with odometry-only - poor map with gmapping

I did a lot of different gmapping experiments - results can be found here.

Summary of what I tried that did NOT help:

  • Decreasing linear_update and angular_update to 0.1 (from default 1.0) (Note: increasing to 2.0 works a little bit better, presumably because this means the laser scans have become more dissimilar before being processed)
  • Increasing particles to 100, 500

Summary of what did help:

I had most success with setting the minimumScore to a very high value, as suggested here, and combining that with setting srr/srt/str to 0, and stt to 0.1 (to reflect rotational error, but near-perfect translational behavior), i.e.:

rosrun gmapping slam_gmapping scan:=scan _delta:=0.1 _maxUrange:=4.99 _xmin:=-5.0 _ymin:=-5.0 _xmax:=5.0 _ymax:=5.0 _particles:=30 _srr:=0 _srt:=0 _str:=0 _stt:=0.1 _minimumScore:=10000

(Note that the maxUrange, and particles values are the defaults; the xmin/ymin/xmax/ymax just make for a smaller map, by specifying much lower starting values than the default 100m)

The result is this (pgm): [map image]

Note that results change across runs - sometimes they would be slightly worse. In particular, the "_srr:=0 _srt:=0 _str:=0 _stt:=0.1" settings do not seem very significant - with all of these set to 0, and particles set to 1, that is with:

rosrun gmapping slam_gmapping scan:=scan _delta:=0.1 _maxUrange:=4.99 _xmin:=-5.0 _ymin:=-5.0 _xmax:=5.0 _ymax:=5.0 _particles:=1 _srr:=0 _srt:=0 _str:=0 _stt:=0 _minimumScore:=10000

I get this (pgm):

[map image]

Slightly different, but well within the variance I get by running the previous command.

Lastly, by simply using the defaults for srr/srt/str/stt, i.e.:

rosrun gmapping slam_gmapping scan:=scan _delta:=0.1 _maxUrange:=4.99 _xmin:=-5.0 _ymin:=-5.0 _xmax:=5.0 _ymax:=5.0 _minimumScore:=10000

I get this (pgm):

[map image]

Again slightly different, but well in line with what I get by running the first command.

So in summary, changing minimumScore to a very large value solves my problem (I tried 500, 750, 1000, and 10 000, with 10 000 working best).

I am still open to better solutions though, as this feels a bit like a hack - it disables a large part of gmapping's 'intelligence'.
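
To keep runs comparable while experimenting, the tuned parameters can be collected once and reused. A minimal sketch, assuming gmapping is installed and a /scan topic (or bag playback) is available; `GMAPPING_ARGS` and `run_gmapping` are hypothetical names of my own:

```shell
# Tuned gmapping settings from the experiments above, in one place.
# maxUrange is just under the xv11's ~5 m range; minimumScore=10000
# is the value that worked best for me.
GMAPPING_ARGS="scan:=scan _delta:=0.1 _maxUrange:=4.99 \
_xmin:=-5.0 _ymin:=-5.0 _xmax:=5.0 _ymax:=5.0 _minimumScore:=10000"

run_gmapping() {
    # Unquoted on purpose: the string must word-split into separate args.
    rosrun gmapping slam_gmapping $GMAPPING_ARGS "$@"
}
```

Then e.g. `run_gmapping _particles:=1` overrides a single parameter while keeping the rest fixed.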

2015-02-06 03:01:55 -0500 commented answer Good map with hector_mapping and with odometry-only - poor map with gmapping

@AlexR, thanks I went through that answer, and in particular the reference to this answer seems promising - still testing, but will report back shortly.

2015-02-06 01:20:32 -0500 commented answer IMU ros orientation

@asusrog It is probably best to convert to the ROS orientations at the moment you put the information from your imu into a ROS IMU msg. If you have full control over the firmware of your imu, you could also change it there - but this might be harder to maintain.

2015-02-05 07:15:01 -0500 received badge  Notable Question (source)
2015-02-03 11:45:37 -0500 answered a question IMU ros orientation

As for the ROS axis convention, have a look at REP 103. Moreover, ROS has defined the IMU msg format to hold data from an IMU.

2015-02-03 11:22:23 -0500 received badge  Popular Question (source)
2015-02-01 16:18:34 -0500 received badge  Student (source)
2015-02-01 12:40:25 -0500 received badge  Editor (source)
2015-02-01 12:39:58 -0500 answered a question speech recognition and ROS

Have a look at this Pi Robot blog entry.

However, it is rather outdated (it assumes ROS Electric).

I highly recommend obtaining the latest copy of ROS by Example - Volume 1, which has a chapter dedicated to this ("9. Speech Recognition and Synthesis") - see Table of Contents.

2015-02-01 10:33:09 -0500 edited question Good map with hector_mapping and with odometry-only - poor map with gmapping

Using an XV11 lidar and calibrated odometry, I am able to build a good quality map using both hector_mapping (using scan results only) and pure odometry (using odometry only). However, I am unable to get a good map using gmapping (which combines scan results and odometry).

Odometry:

My robot has odometry that passes the tests listed in the navigation tuning guide - linear translation is perfect, rotation error seems to be reasonable as well.

Lidar:

Additionally, it is using an XV11 lidar as laser scanner, with the xv_11_laser_driver package.

Bagfile:

I've recorded a short bag file with the /odom, /tf and /scan topics, using the following command:

rosbag record -O second_data /scan /tf /odom

I then compressed the bagfile (rosbag compress second_data.bag), and put it available here.

Map results:

If I transform this bag file into a map using hector_mapping, the result looks perfect: [map image]

EDIT: Here are links to the hector_mapping generated pgm and yaml files.

Moreover, if I simply display the laserscan results in rviz in the /odom frame, with an infinite decay time, I get a very nice map as well: [map image]

Now, when I try to run this through gmapping, the map looks very poor: [map image]

EDIT: Here are links to the gmapping generated pgm and yaml files.

Commands:

To generate the gmapping map, I used:

rosparam set use_sim_time true
rosparam set slam_gmapping/delta 0.1  #set resolution to 0.1
rosrun gmapping slam_gmapping scan:=scan
rosbag play second_data.bag --clock
rosrun map_server map_saver

To generate the hector_mapping map, I used:

rosparam set use_sim_time true
roslaunch hector_mapping mapping_default.launch
rosbag play second_data.bag --clock
rosrun map_server map_saver

Launch file is available here.

PDF graph of tf tree is available here.

Question:

How come gmapping gives such poor results, compared to the two other approaches? Any suggestions to improve the gmapping results would be much appreciated.

Thank you,

Kristof