
davo's profile - activity

2018-09-13 13:52:55 -0500 received badge  Taxonomist
2014-01-28 17:22:44 -0500 marked best answer gmapping slam map shrunk

https://docs.google.com/drawings/d/1e4BW-T9eKJRz9DEIlpouFsAMtXd5bdWstPRDinUFEM0/edit?hl=en_US

My maps are shrunk... :( I've been working on this for a while now, over a week.

This is fairly typical of what is happening, except this time the whole map is shrunk. It starts off OK and the first few metres look great; then there's a scan match and a portion, usually the top north-west end of the corridor, shrinks from almost 2.70 m to less than 2.0 m, and the whole map starts to change.

I've been playing with the SLAM options, not much joy.

I posted a screen grab on Google Docs, as not only is my flat shrinking, I have no karma either... what a day!

I'm using cturtle on ARM Lucid, and a base PC under Maverick.

I have a small issue with magnetic influences as I'm using a compass, but as you can see from the scans there are fairly straight lines along the corridor, by hook or by crook; the compass error is less than ±5 degrees.

Odom looks OK; the error is about 10 cm over 8.3 m, doing the trig on the broadcast odom.
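
By way of illustration, the trig on the broadcast odom boils down to a straight Euclidean check; a minimal sketch with made-up endpoint numbers (the actual odom values aren't in this post):

```python
import math

def odom_error(expected, actual):
    """Euclidean distance between the expected endpoint of a run
    and the endpoint reported by the broadcast odom, as (x, y)."""
    return math.hypot(actual[0] - expected[0], actual[1] - expected[1])

# Hypothetical run: commanded 8.3 m straight down the corridor,
# odom reporting a slightly short and offset endpoint.
print(round(odom_error((8.30, 0.00), (8.24, 0.08)), 3))  # 0.1
```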

The laser (umm, sorry, a PML: a Sharp 20-150 cm IR range finder on a 180-degree, 1-degree-step servo) data looks reasonable to me. I'm scanning every 40 to 50 cm from a halted position.

I think there may be some issues with the laser going into and out of shadows, which seems to cause a sweep at the longer ranges of a wall at an acute angle.

The Sharp is fairly accurate: about ±1 cm below 100 cm, and about ±2.5 cm from 1.0 to 1.5 m but often better; of course it depends on the surfaces, angles, etc. That's with filtering and curve fitting, not the typical linearisation technique.
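
For reference, a minimal sketch of the curve-fitting alternative to the usual 1/(v+k) linearisation: fit a power law d = a·v^b to calibration points in log-log space. The calibration values below are made up to look like a Sharp GP2-style response; real numbers would come from calibrating the actual sensor.

```python
import math

def fit_power_law(voltages, distances):
    """Least-squares fit of d = a * v**b in log-log space."""
    xs = [math.log(v) for v in voltages]
    ys = [math.log(d) for d in distances]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = math.exp((sy - b * sx) / n)
    return a, b

# Synthetic calibration points (voltage falls as distance grows).
cal = [(2.5, 20.0), (1.4, 40.0), (0.75, 80.0), (0.4, 150.0)]
a, b = fit_power_law([v for v, _ in cal], [d for _, d in cal])
print(b < 0)  # True: distance decreases as voltage rises
```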

The features are OK in the undecayed scan points.

I've tried lots of SLAM options; I'm not sure how to guesstimate stt, str, srt, srr, or actually what they are.
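
For what it's worth, srr/srt/str/stt are gmapping's odometry motion-model noise terms: srr is translation error per unit of translation, srt is translation error per unit of rotation, str is rotation error per unit of translation, and stt is rotation error per unit of rotation. A rough sketch of how such terms are conventionally combined for one odometry step (this is the usual linear noise model, not necessarily gmapping's exact internals; the numbers are the posted values):

```python
def expected_odom_error(d_trans, d_rot, srr, srt, str_, stt):
    """Rough std-dev of translation and rotation error for one
    odometry step of d_trans metres and d_rot radians, using the
    conventional linear odometry noise model."""
    sigma_trans = srr * abs(d_trans) + srt * abs(d_rot)
    sigma_rot = str_ * abs(d_trans) + stt * abs(d_rot)
    return sigma_trans, sigma_rot

# With srr=0.1, srt=0.2, str=0.1, stt=0.2, a 0.5 m straight step
# (no rotation) gives roughly 5 cm / 0.05 rad of expected noise.
print(expected_odom_error(0.5, 0.0, 0.1, 0.2, 0.1, 0.2))  # (0.05, 0.05)
```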

output of gmapping slam

 -maxUrange 1.5 -maxUrange 1.5 -sigma     0.05 -kernelSize 1 -lstep 0.05 -lobsGain 3 -astep 0.05
 -srr 0.1 -srt 0.2 -str 0.1 -stt 0.2
 -linearUpdate 0.02 -angularUpdate 0.5 -resampleThreshold 0.5
 -xmin -25 -xmax 25 -ymin -100 -ymax 25 -delta 0.05 -particles 120
[ INFO] [1314266203.037209866, 1314264197.070439983]: Initialization complete
update frame 0
update ld=0 ad=0
Laser Pose= -0.00781462 0.0184101 1.97222
m_count 0
Registering First Scan
update frame 1
update ld=0.539888 ad=0.174533
Laser Pose= -0.118015 0.546932 1.79769
m_count 1
Average Scan Matching Score=89.0036
neff= 118.987
Registering Scans:Done
update frame 2
update ld=0.519974 ad=0.0174533
Laser Pose= -0.248544 1.05026 1.81514
m_count 2
Average Scan ...
2014-01-28 17:22:00 -0500 marked best answer python tutorials for robot setup tf and sensors

Hi,

Very new to ROS... please excuse me.

Core server (navigation, visualisation, etc.): Maverick 2.6.35-28-generic, ros-cturtle-core (Synaptic version) 1.0.0-s1302849017~maverick

Robot node: Ubuntu 10.04, architecture armel (BeagleBoard-xM), ROS cturtle compiled on board from SVN

Outline: I was planning to use ROS via Python to control and read data from my personal Arduino robot. My Arduino is working well, so I don't really want to start integrating others' libraries into it at the moment.

Progress: I'm working through the tutorials and most things work; the transform listener/broadcaster mostly works, but the introductions to time fail.

Now I'm trying to work through the C/C++ tutorials for robot setup and the laser sensor.

http://www.ros.org/wiki/navigation/Tu...

http://www.ros.org/wiki/navigation/Tu...

Issue

I really want to work in Python if I can. Are there tutorials for these, or do you have to set up in C and then use Python in a restricted manner?

Thanks

Dave

2014-01-23 03:19:07 -0500 marked best answer Basic Question Mobile Robot cmd_vel

I'm on cturtle; the base is Maverick and the robot node is ARM Lucid with cturtle, with an Arduino as the actual robot controller.

My physical robot is a small, cheap 4-wheel drive using skid steering.

I'm limited physically, so my mobile robot is in the apartment, which is not small but tricky, with a bad floor surface for odometry and some areas of magnetic variance... which I've got managed to a point. And some pretty tight doorways.

The robot has a compass, odometers, and a Sharp IR. I am using a crude gyro setup to detect when the compass reading changes but, according to the gyro, the robot is not actually turning.

But the upshot is that what works best is driving to a bearing and distance in straight lines, then stopping and skid-turning to a new bearing.

So I have my robot programmed from the Arduino to follow waypoints. The surface magnetics and motors/odometry are not up to variable-speed turns on the go... I tried and lost that one.

From a ROS perspective I've got odom and laser scan working, and I'm sending the fixed transform base_link -> base_scan.

So, manually driving the Arduino around, things look quite good in rviz.

I haven't started mapping yet as I'm following the tutorials carefully.

I'm just writing (plagiarising) a joystick teleop package and starting to hit some fundamentals. For teleop I can interface quite easily to the BeagleBoard/Arduino, but I'm concerned I'm on the wrong path regarding move_base, cmd_vel, and the base controller.

Could I please just make sure I understand the output of cmd_vel...

x and z: I'm taking it that x is velocity and z is the rate of angle change; or is x distance and z bearing?

If the former (which is what I fear, velocities) is correct, how do I convert/manage this alongside the waypoint data in my work without screwing up the nav stack? The Arduino side of my robot is not dumb and has taken quite a lot of work to overcome some issues.
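
For what it's worth, cmd_vel is a geometry_msgs/Twist: linear.x is forward velocity (m/s) and angular.z is yaw rate (rad/s), so it is the former. A minimal sketch of turning that pair into left/right skid-steer wheel speeds (the track width is a made-up number; the Arduino-side waypoint logic would sit on top of something like this):

```python
def twist_to_wheel_speeds(linear_x, angular_z, track_width=0.25):
    """Convert a Twist-style (v, omega) command into left/right
    wheel speeds for a differential/skid-steer base.
    track_width is the distance between wheel centrelines (m)."""
    v_left = linear_x - angular_z * track_width / 2.0
    v_right = linear_x + angular_z * track_width / 2.0
    return v_left, v_right

# Drive straight at 0.2 m/s: both wheels equal.
print(twist_to_wheel_speeds(0.2, 0.0))   # (0.2, 0.2)
# Spin in place at 1 rad/s: wheels equal and opposite.
print(twist_to_wheel_speeds(0.0, 1.0))   # (-0.125, 0.125)
```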

Sorry to ask this one, I've been searching for a while.

Dave

2013-05-15 01:03:24 -0500 commented question Microsoft Kinect not connected

Groovy installed 12/5/2012 using ros-groovy-open_camera and launch; same issue. Tried the serial number etc. I did notice that lsusb did not list all 3 devices (camera, motor, audio) and no serial number, so reconnect the USB, run as root or sudo rmmod -f gspca_kinect. Thanks very much

2012-11-14 07:52:59 -0500 received badge  Famous Question (source)
2012-09-17 17:05:34 -0500 received badge  Notable Question (source)
2012-09-17 17:05:34 -0500 received badge  Popular Question (source)
2012-08-24 09:55:42 -0500 received badge  Famous Question (source)
2012-08-24 09:55:42 -0500 received badge  Notable Question (source)
2012-08-23 23:48:18 -0500 received badge  Famous Question (source)
2012-08-23 23:48:18 -0500 received badge  Notable Question (source)
2012-08-20 22:08:50 -0500 received badge  Famous Question (source)
2012-08-03 09:02:48 -0500 received badge  Famous Question (source)
2012-07-27 04:31:29 -0500 received badge  Notable Question (source)
2012-06-18 23:51:32 -0500 asked a question any issues with amd bulldozer?

Hi, I really did not want to ask this question, but the last thing I want at the moment is to get another setback.

I am running a small mobile robot with a Kinect. I'm moving as much processing off-board as I can.

The previous desktop was a bit challenged, a lower-end Athlon II.

My desktop blew up, so I'm quickly acquiring a new one. The options are an Ivy Bridge i5-3450 or an AMD FX-8120; the Linux benchmarks seem to favour the 8120, unlike Windows. However, the Intel is a safe bet! The AMD offers some potential, but at a higher risk.

My questions are:

  • Is anyone having issues with AMD Bulldozer processors installing, compiling, or running ROS?
  • Performance?

I am aware that the 8120 has a shared FPU on each module, and I am using mixed float sizes (I am using Python); that the use or lack of AMD-specific compiler optimisations could affect performance and results; and of the lack of maturity and testing within ROS.

Thanks and again sorry to ask

Dave

2012-03-29 07:22:28 -0500 received badge  Notable Question (source)
2012-03-20 19:33:53 -0500 received badge  Popular Question (source)
2012-02-10 11:37:00 -0500 received badge  Popular Question (source)
2012-02-03 01:54:37 -0500 received badge  Popular Question (source)
2012-01-25 22:55:16 -0500 received badge  Popular Question (source)
2011-11-22 05:22:54 -0500 marked best answer Easier way to launch nodes and roslaunch

I think a good way for you to do this is to set up some bash scripts that execute roslaunch in screen.

One way I know of doing this is to have a script that starts up some essentials (maybe roscore and your core drivers).

So your main bash script would include something like:

    screen -d -m -S roscore ~/scripts/start-roscore.sh
    screen -d -m -S robotcore ~/scripts/start-robot.sh

And the start-roscore.sh script could look like:

    #!/bin/bash
    source /opt/ros/electric/setup.sh
    export ROS_PACKAGE_PATH=~/where-you-keep-stacks:$ROS_PACKAGE_PATH
    roscore

And the start-robot would look like:

    #!/bin/bash
    source /opt/ros/electric/setup.sh
    export ROS_PACKAGE_PATH=~/where-you-keep-stacks:$ROS_PACKAGE_PATH
    roslaunch {packagename} {launch file}

Then these will be persistent inside of screens. To get a list of the available screens, type:

    screen -ls

To attach to a screen (for instance robotcore):

    screen -r robotcore

Then to keep it running, but detach from the session, you can press:

    Ctrl-a, then d

If you could configure the main bash script to start from clicks, it wouldn't matter if it exited right away, as those screens will still be running.

I hope this is what you needed.

2011-11-20 20:37:58 -0500 received badge  Nice Question (source)
2011-11-20 20:34:28 -0500 commented answer map_saver doesn't return from "Waiting for the map"
Hi, I was doing this on a custom bot. Start off with the basics: time sync! Ensure all your nodes/topics/tfs are named properly, especially the laser scan. Follow the diagnostics for tf; I was dropping odom while stationary / wireless range / USB bandwidth. The tf diagnostics/troubleshooting guides are very good; check the logs... good luck
2011-11-20 16:23:48 -0500 commented answer Easier way to launch nodes and roslaunch
I moved to another machine recently, and fresh... no more YAML errors. It looks like Python was well stuffed; help('modules') was chucking lots of errors. Using your scripts as a framework I'm making good progress. Cracking answer, wish I'd got this much earlier. Thanks heaps!!! Dave
2011-11-20 16:17:36 -0500 received badge  Supporter (source)
2011-11-20 13:36:00 -0500 commented answer Easier way to launch nodes and roslaunch
Chad, thanks for the prompt answer. I got screens working for some of the simple scripts, but I'm failing on some of the more complex ones, getting Python traces from rosh, roslaunch, etc. It seems to be a problem with yaml... or PyYAML, so I'm looking at that as we speak. I found Terminator as well... I like that! Just keeping down the mouse clicks etc.
2011-11-20 13:36:00 -0500 commented answer Easier way to launch nodes and roslaunch
Whoops, duplicates... sorry
2011-11-20 11:06:09 -0500 asked a question Easier way to launch nodes and roslaunch

Hiya, I have quite a disability and it's getting worse; I can only spend very short periods at the computer before I get bricked. I seem to be spending what good time I have typing typos into consoles and then trying to catch them... so I have to make my life easier.

I'm trying to keep my life easy and use Python as much as possible, as C etc. blows my mind as I get more affected.

I'd like to reuse my packages, nodes, and launch files inside a shell or GUI menu, and reorder/select/deselect them as I work through my ROS issues... I suppose a selective, ordered rosrun blah blah and roslaunch... or a simple text file.

I've tried a few things (bombing out to shells, Ubuntu desktop launchers, Python scripts) and I'm getting mixed results.

Simple nodes seem OK, like static transforms, but with more complex things the roslaunch node/topic gets lost or orphaned somewhere, is unreliable, or flags a "can't find roslib" issue... but it works directly from the command line.

I've been using source /opt/ros/.../setup.sh etc. for the shells, and #! etc., and I can see the env settings.

I do not need to automate everything, but just automating or point-and-clicking a bit more would make my life so much easier.

I am just having a look at rosh, and quite like the ease of scripting, but I'm not sure where this is leading me. Another option I have seen is the testing/unit-test modules, but again, where am I going?

I've got some PySide button-click stuff working as well, but again only simple nodes.

So if anyone could point me in the right direction I would much appreciate it! I'll be using voice commands soon, but I'm not sure how to roslaunch stereo processing etc. from the command line.

Thanks

Dave

2011-09-15 14:03:09 -0500 commented answer vslam, Running on Your Own Data
I've hacked the pcl_to_scan for a stereo webcam and got scan msgs; now I need to align the range points with the Minoru FOV. It's not very stable yet, a PC resource issue I think. Using the elektron_kin pkg; need to watch the tfs and the filter. OpenNI Kinect z is height; Minoru stereo z is depth.
2011-09-13 02:34:33 -0500 commented answer vslam, Running on Your Own Data
Jose, as you have just done a custom conversion, could you point me in the right direction? I'm going around in circles as I'm not using a Kinect, and the tf nodelets etc. are just confusing me! Dave
2011-09-13 02:31:57 -0500 commented answer vslam, Running on Your Own Data
LaserScan is a msg with ranges, start/finish/increment angles, and a time stamp; it has no pose, but a reference to a parent tf frame. You need odom and a correct tf tree for a range sensor to work with gmapping SLAM and nav. I haven't nailed point clouds down yet but expect similar. I was hoping vslam would help with poor odometry, alas!
2011-09-04 01:05:02 -0500 answered a question gmapping slam map shrunk

I got sidetracked after realising that an IR distance sensor is not man enough for the job, so I'm looking at stereo webcams and pointcloud-to-laserscan conversions to overcome the lack of sensor data; that also unties me from the Arduino for LaserScan messages.

CoreSLAM does seem to be better with the PML, and it's highlighted the areas of magnetic variation. I am hoping, with a camera setup, to be able to fuse the odometry a bit better, either via improved odometry or some form of feature recognition, even if I have to put a coloured label on the wall. The areas of magnetic variation are 10 degrees in 1 cm, so it needs to be precise.
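
A sketch of the kind of position-indexed heading correction this implies: look up a surveyed magnetic variation on a fine grid and subtract it from the raw compass reading. The grid keys and the 10-degree kick below are made-up illustration values; a real map would come from surveying the flat.

```python
def corrected_heading(raw_heading_deg, x_cm, y_cm, variation_map):
    """Subtract a surveyed magnetic variation (degrees) from a raw
    compass heading, looked up on a 1 cm grid keyed by (x_cm, y_cm).
    Cells not in the map are assumed to have zero variation."""
    return (raw_heading_deg - variation_map.get((x_cm, y_cm), 0.0)) % 360.0

# Hypothetical survey: a 10-degree kick in one 1 cm cell.
survey = {(100, 0): 10.0}
print(corrected_heading(95.0, 100, 0, survey))  # 85.0
print(corrected_heading(95.0, 150, 0, survey))  # 95.0
```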

Dave

2011-09-04 00:17:40 -0500 answered a question vslam, Running on Your Own Data

Whoops, you can get a point cloud from stereo webcams via stereo_image_proc.

I'm just working my way through the topics and trying to set them up...

Dave

2011-09-03 21:31:18 -0500 answered a question vslam, Running on Your Own Data

Thank you for the question and answer; I've been going around in circles for some time on the same issues.

It seems that many are on the same path. I'm having trouble piecing together the stages required to get from video to gmapping.

video -> point cloud -> laser scan

You need to create point clouds from the video stream, and then convert the pointcloud to a laserscan message type for gmapping SLAM.

I know that vslam creates a point cloud, and that, e.g., for the Kinect there is a nodelet that works with a frame throttler to convert point_cloud to laser_scan; there seem to be some variations.

In the elektron stack/packages there is a variant launch file with filters for depth (range) etc.

There seems to be a fair amount of remapping, and it does not make much sense yet...
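
Independent of the nodelet plumbing, a rough sketch of what the pointcloud-to-laserscan step boils down to: keep points within a height band, bin them by bearing angle, and take the closest range in each bin. The bin count, height band, and axis convention (x forward, y left, z up) here are illustration assumptions, not the nodelet's actual defaults.

```python
import math

def cloud_to_scan(points, angle_min=-math.pi / 2, angle_max=math.pi / 2,
                  n_bins=180, z_min=-0.1, z_max=0.1):
    """Flatten a 3D point cloud (x forward, y left, z up) into a
    planar scan: min range per angular bin, math.inf where empty."""
    inc = (angle_max - angle_min) / n_bins
    ranges = [math.inf] * n_bins
    for x, y, z in points:
        if not (z_min <= z <= z_max):
            continue  # outside the height slice, ignore
        ang = math.atan2(y, x)
        if not (angle_min <= ang < angle_max):
            continue  # outside the scan's field of view
        i = int((ang - angle_min) / inc)
        ranges[i] = min(ranges[i], math.hypot(x, y))
    return ranges

# One point roughly 1 m dead ahead, one above the height band (ignored).
scan = cloud_to_scan([(1.0, 0.01, 0.0), (0.5, 0.0, 0.5)])
print(round(scan[90], 2))  # 1.0
```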

Hope that helps a bit

Dave

2011-08-26 06:20:19 -0500 answered a question gmapping slam map shrunk

I upgraded a system to Diamondback and have been testing CoreSLAM, for which somewhere there was a hint that it may be more suitable for the "PML". The first half of the map is looking good after tweaking/reducing the theta and sigma. I'm using bag files.

The error in the second half may also be due to the sweeps, which may be magnetic variance, so I'll try to adjust for that in the odom as well.

Tomorrow, when I get myself together, I'll run the robot again with short intervals between sweeps.

Thanks

Dave

2011-08-25 23:51:04 -0500 answered a question gmapping slam map shrunk

Thanks...

I'll give that a go. I've got a stereo cam on order, but it would be nice to get the "PML" mapping working, just as a win.

Dave