
Tadhg Fitzgerald's profile - activity

2021-07-27 10:35:44 -0500 received badge  Taxonomist
2015-11-26 03:03:54 -0500 received badge  Famous Question (source)
2015-10-16 22:00:29 -0500 received badge  Famous Question (source)
2014-12-18 11:24:21 -0500 received badge  Nice Question (source)
2014-03-18 07:46:14 -0500 received badge  Notable Question (source)
2014-03-18 07:46:14 -0500 received badge  Popular Question (source)
2014-01-28 17:29:16 -0500 marked best answer What's the best way of dealing with the Kinect's "blind spot"?

I'm wondering what the best way is of dealing with the minimum scan distance limitation of the Kinect (i.e., within ~0.6m it can't detect objects). Specifically, I'm attempting to use the explore and slam_gmapping packages to search a room. This works when the robot starts in the center of the room (or at least more than 0.6m from a wall). However, if the robot starts in a hallway where the walls are closer, the map, and by extension the exploration goals, get completely messed up.

I realise this can't be fixed as it's a hardware limitation, but what are good parameter values to minimise the effect on the map/navigation? I've found that setting the slam_gmapping param "maxUrange" to 6.0 helps; are there any others I should look at?

As a follow-up question, I'm using a LEGO Mindstorms NXT kit for the robot. At the moment I'm just using the two motors that come with the kit as a base; would it be possible to add the sonar sensor to compensate for the Kinect's blind spot in some way?
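For reference, a launch-file fragment setting the slam_gmapping parameter mentioned above might look like this (maxUrange is the documented parameter name; the other values shown are illustrative assumptions, not tuned recommendations):

```xml
<node pkg="gmapping" type="slam_gmapping" name="slam_gmapping">
  <!-- Ignore beams beyond 6 m, as suggested above -->
  <param name="maxUrange" value="6.0"/>
  <!-- Illustrative extras: map resolution (m/cell) and filter particle count -->
  <param name="delta" value="0.05"/>
  <param name="particles" value="30"/>
</node>
```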

2014-01-26 23:51:44 -0500 received badge  Notable Question (source)
2013-06-21 04:38:06 -0500 received badge  Famous Question (source)
2013-04-26 02:20:18 -0500 received badge  Famous Question (source)
2013-04-08 02:08:46 -0500 received badge  Popular Question (source)
2013-04-06 12:49:23 -0500 received badge  Notable Question (source)
2013-04-05 08:23:06 -0500 received badge  Popular Question (source)
2013-04-03 06:44:22 -0500 received badge  Famous Question (source)
2013-03-21 17:06:00 -0500 received badge  Notable Question (source)
2013-03-20 21:05:56 -0500 received badge  Popular Question (source)
2013-03-19 12:41:47 -0500 asked a question How do you cancel a navigation goal?

I'm trying to cancel a move_base goal sent by the explore node. SimpleActionClient.cancel_goal() doesn't seem to work at all, and SimpleActionClient.cancel_all_goals() only works intermittently. I've included the code for my goal cancellation node below:

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseActionGoal

class cancelGoal():
    def __init__(self):
        self.node_name = "cancel_goal"

        rospy.init_node(self.node_name)

        self.explore_goal_sub = rospy.Subscriber("/move_base/goal", 
                                    MoveBaseActionGoal, self.newGoalHandler)
        self.client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
        rospy.loginfo("Waiting for move_base server.....")       
        self.client.wait_for_server()

    def newGoalHandler(self, new_goal):
        # Note: cancel_goal() only cancels a goal sent by *this* client,
        # which is why it appears to do nothing here -- the active goal
        # was sent by the explore node's own client.
        self.client.cancel_goal()
        self.client.cancel_all_goals() # Try both
        rospy.loginfo("Goal cancelled")
2013-03-19 08:16:04 -0500 commented question Groovy Python node cannot find move_base_msgs module

Sorry I can't help. ROS by example is a great book though, just wanted to say thanks, helped a lot with my final year project:)

2013-03-19 05:55:57 -0500 asked a question Overriding or setting precedence for a navigation goal?

If I have two different nodes sending MoveBaseActionGoal messages to a move_base node, how do I specify which should get precedence? Is it possible to override one goal with another?

2013-03-19 02:12:25 -0500 answered a question Any great affiliate plan to use my e-mail advertising scheme

Where's the spam button?

2013-03-19 02:10:41 -0500 received badge  Critic (source)
2013-03-18 21:47:59 -0500 asked a question Altering exploration goals on the fly.

I'm making a search robot that uses the Explore stack to send goals to move_base. My question is this: is there a way of intercepting these goals and occasionally changing or dropping them? Let me explain: as my robot moves about, it's seeing things. 95% of the time it sees things that are of no interest to it, in which case the explore node's MoveBaseActionGoal should just be passed on as-is. When the robot sees something of interest, I would like some way of pausing to examine it (drop the goal) or moving towards it for a closer look (change the goal). I assume what I need is some kind of interceptor node which subscribes to the MoveBaseActionGoals from explore and then relays them on to move_base, but when I try this I get the warning "[ WARN] [1363678197.929778870]: Waiting to connect to move_base server" from the explore node. Any ideas?
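A relay of this kind can be sketched with the filtering logic kept separate from the ROS wiring; everything here (class name, predicate, the idea of injecting the publisher) is an assumption for illustration, not part of the explore stack:

```python
class GoalRelay:
    """Forwards exploration goals unless a caller-supplied predicate
    flags the current goal as interesting."""

    def __init__(self, publish_fn, is_interesting):
        self.publish = publish_fn          # e.g. a rospy.Publisher(...).publish
        self.is_interesting = is_interesting

    def handle_goal(self, goal):
        # Drop the goal so the robot pauses to examine the object;
        # a replacement goal could be published here instead.
        if self.is_interesting(goal):
            return False
        self.publish(goal)                 # pass the goal on unchanged
        return True
```

In a node you would wire handle_goal as the callback of a Subscriber on explore's goal topic and publish_fn as a Publisher feeding move_base. Note that the warning you see suggests explore talks to move_base over the actionlib interface rather than a bare topic, so an interceptor would likely also need to remap or re-expose that action namespace.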

2013-03-01 03:32:28 -0500 received badge  Famous Question (source)
2013-02-21 00:13:14 -0500 received badge  Famous Question (source)
2013-02-20 09:42:09 -0500 received badge  Good Question (source)
2013-02-20 04:29:13 -0500 received badge  Notable Question (source)
2013-02-20 04:25:56 -0500 commented answer Lower Kinect resolution for RGB/Camera using openni_launch

Thank you very much for the in depth answer!

2013-02-20 04:25:21 -0500 received badge  Scholar (source)
2013-02-19 21:45:43 -0500 received badge  Popular Question (source)
2013-02-19 11:27:52 -0500 asked a question Lower Kinect resolution for RGB/Camera using openni_launch

Is there any way of lowering the Kinect RGB image resolution from 640x480 to 320x240 when using the openni_launch package? I know it can be done in the openni_camera launch file by changing the image mode, but I can't use that because, for some reason, depth doesn't show when I do. The reason I need to lower it is that I'm using cv_bridge to convert to OpenCV, but at the higher resolution this part runs unacceptably slowly.
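One workaround, assuming the driver stays at 640x480, is to downsample each frame yourself after cv_bridge hands you the array and before doing any heavy OpenCV work. A minimal sketch:

```python
import numpy as np

def downsample_half(img):
    """Naive 2x downsample: keep every other row and column.
    Fine for a quick speed-up; cv2.resize or cv2.pyrDown give
    better quality if aliasing matters."""
    return img[::2, ::2]
```

Applied to the 640x480 frames from cv_bridge this yields 320x240 arrays, quartering the pixels the OpenCV code has to touch.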

2013-02-19 10:22:46 -0500 commented answer What's the best way of dealing with the Kinect's "blind spot"?

Great thank you, I'll have a snoop about in those:)

2013-02-19 05:22:02 -0500 received badge  Nice Question (source)
2013-02-18 23:44:21 -0500 received badge  Notable Question (source)
2013-02-18 23:29:11 -0500 received badge  Teacher (source)
2013-02-18 21:17:19 -0500 edited answer read cmd_vel commands from a text file

I haven't worked with turtlebot or gazebo, but the answer is almost certainly yes. The basic idea is to read your text file line by line as you normally would in Python/C++, split each line into the values you want, then publish these programmatically as Twist messages, as explained in this tutorial: http://pharos.ece.utexas.edu/wiki/index.php/Writing_A_Simple_Node_that_Moves_the_iRobot_Create_Robot#Create_the_Program. You should do all of this work inside a node which you start after your robot has started. Hope this helps.
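The line-parsing half of that is plain Python and easy to sketch. The file format here, one "linear_x angular_z" pair per line, is an assumption; adapt the split to whatever your file actually contains:

```python
def parse_cmd_vel_line(line):
    """Parse one 'linear_x angular_z' text line into a pair of floats.
    Returns None for blank lines and '#' comments."""
    line = line.strip()
    if not line or line.startswith('#'):
        return None
    linear_x, angular_z = (float(v) for v in line.split()[:2])
    return linear_x, angular_z
```

Each parsed pair would then be copied into a geometry_msgs/Twist (twist.linear.x and twist.angular.z) and published on cmd_vel at a fixed rate inside your node.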