2020-03-03 14:02:08 -0500 | received badge | ● Famous Question (source) |
2019-06-13 11:18:07 -0500 | received badge | ● Notable Question (source) |
2019-06-13 11:18:07 -0500 | received badge | ● Famous Question (source) |
2017-04-13 11:52:20 -0500 | received badge | ● Nice Question (source) |
2017-04-13 11:52:13 -0500 | marked best answer | Smooth motion between waypoints in ROS navigation Hello all, I have an application where I am using the turtlebot simulator with move_base for some navigation-related tasks. Generally, if a goal is specified, I assume the nav stack fits a spline through a large set of points that, when followed, takes the robot to the goal. If I have my own (sufficiently large) set of waypoints, is there a way to avoid stopping at every waypoint and instead let the robot drive through them as a smooth curve? Thanks, Sai |
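One common workaround, sketched below with a hypothetical helper (this is not move_base's default behavior): keep feeding goals, but hand off to the next waypoint as soon as the robot enters a tolerance radius of the current one, so the local planner never has to decelerate to a full stop.

```python
import math

def next_goal_index(pose_xy, waypoints, idx, handoff_radius):
    """Advance to the next waypoint as soon as the robot is within
    handoff_radius of the current one, so it never has to stop.
    pose_xy: (x, y) of the robot; waypoints: list of (x, y) goals."""
    wx, wy = waypoints[idx]
    close = math.hypot(pose_xy[0] - wx, pose_xy[1] - wy) < handoff_radius
    if close and idx + 1 < len(waypoints):
        return idx + 1
    return idx
```

Each control cycle you would send the waypoint at the returned index as the current move_base goal instead of waiting for the goal-reached result.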
2017-02-07 19:57:12 -0500 | received badge | ● Great Answer (source) |
2016-07-22 15:35:30 -0500 | received badge | ● Famous Question (source) |
2016-04-27 01:40:09 -0500 | marked best answer | iRobot Create + Hokuyo LIDAR + YAML map + AMCL Hello, I have a PGM/YAML map generated by hector SLAM from a Hokuyo LIDAR driven around on an iRobot Create. I was trying to use AMCL for localization and simple 2D nav goals, but it never publishes any pose updates. The brown pkg provides the /odom and /tf, and I use map_server to publish the /map topic. I ran the hokuyo_node (laser scans published on /scan) and then rosrun amcl amcl.
No laser scan received (and thus no pose updates have been published) for 1367364115.808416 seconds. Verify that data is being published on the /scan topic. MessageFilter [target=/odom ]: Dropped 100.00% of messages so far. Please turn the [ros.amcl.message_notifier] rosconsole logger to DEBUG for more information.
If I go through hector_slam, it looks like it estimates /odom from the laser scan itself, but it has no feature for localizing against a known map. Any suggestions on what needs to change to make AMCL work with the Create's odom/tf on a known map? Thank you Sai |
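For what it's worth, that `MessageFilter [target=/odom]` warning usually means tf cannot connect the odom frame to the laser scan's frame, so amcl silently drops every scan. A minimal launch sketch, assuming the brown driver already publishes the odom to base_link transform; the map path and the `0 0 0` laser offset are placeholders to adjust:

```xml
<launch>
  <node pkg="map_server" type="map_server" name="map_server"
        args="/path/to/map.yaml"/>
  <!-- placeholder offset: laser assumed mounted at the robot's center -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_laser"
        args="0 0 0 0 0 0 base_link laser 100"/>
  <node pkg="amcl" type="amcl" name="amcl">
    <!-- these must match the frame ids your driver actually publishes -->
    <param name="odom_frame_id" value="odom"/>
    <param name="base_frame_id" value="base_link"/>
  </node>
</launch>
```

Run `rosrun tf view_frames` to confirm map, odom, base_link, and laser end up in one connected tree.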
2016-04-21 16:37:30 -0500 | received badge | ● Popular Question (source) |
2016-04-21 16:37:30 -0500 | received badge | ● Notable Question (source) |
2016-03-04 08:52:09 -0500 | received badge | ● Notable Question (source) |
2016-01-26 00:36:28 -0500 | asked a question | Approximate Time policy compiles for two topics of same type, but fails for topics of different types Hello, I am trying to synchronize the callbacks for two topics: a geometry_msgs::PoseStamped message that acts as a signal for image capture from another computer (I am using PoseStamped as a test case for now because it carries a timestamp; I will write my own message type later), and the actual image topic of type sensor_msgs::Image. This is the code snippet that does the approximate time synchronization. It fails to compile with a lot of errors, mostly relating to boost and the ApproximateTime sync_policies statement. An excerpt: But this same structure works perfectly if both topics are sensor_msgs::Image. I don't see what I'm doing wrong here, because geometry_msgs::PoseStamped is a message that does contain a timestamp. |
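Without the full compiler output it is hard to pinpoint the cause, but those boost template errors usually come from a mismatch between the message_filters::Subscriber types and the types listed in the sync policy declaration (or a missing message header), not from PoseStamped lacking a stamp. Conceptually, the policy just pairs up messages whose stamps are close; a standalone Python sketch of that idea (a simplification, not the adaptive algorithm message_filters actually uses):

```python
def approx_pairs(stamps_a, stamps_b, slop=0.05):
    """Pair each stamp from stream A with the closest stamp from stream B,
    accepting the pair only if they differ by at most `slop` seconds."""
    if not stamps_b:
        return []
    pairs = []
    for ta in stamps_a:
        tb = min(stamps_b, key=lambda t: abs(t - ta))
        if abs(tb - ta) <= slop:
            pairs.append((ta, tb))
    return pairs
```

The real ApproximateTime policy is smarter (it adapts the matching window and never reuses a message), but the pairing-by-nearest-stamp intuition is the same.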
2016-01-25 05:46:54 -0500 | received badge | ● Popular Question (source) |
2016-01-22 16:08:49 -0500 | received badge | ● Popular Question (source) |
2016-01-22 16:07:38 -0500 | asked a question | Synchronizing image capture on two machines over wireless I have two computers, each connected to a camera and on the same wireless network, and I want them to capture and log images simultaneously using ROS. If I had multiple cameras on the same machine, I would have gone for ApproximateSyncPolicy, but because this is decoupled and over wireless, I was wondering whether a good method would be to subscribe to camera1 on machine1 and, every time its callback fires, publish a 'signal' on another topic for machine2 to listen to. So the flow would be: image1 read on machine1 -> signal sent to machine2 -> image1 saved on machine1, image2 saved on machine2 Would this be an efficient approach, or is there a better way of doing this? |
2015-11-26 03:25:07 -0500 | marked best answer | How to use OMPL for path planning Hello, I have a PGM/YAML map generated from a Hokuyo LIDAR on an iRobot Create, and the objective is to use OMPL's libraries for motion planning in this map. I have the OMPL GUI, but there seems to be no straightforward way to load user-generated maps, and all the example maps are in .dae format. The ROS tutorials always seem to be about PR2 arm navigation. I would greatly appreciate any suggestions as to how I can start off using OMPL's algorithms for navigation in my own map. Thank you. |
2015-05-20 22:46:09 -0500 | asked a question | Calculating entropy with octomap representation? Hello, I am trying to implement concepts such as the entropy of a map and information gain through 3D sensing, where I store my map data as an Octomap. I am a beginner when it comes to concepts like this, so I have been reading up recently on information theory and how entropy, mutual information etc. work, but I am failing to grasp how to implement something like that practically. Given a certain scene and the associated Octomap data, is the whole set of cells treated as a probability distribution, which can then be used to determine entropy? Or is there a pre-programmed, straightforward way of determining the entropy of the map at a given instant? Any suggestions/inputs would be very welcome. Thanks, Sai |
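A common formulation in the exploration literature (as far as I know, octomap does not compute this for you): treat each leaf's occupancy probability, e.g. the value returned by OcTreeNode::getOccupancy(), as an independent binary random variable. The map entropy is then the sum of per-cell entropies, and the information gain of a scan is the drop in that sum after integrating it. A minimal sketch:

```python
import math

def cell_entropy(p):
    """Shannon entropy (bits) of one binary occupancy cell with P(occupied) = p.
    Fully known cells (p = 0 or 1) contribute zero entropy."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def map_entropy(probs):
    """Cells are modeled as independent, so entropies simply sum.
    Unknown cells are typically counted at the prior, p = 0.5."""
    return sum(cell_entropy(p) for p in probs)
```

An unknown cell (p = 0.5) contributes the maximum of 1 bit, so a mostly unexplored map has high entropy and good sensing actions are those that reduce it the most.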
2015-05-05 07:12:01 -0500 | received badge | ● Good Answer (source) |
2015-05-02 02:42:36 -0500 | received badge | ● Popular Question (source) |
2015-04-21 15:28:48 -0500 | commented question | Ros will not launch ROS_openTLD Did you do a catkin_init_workspace and catkin_make? And then source the devel/setup.bash, or just add it to .bashrc? |
2015-04-21 15:27:18 -0500 | answered a question | Launch file doesn't work after xacro update Hi Daniel, I don't know if you've already solved this issue, but it is caused by some comments in the xacro files that contain a middle-dot character. A solution has been posted in the GitHub issues section of rotors_simulator. |
2015-02-13 15:34:10 -0500 | asked a question | Obtaining joint angles from joint TFs? Hi, I am currently using openni_tracker to tele-op a robot arm with an RGBD sensor. I am able to obtain the joint positions and orientations using openni_tracker, which publishes a TF with respect to the camera frame for every joint. I am getting confused about how to convert this into the relative orientation of each joint, which I can then pass to the robot control. For each joint, I am interested in two angles: the up-down motion angle and the sideways angle, and I feel that TF is giving me far more data than is really relevant to me. Any suggestions will be very helpful. Thanks, Sai |
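One way to look at it: for each joint, take the parent frame's quaternion and the child frame's quaternion from TF, form the relative rotation, and read off just the two angles you care about (in ROS you could also feed the relative quaternion to tf.transformations.euler_from_quaternion). A standalone sketch using tf-style (x, y, z, w) quaternions; mapping pitch to "up-down" and yaw to "sideways" is an assumption you may need to adapt to your arm's conventions:

```python
import math

def q_conj(q):
    """Conjugate (inverse for unit quaternions), tf order (x, y, z, w)."""
    x, y, z, w = q
    return (-x, -y, -z, w)

def q_mult(a, b):
    """Hamilton product of two (x, y, z, w) quaternions."""
    x1, y1, z1, w1 = a
    x2, y2, z2, w2 = b
    return (w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2,
            w1*w2 - x1*x2 - y1*y2 - z1*z2)

def updown_sideways(q_parent, q_child):
    """Relative rotation of the child joint w.r.t. its parent, reduced to
    pitch ("up-down", about y) and yaw ("sideways", about z)."""
    x, y, z, w = q_mult(q_conj(q_parent), q_child)
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w*y - z*x))))
    yaw = math.atan2(2.0 * (w*z + x*y), 1.0 - 2.0 * (y*y + z*z))
    return pitch, yaw
```

Both camera-frame quaternions come straight out of a tf lookup for the two joints, so this throws away the positions and the roll component TF also carries.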
2014-11-17 07:49:26 -0500 | received badge | ● Famous Question (source) |
2014-11-08 02:50:46 -0500 | received badge | ● Famous Question (source) |
2014-09-28 11:39:19 -0500 | received badge | ● Notable Question (source) |
2014-09-14 15:06:50 -0500 | commented question | Hector mapping turtlebot Did you try changing the fixed frame in rviz? |
2014-09-11 11:40:44 -0500 | commented question | Problems with move_base Here's a thought: you have 10 m/s as your max velocity in x, but 1 rad/s for theta. Maybe it's just slow because of the difference? Are you using the turtlebot files and parameters in the nav stack? Did you tweak any of the files? |
2014-09-11 11:39:12 -0500 | commented answer | Current map provided with laser scan Hi, try this. |
2014-09-10 12:09:19 -0500 | commented answer | depthimage_to_laserscan works with rosrun but the DepthImageToLaserScanNodelet does not work properly Can you try removing the <remap from="scan"> line? |
2014-09-10 12:02:49 -0500 | commented question | Problems with move_base So it just keeps going straight and misses the goal? |
2014-09-10 11:59:28 -0500 | answered a question | depthimage_to_laserscan works with rosrun but the DepthImageToLaserScanNodelet does not work properly It looks like you're using /depth/image_raw in your nodelet and /camera/depth/image_raw in your rosrun command. |
2014-09-10 11:57:18 -0500 | answered a question | Current map provided with laser scan If you're using a laser scanner, hector_mapping can be easier than gmapping. Launch the mapping program and publish a 0,0,0 static transform between base_link and laser to account for your non-existent odometry, and you should be set.
Just my two cents. |
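A sketch of what that looks like in a launch file, assuming the frame names base_link and laser (adjust them to whatever frame_id your scans actually carry); pointing hector_mapping's odom_frame at base_link is the usual no-odometry trick:

```xml
<launch>
  <!-- identity transform: pretend the laser sits exactly at base_link -->
  <node pkg="tf" type="static_transform_publisher" name="base_to_laser"
        args="0 0 0 0 0 0 base_link laser 100"/>
  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping">
    <param name="base_frame" value="base_link"/>
    <!-- no wheel odometry: let scan matching stand in for it -->
    <param name="odom_frame" value="base_link"/>
  </node>
</launch>
```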
2014-09-10 02:17:50 -0500 | received badge | ● Notable Question (source) |
2014-09-10 01:02:09 -0500 | received badge | ● Popular Question (source) |
2014-09-09 15:31:22 -0500 | received badge | ● Famous Question (source) |
2014-09-09 15:31:22 -0500 | received badge | ● Notable Question (source) |
2014-09-09 15:30:10 -0500 | commented question | problem with gmapping If you have a Hokuyo laser, hector_slam is much better than gmapping for mapping purposes. |
2014-09-09 15:28:46 -0500 | commented question | ompl error Looks like moveit is not able to find the planning plugins properly. How was moveit installed? Did you do a ros-indigo-moveit-full? |
2014-09-09 15:17:15 -0500 | answered a question | What is the state-of-the-art in ROS navigation with dynamic obstacles? Some widely used, although not technically "state-of-the-art", algorithms with support for dynamic collision avoidance are implemented in OMPL, which is available for ROS through moveit. |
2014-09-08 22:21:25 -0500 | commented question | *ROS Navigation Alternative Algorithms I have, yes. Octomap comes out of the box with moveit, so once you set up your move_group launch files, all you need to do is edit the sensor_manager.launch.xml and add your camera's parameters there. For more info: |
2014-09-08 21:47:38 -0500 | answered a question | Base Local Planner Configuration In base local planner, there is a file called latched_stop_rotate_controller.cpp. There, the stop command is implemented by LatchedStopRotateController::stopWithAccLimits() [line 108], which takes acc_lim as an argument. I think you could define your own, say, decel_lim and use that instead in this function. I am not entirely sure whether the code requires the two limits to be kept equal, though. |
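The ramp-down that stopWithAccLimits() performs is essentially the following; decel_lim is my hypothetical parameter name, not something base_local_planner defines. Each control cycle the velocity moves toward zero by at most decel_lim * dt:

```python
import math

def stop_with_limit(v, decel_lim, dt):
    """One control-cycle step of a deceleration-limited stop:
    shrink velocity v (m/s) toward zero without exceeding
    decel_lim (m/s^2) over a cycle of dt seconds."""
    step = decel_lim * dt
    if abs(v) <= step:
        return 0.0
    return v - math.copysign(step, v)
```

Swapping a separate decel_lim in for acc_lim here is exactly the substitution suggested above; the same ramp is applied per axis (x, y, theta) in the real controller.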
2014-09-08 21:39:59 -0500 | asked a question | Frame/TF issues with a quadrotor and MoveIt Hi all, I am trying to do 3D navigation with a quadrotor and moveit, using mavros as an interface. I am using hector_quadrotor_urdf as my robot description for MoveIt. I just had a couple of questions about the whole ROS pipeline; it would be great if someone could provide a few inputs.
Thank you, Sai |
2014-09-08 21:33:38 -0500 | commented question | *ROS Navigation Alternative Algorithms Not really sure, but I have been using moveit to perform 3D navigation with a quadrotor, and its path planning, obstacle avoidance, etc. are quite good. |
2014-09-08 19:09:42 -0500 | commented question | *ROS Navigation Alternative Algorithms There is an A* for nav_core on github. https://github.com/ros-planning/navig... Also, if you don't mind moving away from the nav_core interface, you can use a simple urdf model for your robot and then implement MoveIt! so you can use OMPL, SBPL etc. |
2014-08-29 13:57:57 -0500 | received badge | ● Popular Question (source) |
2014-08-27 11:53:10 -0500 | edited question | octomap OGRE exception in rviz Hi, I am getting the following error when I try to visualize the octomap occupancy map in rviz. I am running ROS Indigo on Ubuntu 14.04, and I believe 14.04 has native support for Optimus graphics, so my graphics card should be usable. The rviz page says this issue was fixed in version 0.3, though. For further information: the occupancy grid displays fine, but the map display itself crashes rviz. I tried reducing the resolution through octomap's launch file, but the issue persists. Are there any known fixes to this? Thanks! |