2022-04-06 08:30:41 -0500 | received badge | ● Good Answer (source) |
2021-09-30 19:03:35 -0500 | commented answer | Is there a way to merge bag files? There's an interesting side effect of this script. The resulting merged bag is much larger than the sum of the two origi |
2021-09-21 14:34:29 -0500 | commented answer | Extracting definition of customized messages from bag files This should totally be in the ros_bag utility. Awesome! |
2021-09-10 18:17:12 -0500 | received badge | ● Nice Answer (source) |
2021-06-03 22:52:12 -0500 | received badge | ● Taxonomist |
2021-02-19 23:24:40 -0500 | received badge | ● Necromancer (source) |
2021-02-19 12:55:50 -0500 | answered a question | How to use ROS in Colab? I built on the answer above and added a run script that sets the ROS_MASTER_URI and sources the ROS environment. It has |
2021-01-29 12:19:47 -0500 | received badge | ● Good Question (source) |
2019-11-21 05:38:48 -0500 | marked best answer | tf2 Documentation Needs Work As we get ramped up on our own ROS environment, we've found the tf2 implementation extremely hard to understand. So this is a bit of a plea for folks who understand it well to improve the tutorials, provide implementations of a complete tf2 framework doing useful things, and give a more up-front explanation of how it is implemented, why it is implemented that way, and how it is meant to work. Let me say at the outset, this is meant to be wholly constructive. The amount of work that has gone into these packages and their documentation is extraordinary and we are wholly thankful for all of it. When we have a better understanding, we'll be in a better place to make contributions ourselves. But until then, we're asking for some help. The concepts of reference frames in general, the REPs for ROS conventions, and the need for them are clear enough. The rub is in exactly how one uses tf2 to transform a point, pose, vector, or arrays of any of these, between any two arbitrary reference frames quickly and seamlessly. The basic tutorials here provide a nice gentle introduction to the mechanics. But I think the broadcaster example is confusing to first-time users in that it does not clearly introduce the tf2 library along the lines of the other turtle_sim tutorials. For example, rather than creating a transform from a pose message with little explanation as to why, it might be clearer to set the problem up a bit first: explain that turtle1 will be driven around using keyboard commands and that its location will be calculated using odometry and reported as a Pose message in the odom frame. Then the broadcaster is implemented explicitly to define the transform between the odom and base_link frames. The listener is also confusing. The spawning of a new turtle simulation deserves some comment, as it is not done in most of the other tutorials. 
Also, the process does not explicitly define the reference frame for the position of turtle2. Given that this is a tf2 tutorial, doing so would make things clearer. And because this is somewhat of a contrived example, some discussion of why one listens to the published transform and creates Twist messages from the result would help. Driving one turtle based on the pose of another through the broadcast of a transform is just not a model that most, if anyone, would ever use, and for first-timers it needs some comment. Finally, what is perhaps most confusing and frustrating is that the tutorial doesn't actually provide an example of how to transform a Pose, Vector, Point, or anything else having a reference frame into its equivalent in another reference frame. Since this is the whole point of the tf2 package, it is a HUGE omission. Understanding how to do this, how to optimize it for large data sets, how to troubleshoot it - all important pieces. There are some other ... (more) |
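For readers hitting the same wall the post describes: the operation it asks for (what tf2's doTransform does to a stamped pose) is, underneath, just a quaternion rotation plus a translation. Below is a minimal, ROS-free sketch of that math; the names `quat_rotate` and `transform_point` are illustrative, not the tf2 API.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (x, y, z, w), ROS order."""
    x, y, z, w = q
    u = np.array([x, y, z])
    # Standard identity: v' = v + 2u x (u x v + w*v)
    return v + 2.0 * np.cross(u, np.cross(u, np.asarray(v, float)) + w * np.asarray(v, float))

def transform_point(translation, rotation, point):
    """Apply a frame transform (rotate, then translate) to a point,
    mirroring what tf2 does when re-expressing a point in a new frame."""
    return quat_rotate(rotation, point) + np.asarray(translation, float)

# Example: a frame rotated 90 degrees in yaw and shifted 1 m along x.
q_yaw90 = (0.0, 0.0, np.sin(np.pi / 4), np.cos(np.pi / 4))
p = transform_point([1.0, 0.0, 0.0], q_yaw90, [1.0, 0.0, 0.0])
# p is approximately [1, 1, 0]
```

In real code one would look up the translation and rotation from a tf2 buffer rather than hard-coding them; the sketch is only meant to demystify what the transform actually computes.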
2019-11-20 03:37:14 -0500 | received badge | ● Favorite Question (source) |
2019-07-09 10:43:04 -0500 | received badge | ● Popular Question (source) |
2019-07-09 10:43:04 -0500 | received badge | ● Notable Question (source) |
2019-07-09 10:43:04 -0500 | received badge | ● Famous Question (source) |
2019-03-05 05:07:30 -0500 | received badge | ● Famous Question (source) |
2018-09-19 14:25:07 -0500 | marked best answer | PID Control and the Navigation Stack Hello, I'm trying to learn how to properly set up a standard ROS navigation and control stack to drive a field robot (one that operates outdoors, navigating primarily by GPS and possibly a compass). My own previous experience with other systems keeps getting in the way, and so I have some questions about how this is meant to work. 1) This system is not a wheeled system (it's a boat), and so there's no direct measure of "odometry", which move_base seems to require. Even with speed measured by the GPS and heading measured by a separate compass, the combination doesn't typically reflect the actual movement of the vessel, since the vessel's direction of travel rarely matches its heading. The speed from the GPS is course made good, not speed in the vessel's reference frame. My question is, how should one best set this up? 2) I understand the canonical output from move_base (from the local_planner specifically) is the Twist message, containing linear and angular velocity information. Having been told the robot's position, given a local cost map and a goal, the local planner calculates the optimal velocity message for the vehicle. What is confusing to me is the output of all this. What the robot needs (it seems to me) is a direction to travel and a speed at which to go. But a Twist message outputs not a desired direction of travel, but the angular velocity, i.e. the rate of change of the direction of travel. It seems to me move_base is acting like a controller (think PID controller) on some level by not specifying the desired direction of travel directly. Do I understand this correctly? 3) Finally, ros_control can implement a PID (or other) controller to adjust the speed of motors (or maybe maneuver a rudder on a boat) to match the desired setpoint. But in this case, it is matching a heading rate, not a heading directly. Is this not backwards? 
Shouldn't the output of the local_planner be a desired heading? Since the motors, driving wheels, or a rudder actually change the heading rate directly, a PID controller in ros_control would then adjust the heading rate to achieve the desired heading published by the local planner. I hope this all makes some sense. I can't help but feel like the roles of the local_planner and ros_control are reversed in some sense (at least with respect to the desired direction of travel). Thanks for your thoughts, in advance. Val |
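To make the heading vs. heading-rate discussion concrete: the arrangement the question proposes is a PID that takes a desired heading and emits a heading rate, i.e. the angular.z one would place in a Twist. The following is a toy sketch, not ros_control code; the class name, gains, and structure are all made up for illustration.

```python
import math

def wrap_pi(a):
    """Wrap an angle to (-pi, pi] so heading errors take the short way around."""
    return math.atan2(math.sin(a), math.cos(a))

class HeadingPID:
    """Toy PID: heading error in, commanded heading rate out.
    The output would map to Twist.angular.z in a ROS pipeline."""
    def __init__(self, kp=1.0, ki=0.0, kd=0.2):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = None

    def update(self, desired, actual, dt):
        err = wrap_pi(desired - actual)
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

pid = HeadingPID()
# Vessel pointing along +x, want to head along +y: positive rate = turn left.
rate = pid.update(desired=math.pi / 2, actual=0.0, dt=0.1)
```

The angle-wrapping step is the part most often gotten wrong in heading controllers; without it, a boat at 359 degrees commanded to 1 degree would turn the long way around.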
2018-09-19 07:21:06 -0500 | received badge | ● Nice Answer (source) |
2018-06-19 22:20:47 -0500 | received badge | ● Notable Question (source) |
2018-05-30 13:18:26 -0500 | received badge | ● Popular Question (source) |
2018-05-30 10:10:14 -0500 | received badge | ● Nice Question (source) |
2018-05-30 07:01:00 -0500 | asked a question | tf2 Documentation Needs Work tf2 Documentation Needs Work As we get ramped up on our own ROS environment, we've found the tf2 implementation extremel |
2018-04-26 07:49:57 -0500 | received badge | ● Nice Answer (source) |
2018-04-24 10:34:24 -0500 | received badge | ● Famous Question (source) |
2018-04-12 21:01:23 -0500 | commented answer | Where's the official Python API documentation for TF? This discussion seems to revolve around where the existing documentation can be found. But it doesn’t, in my opinion rea |
2018-01-17 03:28:26 -0500 | received badge | ● Notable Question (source) |
2018-01-17 03:28:26 -0500 | received badge | ● Famous Question (source) |
2018-01-17 02:22:10 -0500 | received badge | ● Famous Question (source) |
2018-01-10 18:48:52 -0500 | received badge | ● Necromancer (source) |
2017-12-03 18:33:06 -0500 | commented answer | convert the yaw Euler angle into the range [0, 360]. This answer was some time ago, but is this not incorrect? REP 103 specifies that yaw measured from the x-axis (i.e. in |
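For what it's worth, the normalization the linked question asks about is a one-liner in Python, since the modulo operator returns a nonnegative result for a positive modulus. A small sketch (the function name is illustrative):

```python
import math

def yaw_to_deg_0_360(yaw_rad):
    """Convert a yaw angle in radians (REP 103: counter-clockwise about +z,
    measured from the +x axis) to degrees in the range [0, 360)."""
    return math.degrees(yaw_rad) % 360.0

# e.g. a yaw of -pi/2 rad maps to 270 degrees
```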
2017-10-18 16:17:29 -0500 | answered a question | Is there a general way to convert ROS messages into JSON format? Rosbridge is probably the better answer, but just in case, here's a method that will at least work on messages whose con |
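The truncated answer above most likely relies on the fact that ROS 1 message instances expose their fields through `__slots__`, so they can be walked recursively into plain containers for `json.dumps`. Here is a hedged sketch of that idea; the `FakeHeader`/`FakeSentence` classes are stand-ins for generated message classes, since no real message package is imported here.

```python
import json

def msg_to_dict(msg):
    """Recursively convert a genpy-style message (anything exposing
    __slots__) into plain Python containers suitable for json.dumps."""
    if hasattr(msg, '__slots__'):
        return {s: msg_to_dict(getattr(msg, s)) for s in msg.__slots__}
    if isinstance(msg, (list, tuple)):
        return [msg_to_dict(v) for v in msg]
    return msg  # int, float, str, bool pass through unchanged

# Stand-ins for generated message classes (illustrative only):
class FakeHeader:
    __slots__ = ['seq', 'stamp', 'frame_id']
    def __init__(self):
        self.seq, self.stamp, self.frame_id = 0, 0.0, 'gps'

class FakeSentence:
    __slots__ = ['header', 'sentence']
    def __init__(self):
        self.header, self.sentence = FakeHeader(), '$GPGGA,...'

print(json.dumps(msg_to_dict(FakeSentence())))
```

Real messages have a few fields (e.g. `rospy.Time` stamps, byte arrays) that need extra handling before they are JSON-serializable, which is one reason rosbridge's battle-tested conversion is usually the better choice.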
2017-08-31 20:23:44 -0500 | received badge | ● Popular Question (source) |
2017-08-31 14:24:21 -0500 | marked best answer | ros with multiple distros I have a third-party gadget whose ROS interface is compiled against an older version of ROS. I'm curious to what extent it is possible to intermix clients and a ROS Master from different distributions. Is there any compatibility between them? Thanks, Val |
2017-08-31 14:22:46 -0500 | marked best answer | Indigo and Kinetic Compatibility Hello, I'm (perhaps foolishly) integrating a Kinetic ROS install with a third-party package whose implementation is built on Indigo. In initial tests with clients built on Kinetic, I've been able to read every message published by nodes built on Indigo. However, I am unable to subscribe to and read messages published by Kinetic-built nodes from Indigo-built clients. Can anyone verify for me: is this a fool's errand? Is ros_comm as built in Indigo sufficiently different that it is not fully compatible with Kinetic? There is precious little documentation regarding compatibility between ROS versions, so it is hard to tell without testing. Thanks! |
2017-08-31 14:22:39 -0500 | answered a question | Indigo and Kinetic Compatibility I discovered my problem here. I did not realize that ROS requires fully resolvable host-names in addition to routable ip |
2017-08-30 15:06:32 -0500 | commented question | Indigo and Kinetic Compatibility Thanks for reminding me that I'd asked the question. Must be frustrating. :) Now I'm actually trying it. I'm defining th |
2017-08-30 14:51:57 -0500 | received badge | ● Notable Question (source) |
2017-08-30 12:07:59 -0500 | asked a question | Indigo and Kinetic Compatibility Indigo and Kinetic Compatibility Hello, I'm (perhaps foolishly) integrating a Kinetic ROS install with a third-party pa |
2017-08-11 20:19:55 -0500 | received badge | ● Self-Learner (source) |
2017-08-11 14:34:31 -0500 | marked best answer | Sentence.msg Header Missing I'm trying to use the nmea_msgs/Sentence message and I'm stumped on what is probably a Python problem. I've built the package, I can see the results in devel/, and the message shows up fine with rosmsg. But if I source devel/setup.bash, run ipython, and import the module, I don't actually get a proper Sentence class. Although there is a "header" object, it is not a "Header" object in the sense of a ROS message definition; that is, it has none of the 'stamp', 'seq', and 'frame_id' attributes of the Header message. I cannot figure out why. -Val |
2017-08-11 14:34:20 -0500 | answered a question | Sentence.msg Header Missing I found the problem!!! I should have instantiated the message in IPython like this: a = Sentence() Thanks! |
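The fix above comes down to the difference between the message class and an instance of it: only calling `Sentence()` runs `__init__`, which fills in a real `Header`. The sketch below uses stand-in classes (not the real nmea_msgs code) to reproduce both the symptom and the fix, assuming the generated class uses `__slots__` the way genpy messages do.

```python
# Minimal stand-ins mimicking genpy-generated messages (illustrative only):
class Header:
    __slots__ = ['seq', 'stamp', 'frame_id']
    def __init__(self):
        self.seq, self.stamp, self.frame_id = 0, 0.0, ''

class Sentence:
    __slots__ = ['header', 'sentence']
    def __init__(self):
        self.header = Header()   # populated only when the class is instantiated
        self.sentence = ''

# The symptom: the class itself has a 'header' attribute, but it is just a
# slot descriptor, not a Header message, so stamp/seq/frame_id are missing.
assert not isinstance(Sentence.header, Header)

# The fix: instantiate with parentheses, so __init__ builds the Header.
a = Sentence()
assert isinstance(a.header, Header)
assert hasattr(a.header, 'frame_id')
```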
2017-08-11 14:21:12 -0500 | asked a question | Sentence.msg Header Missing Sentence.msg Header Missing I'm trying to use the nmea_msgs/Sentence message and I'm stumped on what is probably a pytho |
2017-07-06 07:27:44 -0500 | received badge | ● Great Question (source) |
2017-05-19 08:01:19 -0500 | received badge | ● Favorite Question (source) |
2017-05-02 10:18:10 -0500 | received badge | ● Notable Question (source) |
2017-04-06 17:35:40 -0500 | received badge | ● Popular Question (source) |
2017-03-27 09:24:13 -0500 | marked best answer | REP-105 and robot_localization Hello, I have a series of questions about REP-105 and the robot_localization package. What is confusing to me is the utility of the odom reference frame, and it is possible that I don't understand its definition. REP-105 describes the odom reference frame as a world-fixed reference frame. But it also says that the pose of a robot can drift in the odom reference frame without any bounds. I understand the problem. A vehicle whose position is estimated from dead reckoning alone will accrue position-estimate errors that grow unbounded unless checked by an absolute position measurement. So is the idea that one can track the dead-reckoned position estimate and the error between that and the true position separately? Perhaps this is done in the transform between the map frame and the odom frame? What is the utility of the odom reference frame vs. the map reference frame? Why is having a continuous "no-jumps" reference frame better than having a globally accurate one? If one is using an EKF and there are jumps in the positioning due to GPS data, does that not mean that the measurement covariance (or possibly the process covariance) is not set properly? It seems to me one is relying on the GPS too much. Maybe it means the IMU is too poor quality to adequately filter the GPS? I guess what I'm getting at is that a properly modeled and tuned EKF should (barring resets) not have jumps, and the odom frame might be unnecessary. I can't wait to hear your thoughts!
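The map/odom bookkeeping the question circles around can be sketched in 2D: the localizer publishes the correction transform T_map_odom so that composing it with the continuous dead-reckoned T_odom_base yields the globally accurate T_map_base. When a GPS fix causes a "jump", only T_map_odom jumps; T_odom_base stays smooth for local planning. The numbers below are made up for illustration.

```python
import math
import numpy as np

def se2(x, y, theta):
    """Homogeneous 2D rigid transform (SE(2)) as a 3x3 matrix."""
    c, s = math.cos(theta), math.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1.0]])

# Smooth but drifting dead-reckoned pose of base_link in the odom frame:
T_odom_base = se2(10.0, 0.0, 0.0)

# Globally corrected pose of base_link in the map frame (e.g. after a GPS fix):
T_map_base = se2(10.5, 0.3, 0.02)

# The localizer publishes map->odom so the chain composes correctly:
T_map_odom = T_map_base @ np.linalg.inv(T_odom_base)

# Composing recovers the map-frame pose; any GPS jump is absorbed entirely
# by T_map_odom, leaving the odom-frame trajectory continuous.
assert np.allclose(T_map_odom @ T_odom_base, T_map_base)
```

This is exactly the split REP-105 intends: consumers that need continuity (local planners, controllers) work in odom, while consumers that need global accuracy work in map, and the discontinuities live only in the transform between them.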