2023-07-12 14:31:42 -0500 | received badge | ● Good Answer (source) |
2022-09-09 02:05:08 -0500 | received badge | ● Great Question (source) |
2022-08-02 20:36:39 -0500 | edited answer | Question to working robotics engineers about their job That's a great question, it's probably better suited for the ROS discourse page since there isn't a right or wrong answe |
2022-08-02 20:32:28 -0500 | edited answer | Question to working robotics engineers about their job That's a great question, it's probably better suited for the ROS discourse page since there isn't a right or wrong answe |
2022-08-02 20:24:35 -0500 | answered a question | Question to working robotics engineers about their job That's a great question, it's probably better suited for the ROS discourse page since there isn't a right or wrong answe |
2022-08-02 20:24:35 -0500 | received badge | ● Rapid Responder (source) |
2022-07-22 05:08:28 -0500 | received badge | ● Nice Answer (source) |
2022-06-05 08:27:02 -0500 | commented answer | How to transform point clouds from multiple Gazebo Kinects in the world coordinates? The tf frame that has the location and orientation of the kinect sensor. |
2022-05-26 11:16:11 -0500 | commented answer | How to transform point clouds from multiple Gazebo Kinects in the world coordinates? I think my answer is correct, you just need to transform the point cloud using the TF of the Kinect. |
2022-05-10 10:40:57 -0500 | received badge | ● Favorite Question (source) |
2022-03-21 09:10:14 -0500 | commented question | snap: slam-toolbox Interesting, in the second link I think it is saying that, in general snaps are not faster, but in this case it is. So m |
2022-03-21 08:44:42 -0500 | commented question | snap: slam-toolbox Are you saying nav2 will run faster when you install it using the snap package-manager? I would find that hard to believ |
2022-03-14 11:59:30 -0500 | received badge | ● Guru (source) |
2022-03-14 11:59:30 -0500 | received badge | ● Great Answer (source) |
2022-02-18 09:56:31 -0500 | commented answer | save the pointcloud from gazebo as numpy Your question doesn't ask how to subscribe to the point cloud topic, it asks how to save and convert the point cloud, im |
2022-02-18 09:45:05 -0500 | answered a question | save the pointcloud from gazebo as numpy I'm not sure about the other questions (and it would be best to post those questions separately) .npy files are for arb |
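The truncated answer above notes that .npy files hold arbitrary NumPy arrays. A minimal sketch of the save/load round trip, using toy XYZ data in place of points extracted from a real sensor_msgs/PointCloud2 message (the file name is just an example):

```python
import numpy as np
import tempfile, os

# A point cloud as an (N, 3) array of XYZ coordinates (toy data here;
# in ROS you would fill this from a sensor_msgs/PointCloud2 message).
points = np.array([[0.0, 0.0, 0.0],
                   [1.0, 2.0, 3.0],
                   [4.0, 5.0, 6.0]])

# .npy stores a single arbitrary NumPy array, so saving is one call.
path = os.path.join(tempfile.gettempdir(), "cloud.npy")
np.save(path, points)

# Loading gives the array back unchanged.
loaded = np.load(path)
print(loaded.shape)  # (3, 3)
```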
2022-02-18 09:45:05 -0500 | received badge | ● Rapid Responder (source) |
2022-02-11 12:41:02 -0500 | commented answer | 6D Pose export as PCD To answer the part of your question about saving the data for later, ros has a data logging system called bags that let |
2022-02-11 12:13:07 -0500 | received badge | ● Rapid Responder (source) |
2022-02-11 12:13:07 -0500 | answered a question | 6D Pose export as PCD There is a message specifically for pose |
2022-01-14 10:16:47 -0500 | commented answer | Is it possible to score IK solutions on MoveIt? I don't know about changing during runtime, but to configure the solver just add this to your kinematics.yaml file: sol |
2022-01-14 10:15:03 -0500 | commented answer | Is it possible to score IK solutions on MoveIt? I don't know about changing during runtime, but to configure the solver just add this to your kinematics.yaml file: sol |
2022-01-14 09:32:58 -0500 | answered a question | Is it possible to score IK solutions on MoveIt? It might be worth trying out the TRAC-IK solver instead of the default KDL solver inside MoveIT. They have a MoveIT plug |
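The comments above mention configuring the solver via kinematics.yaml. A sketch of what that entry can look like for TRAC-IK, assuming a planning group named "arm" (the group name and timeout value are placeholders for your own setup):

```yaml
# kinematics.yaml -- "arm" stands in for your planning group name
arm:
  kinematics_solver: trac_ik_kinematics_plugin/TRAC_IKKinematicsPlugin
  kinematics_solver_timeout: 0.005
  # solve_type selects how candidate solutions are scored:
  # Speed, Distance, Manipulation1, or Manipulation2
  solve_type: Distance
```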
2022-01-14 09:32:58 -0500 | received badge | ● Rapid Responder (source) |
2022-01-05 13:21:34 -0500 | commented answer | How to transform point clouds from multiple Gazebo Kinects in the world coordinates? Actually, you shouldn't need to compose the matrix, it already exists so you should be able to grab it off the TF tree s |
2022-01-05 13:13:03 -0500 | received badge | ● Rapid Responder (source) |
2022-01-05 13:13:03 -0500 | answered a question | How to transform point clouds from multiple Gazebo Kinects in the world coordinates? It doesn't look like you ever transform the point clouds in your python script. You can use something like this: from |
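The transform grabbed off the TF tree can be applied as a 4x4 homogeneous matrix. A plain-NumPy sketch of that step, with a made-up transform standing in for the one a TF lookup would return:

```python
import numpy as np

# 4x4 homogeneous transform from the sensor frame to the world frame.
# In ROS this would come from the TF tree; here it is a made-up
# example: a 90-degree yaw plus a translation of (1, 2, 0.5).
T = np.array([[0.0, -1.0, 0.0, 1.0],
              [1.0,  0.0, 0.0, 2.0],
              [0.0,  0.0, 1.0, 0.5],
              [0.0,  0.0, 0.0, 1.0]])

# Points in the sensor frame, shape (N, 3).
points = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])

# Append a column of ones so each point is homogeneous, then transform.
ones = np.ones((points.shape[0], 1))
world = (T @ np.hstack([points, ones]).T).T[:, :3]
print(world)  # [[1.  3.  0.5], [0.  2.  0.5]]
```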
2022-01-04 06:49:40 -0500 | commented answer | List of standardized topics for navigation using custom robot ROS uses SI units so the linear velocity is meters per second and the angular velocity is radians per second. A little m |
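Since ROS velocity topics expect SI units, anything specified in degrees per second has to be converted to rad/s before it goes into a Twist message. A one-line sketch of that conversion:

```python
import math

# A turn rate specified in degrees per second (common outside ROS).
deg_per_s = 90.0

# geometry_msgs/Twist angular fields expect rad/s, so convert first.
rad_per_s = math.radians(deg_per_s)
print(round(rad_per_s, 4))  # 1.5708
```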
2022-01-03 10:05:20 -0500 | received badge | ● Rapid Responder (source) |
2022-01-03 10:05:20 -0500 | answered a question | List of standardized topics for navigation using custom robot One of the easiest things is to use the rosmsg command. Or, if you don't know the type of a topic is, you can use the ro |
2021-12-10 12:37:41 -0500 | answered a question | Does ROS have anything similar to a "global variable"? If you're only using the data in a single node, the most common way to do this would be to make your node a class, then |
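The node-as-class pattern from the answer above can be sketched in plain Python: the subscriber callback stores incoming data on `self`, so every method sees it without a module-level global. The rospy subscription is shown only as a comment, since it needs a running ROS master, and the class and topic names are illustrative:

```python
class SensorNode:
    """Shares callback data across methods via instance attributes
    instead of a global variable."""

    def __init__(self):
        self.latest_range = None
        # In a real node you would register the callback here, e.g.:
        # rospy.Subscriber("/scan", LaserScan, self.scan_callback)

    def scan_callback(self, ranges):
        # Store the incoming data on self so any method can read it.
        self.latest_range = min(ranges)

    def control_step(self):
        if self.latest_range is None:
            return "waiting for data"
        return "obstacle" if self.latest_range < 0.5 else "clear"

node = SensorNode()
print(node.control_step())            # waiting for data
node.scan_callback([0.3, 1.2, 2.0])   # simulate one incoming message
print(node.control_step())            # obstacle
```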
2021-12-10 12:37:41 -0500 | received badge | ● Rapid Responder (source) |
2021-12-02 06:24:50 -0500 | marked best answer | How to command velocity in MoveIt! Hello, I'm trying to command an arm in MoveIt! using a 3D mouse. I'm not working with real hardware or Gazebo; it's all in MoveIt! So far I am able to get the current joint positions of the robot; then, whenever I get input from the 3D mouse, I use that as input for a KDL inverse kinematics velocity solver. From that, I am able to generate an array of joint velocities. The problem I've been stumped on for a couple of days is: what can I do with that velocity array to make the robot model move in RViz? I thought I could use robot_state.setVariableVelocity() for each of the joints, but that doesn't work. I've tried robot_state.integrateVariableVelocity() and a number of other things, but no luck. Maybe I'm missing something simple, but I've stared at the documentation for hours and can't figure out how this is supposed to be done. Ultimately I want to be able to command the arm with the mouse and have the system stop me before colliding with obstacles, but right now I'm really just trying to figure out how to tele-op an arm in MoveIt! using velocity control. Any help would be appreciated. Thanks,
2021-11-19 09:14:33 -0500 | commented answer | Waiting for callback before control loop Also, are you sure you want disable_signals=True in your init_node? |
2021-11-19 09:10:15 -0500 | commented answer | Waiting for callback before control loop umm, I'm not sure why it would do that. Maybe do the wait_for_message before you even create the class object like this |
2021-11-19 08:04:49 -0500 | edited answer | Waiting for callback before control loop I think you can wait for the first LaserScan message before setting up your publisher in the class constructor...somethi |
2021-11-19 08:04:23 -0500 | received badge | ● Rapid Responder (source) |
2021-11-19 08:04:23 -0500 | answered a question | Waiting for callback before control loop I think you can wait for the first LaserScan message before setting up your publisher in the class constructor...somethi |
2021-09-24 07:24:29 -0500 | commented question | Compare two odoms with different frames with EVO You might have better luck opening an issue with EVO or looking through previous issues. |
2021-09-17 12:39:02 -0500 | received badge | ● Rapid Responder (source) |
2021-09-17 12:39:02 -0500 | answered a question | URDF Parsing Issue with .xacro file The error that says: Error [parser_urdf.cc:3183] Unable to call parseURDF on robot mode Makes me think there is a ty |
2021-09-10 10:08:24 -0500 | received badge | ● Rapid Responder (source) |
2021-09-10 10:08:24 -0500 | answered a question | How to create ground truth for the map from ROS You could make a CAD model of the lab (using something like Solidworks, Fusion 360, or TinkerCAD) and output it as a .st |
2021-08-30 10:16:39 -0500 | marked best answer | Why publish if no one is subscribing? Although it sounds like something my wife probably wonders about me sometimes - why do I talk if no one is listening? - I've wondered the same thing many times about a ROS publisher. I've never run into any problems as long as everything is running on the same computer, but when I'm using two computers communicating over wireless, having data flow across the network that isn't getting used is costly. As an example, I have a Pioneer with a Kinect on it; most of the time the robot is in an autonomous mode and I don't use the wireless, but sometimes I want to teleoperate the robot and use the Kinect's camera to see where I'm going. But running the openni node publishes a TON of data, even when I turn off almost everything in the launch file. There's so much data with all the different video feeds that the network gets flooded, and the lag in the video makes it unusable. Usually I'll just dive in hacking the code and cut out all unneeded topics to create a specialized node, but is there a better way? When I'm creating my own nodes that produce a lot of data, I usually use the publisher.getNumSubscribers() function to check if anyone is listening to a topic before publishing anything. My question is: is there a reason that all publishers don't do something similar by default? Would it be possible to add this capability to the core publisher code? Or is there something I'm missing? Thanks, Aaron
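The gating pattern described in the question above - check the subscriber count before doing any work - can be sketched with a stand-in publisher class (the real calls are roscpp's getNumSubscribers() or rospy's get_num_connections(); everything here is illustrative):

```python
class FakePublisher:
    """Stand-in for a ROS publisher, just enough to show the gating
    idea without a running ROS master."""

    def __init__(self):
        self.subscriber_count = 0
        self.sent = []

    def get_num_subscribers(self):
        return self.subscriber_count

    def publish(self, msg):
        self.sent.append(msg)

def publish_if_listened(pub, build_message):
    # Skip both the (possibly expensive) message construction and the
    # network traffic when nobody is subscribed.
    if pub.get_num_subscribers() == 0:
        return False
    pub.publish(build_message())
    return True

pub = FakePublisher()
print(publish_if_listened(pub, lambda: "big image"))  # False - no subscribers
pub.subscriber_count = 1
print(publish_if_listened(pub, lambda: "big image"))  # True - message sent
```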
2021-08-19 11:46:54 -0500 | received badge | ● Nice Answer (source) |
2021-08-17 09:39:32 -0500 | answered a question | ROS2 share launch code It depends on if the launch file of your third package is Python, XML, or YAML. Either way, how to do it is covered here |
2021-08-17 09:39:32 -0500 | received badge | ● Rapid Responder (source) |
2021-05-03 11:47:04 -0500 | commented question | How to model a non-backdrivable joint No unfortunately I never did solve it. |
2021-04-19 02:55:16 -0500 | received badge | ● Good Answer (source) |
2021-03-08 12:03:07 -0500 | received badge | ● Rapid Responder (source) |