2022-10-12 04:49:36 -0500 | marked best answer | IIWA giving position but not moving I'm trying to work through a demo provided by ROS-I and get my IIWA moving. For reference, this is the demo with instructions I'm referring to: https://industrial-training-master.re... . I have read through both versions of the wiki: https://github.com/IFL-CAMP/iiwa_stac... and https://bitbucket.org/khansari/iiwa/w... . My current issue is that I can launch the ROS node on the IIWA and it seems to run; attached is an image of what the teach pendant shows while the ROS node is running. I run the ROS master side with simulated sensors and a real robot. It connects, and the simulation gets the current starting position of my robot, but once planning is done and I tell it to execute on the hardware, nothing happens. The wiki has an FAQ entry saying that the robot reporting its position but not moving is a networking issue. I checked the locations it suggested looking into, but it still isn't working. The other thing is that I originally used IP addresses of 192.168.1.100 and 192.168.1.120 for the KONI network and my ROS master, but I remembered seeing something in the KUKA manual about invalid IP addresses, so I changed these to the addresses the tutorial says it uses: 160.69.69.69 and 160.69.69.100. I'm still really new to ROS, so there may be troubleshooting tools available that I'm not aware of. Let me know if there is anything specific I'm not including here. I don't think I'm adding an image correctly; I think this link will take you to my teach pendant image: https://photos.app.goo.gl/WidKQxkFfsk... |
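For a "robot reports position but won't move" symptom, a few generic ROS 1 networking checks often narrow things down. This is a diagnostic sketch, not the iiwa_stack procedure; the IPs are the tutorial's example addresses and the topic names are assumptions to adapt to your setup:

```shell
# On the ROS master PC: confirm the environment the nodes actually see.
echo $ROS_MASTER_URI   # must be a URI the robot controller can reach
echo $ROS_IP           # e.g. 160.69.69.100, the address the robot uses to reach this PC

# Basic reachability toward the robot's KONI interface.
ping -c 3 160.69.69.69

# When you press execute, verify command traffic is actually flowing.
rostopic list | grep -i iiwa
rostopic echo -n 1 /iiwa/state/JointPosition
```

If `rostopic echo` works on the PC but the robot never moves, the return path (robot back to PC) is the usual suspect, which is why `ROS_IP` matters on both sides.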
2022-03-14 05:07:24 -0500 | received badge | ● Famous Question (source) |
2022-02-14 02:57:37 -0500 | received badge | ● Famous Question (source) |
2021-03-17 05:58:46 -0500 | received badge | ● Famous Question (source) |
2021-03-17 05:58:46 -0500 | received badge | ● Notable Question (source) |
2021-03-17 05:58:46 -0500 | received badge | ● Popular Question (source) |
2020-12-05 10:41:55 -0500 | received badge | ● Famous Question (source) |
2020-03-02 23:36:55 -0500 | received badge | ● Famous Question (source) |
2020-02-24 04:25:33 -0500 | received badge | ● Notable Question (source) |
2020-02-24 04:25:33 -0500 | received badge | ● Popular Question (source) |
2019-12-17 11:41:02 -0500 | received badge | ● Notable Question (source) |
2019-11-12 13:30:32 -0500 | received badge | ● Notable Question (source) |
2019-11-12 13:30:32 -0500 | received badge | ● Popular Question (source) |
2019-10-26 03:35:33 -0500 | received badge | ● Taxonomist |
2019-07-10 13:57:05 -0500 | edited question | Trouble moving from simulation to robot Trouble moving from simulation to robot I have a kuka iiwa that I'm trying to move using moveit using the iiwa stack htt |
2019-07-10 13:35:19 -0500 | asked a question | Trouble moving from simulation to robot Trouble moving from simulation to robot I have a kuka iiwa that I'm trying to move using moveit using the iiwa stack htt |
2019-07-10 12:34:41 -0500 | marked best answer | Robot arm colliding with workcell I'm having an issue in my robot cell where the robot is sitting on the edge of a table, and the table is outlined by four posts. Running the robot to a random location in the center of the table, there is about a 50 percent chance of the robot pathing through one of the posts close to it. I've read through some of the MoveIt docs, such as http://docs.ros.org/melodic/api/movei... , and I've tried to play with "longest_valid_segment_fraction" for my planner, but I don't think I've seen any changes. I've also tried to make the collision tags for each of the posts slightly larger, so the sample checker has a higher probability of detecting a collision, but that hasn't helped either. My first question is: how exactly do I use "longest_valid_segment_fraction"? My assumption was to place it inside the specific motion planner header like the following. Are there any other ways of preventing the collisions? Thanks |
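For reference, this parameter conventionally lives in the MoveIt config package's `ompl_planning.yaml`, set per planning group rather than inside an individual planner's header. A sketch, assuming a typical MoveIt layout (the group name `manipulator` and the value shown are placeholders):

```yaml
# ompl_planning.yaml (in <robot>_moveit_config/config/)
manipulator:                               # your planning group name
  longest_valid_segment_fraction: 0.005    # fraction of the space between collision checks;
                                           # smaller = denser checking, slower planning
  planner_configs:
    - RRTConnectkConfigDefault
```

Lowering the fraction makes the planner check collisions at finer intervals along each candidate segment, which is the usual remedy when paths clip thin obstacles like posts.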
2019-07-10 12:34:32 -0500 | answered a question | Robot arm colliding with workcell I would first check your collision geometry. Does your cell have proper collision geometry specified? C |
2019-07-10 12:34:32 -0500 | received badge | ● Rapid Responder (source) |
2019-07-10 12:31:40 -0500 | commented question | Robot arm colliding with workcell You were right, I forgot that I set an artificial ceiling so I didn't need to make collision objects for the upper rails. |
2019-07-10 10:09:46 -0500 | received badge | ● Famous Question (source) |
2019-07-10 10:09:46 -0500 | received badge | ● Notable Question (source) |
2019-07-10 09:04:15 -0500 | commented question | Robot arm colliding with workcell I fixed it but can't reopen it myself. |
2019-07-10 08:58:19 -0500 | edited question | Robot arm colliding with workcell Robot arm colliding with workcell I'm having an issue in my robot cell where the robot is sitting on the edge of a table |
2019-07-10 08:57:33 -0500 | edited question | Robot arm colliding with workcell Robot arm colliding with workcell I'm having an issue in my robot cell where the robot is sitting on the edge of a table |
2019-07-10 08:55:37 -0500 | edited question | Robot arm colliding with workcell Robot arm colliding with workcell I'm having an issue in my robot cell where the robot is sitting on the edge of a table |
2019-07-10 07:48:29 -0500 | asked a question | Robot arm colliding with workcell Robot arm colliding with workcell I'm having an issue in my robot cell where the robot is sitting on the edge of a table |
2019-07-03 09:18:43 -0500 | marked best answer | Trouble with Approximate time sync I've been trying to get approximate time sync to work on a project for a few days now; the callback just never seems to fire. I decided to start over and make a couple of simple nodes to look at everything in a simpler project, as I've read forum posts saying people struggle to sync more than two topics. So I made a simple publisher node that publishes three header messages on individual topics, and a simple subscriber node that tries to sync the first two topics for now. I did everything following the usage guide on the message_filters page. I can't even get the simple example to build (my original does build). The specific error is: 'value' is not a member of 'ros::message_traits::TimeStamp<std_msgs::String_<std::allocator<void>>, void>'. I found a forum post mentioning that a similar error was caused by not using a message with a header. My guess is that I'm doing something small, like not initializing the headers properly. I'll post the code from both nodes and hopefully someone has a suggestion. |
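For intuition (this is not the message_filters API itself), the core idea behind the ApproximateTime policy can be sketched in plain Python: messages are paired only when their timestamps fall within a slop window, which is why a type without a header, like `std_msgs/String`, cannot participate at all. The function below is a simplified greedy version of the real algorithm:

```python
def approx_pairs(stamps_a, stamps_b, slop):
    """Greedy sketch of approximate-time pairing: for each stamp on topic A,
    take the closest unused stamp on topic B if it lies within `slop` seconds.
    The real message_filters policy is more sophisticated, but the key
    requirement is the same: every message must carry a timestamp."""
    pairs = []
    used = set()
    for a in stamps_a:
        best = None
        for i, b in enumerate(stamps_b):
            if i in used:
                continue
            if abs(a - b) <= slop and (
                best is None or abs(a - b) < abs(a - stamps_b[best])
            ):
                best = i
        if best is not None:
            used.add(best)
            pairs.append((a, stamps_b[best]))
    return pairs

# Only stamps within 0.1 s of each other get matched:
print(approx_pairs([0.0, 1.0, 2.0], [0.05, 1.4, 2.02], slop=0.1))
# → [(0.0, 0.05), (2.0, 2.02)]
```

The message at t=1.0 is dropped because its nearest partner (1.4) is outside the slop, mirroring how a synchronized callback simply never fires for unmatched messages.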
2019-07-02 13:17:35 -0500 | answered a question | Trouble with Approximate time sync I could be wrong, but a "message with a header" is not the same as "a header" (as in: a std_msgs/Header doesn't |
2019-07-02 13:17:35 -0500 | received badge | ● Rapid Responder |
2019-07-02 13:16:01 -0500 | commented question | Trouble with Approximate time sync That did it. I moved to PointCloud2 as the message type and it started firing. Thanks! |
2019-07-02 08:01:20 -0500 | edited question | Trouble with Approximate time sync Trouble with Approximate time sync I've been trying to get approximate time sync to work on a project for a few days now |
2019-07-02 07:57:46 -0500 | edited question | Trouble with Approximate time sync Trouble with Approximate time sync I've been trying to get approximate time sync to work on a project for a few days now |
2019-07-02 07:55:32 -0500 | asked a question | Trouble with Approximate time sync Trouble with Approximate time sync I've been trying to get approximate time sync to work on a project for a few days now |
2019-06-21 10:54:45 -0500 | received badge | ● Teacher (source) |
2019-06-21 10:54:45 -0500 | received badge | ● Self-Learner (source) |
2019-06-21 07:56:59 -0500 | edited answer | Simulating real camera data The answer was suggested by mch, This message means you don't have TF (http://wiki.ros.org/tf) between those cam |
2019-06-21 07:55:33 -0500 | marked best answer | Simulating real camera data Hi guys, is there a way I can simulate camera data, similarly to how rosbag works? I tried rosbag, but the recording was missing TF data, so that didn't work. Is it as simple as adding another topic to record? Looking through the available topics, I don't see anything that looks like what I need. In essence, what I want to do is record the data stream of my cameras to pass to my perception pipeline, so I can start working on making the robot move to a pick location. I'd like to work at my desk if possible and not at the robot, since there isn't much room for my laptop there. Thanks |
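Since the accepted fix was recording `/tf_static` as well, a typical record-and-replay sketch looks like the following. The camera topic names are placeholders; only `/tf` and `/tf_static` are the essential additions:

```shell
# Record the camera streams plus both TF topics; /tf_static is a latched
# topic and is easy to miss when listing topics explicitly.
rosbag record /camera/color/image_raw /camera/depth/points /tf /tf_static -O cameras.bag

# Replay at the desk; --clock publishes simulated time for nodes that
# set the use_sim_time parameter.
rosbag play --clock cameras.bag
```

Recording topics explicitly (rather than `-a`) keeps the bag small and avoids pulling in high-rate topics that can overwhelm the recorder.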
2019-06-21 07:55:33 -0500 | received badge | ● Scholar (source) |
2019-06-21 07:55:04 -0500 | answered a question | Simulating real camera data The answer was suggested by mch, I needed to also record tf_static. |
2019-06-21 07:52:58 -0500 | received badge | ● Commentator |
2019-06-21 07:52:58 -0500 | commented question | Simulating real camera data Sorry it took so long to reply, your suggestion was correct. Thanks! |
2019-06-07 19:58:53 -0500 | received badge | ● Famous Question (source) |
2019-06-07 19:58:53 -0500 | received badge | ● Notable Question (source) |
2019-05-29 23:09:18 -0500 | received badge | ● Popular Question (source) |
2019-05-28 10:24:32 -0500 | commented question | Simulating real camera data When I subscribe to all topics on the rosbag with -a, it crashes my software and it won't run. I went back and rerecorded the c |