2022-09-19 06:02:02 -0500 | received badge | ● Popular Question (source) |
2022-08-15 12:44:13 -0500 | marked best answer | custom message fails to build; no such file or directory I followed the custom message documentation to the letter, searched all the related questions on here, and unfortunately I am still stuck. Could you please have a look? I am getting the same error building on Ubuntu 14.04 and 16.04, both running Kinetic. All my packages that have custom messages fail to build ( The package organization follows the documentation: where the custom message file is called my_msg.msg and is inside the msg directory. The contents of The package.xml does contain the needed lines: The CMakeLists.txt contains everything the documentation mentions, and I even ran and in the source code (file name is All the web search gymnastics I did have not helped. Could you please let me know why I am getting this error?
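The archived question lost its embedded code blocks, so for context, the pieces the ROS msg tutorial requires look roughly like this. This is a sketch assembled from the standard catkin documentation, assuming only the message file name (my_msg.msg) mentioned in the question; target names are illustrative:

```cmake
# CMakeLists.txt fragments required for custom message generation (sketch)
find_package(catkin REQUIRED COMPONENTS roscpp std_msgs message_generation)
add_message_files(FILES my_msg.msg)            # file lives in msg/my_msg.msg
generate_messages(DEPENDENCIES std_msgs)
catkin_package(CATKIN_DEPENDS message_runtime std_msgs)
# any target that #includes the generated header must wait for generation,
# e.g. (hypothetical target name):
# add_dependencies(my_node ${PROJECT_NAME}_generate_messages_cpp)
```

package.xml additionally needs `<build_depend>message_generation</build_depend>` and `<run_depend>message_runtime</run_depend>` (or `<exec_depend>message_runtime</exec_depend>` in format 2). A missing `add_dependencies` line is a classic cause of a "No such file or directory" error on the generated message header.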
2022-08-15 12:44:13 -0500 | received badge | ● Good Answer (source) |
2022-08-15 12:44:13 -0500 | received badge | ● Enlightened (source) |
2022-06-24 14:49:02 -0500 | marked best answer | How to utilize *.so file in CMakeLists.txt? I am very new to ROS and C++, and I apologize if my question is obvious. I have looked over very many questions and answers and tried the code offered by the answers; I looked over the documentation at http://wiki.ros.org/catkin/CMakeLists... ; I moved files from /usr/local/lib to /opt/ros/indigo/lib; and I tried very many different lines of code in CMakeLists.txt and I still cannot make my code compile. I am using Ubuntu 14.04, with Indigo and Catkin as the build tool. The Error: I am using a device called 'Labjack U3'. The driver for this device is called exodriver https://labjack.com/support/software/... . I downloaded the driver and installed it. It generated a liblabjackusb.so file in /usr/local/lib. All the undefined reference errors are methods/functions implemented in the driver source file. How do I modify my CMakeLists.txt to allow my only executable (jimmy_state_publisher.cpp) in the ROS package to use the liblabjackusb.so file? What modifications in the CMakeLists.txt file do I need to make for the package to compile? CMakeLists.txt: I did How do I modify the CMakeLists.txt file to take the labjackusb.so file into consideration? Or, if that is the wrong question, how do I build my project successfully? If any information that you need is omitted, I apologize; simply let me know what else I need to include. Thank you in advance.
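For readers hitting the same undefined-reference errors: linking an external shared library in catkin usually comes down to locating it and adding it to `target_link_libraries`. A minimal sketch, assuming the executable name from the question (jimmy_state_publisher) and the stock exodriver install location (/usr/local/lib):

```cmake
# locate liblabjackusb.so, which exodriver installs to /usr/local/lib
find_library(LABJACKUSB_LIB labjackusb PATHS /usr/local/lib)

add_executable(jimmy_state_publisher src/jimmy_state_publisher.cpp)
target_link_libraries(jimmy_state_publisher
  ${catkin_LIBRARIES}
  ${LABJACKUSB_LIB}
)
```

If the driver's header is not found at compile time, its include directory (typically /usr/local/include for exodriver) may also need to be added via `include_directories(...)`.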
2022-03-21 23:38:55 -0500 | received badge | ● Nice Question (source) |
2022-01-04 08:42:03 -0500 | marked best answer | How to build a 3D model in RViz using data from a Sick LMS200? Hello. I have Sick LMS200 30106 (sitting on top of a servo) and it is connected to my PC via RS422 to USB. Here is what it looks like: I am using Ubuntu 14.04 with ROS indigo. I am using the sicktoolbox_wrapper package to pass the scan data to RViz (via the /scan topic). I am able to see a single scan line at a time in RViz: I am trying to create a 3D model like this gentleman: https://www.youtube.com/watch?v=-Tpqw... . In the comments he alludes to a SickLms package, but I have not been able to find it in the ROS packages in the software section. I played a bunch with all the settings in the LaserScan type in RViz, and read http://wiki.ros.org/rviz/DisplayTypes... , and I could not figure out how to make the data appear in 3D. I also searched all over the web, but could not find any instructions on how to achieve this objective (except this - http://library.isr.ist.utl.pt/docs/ro... , but it is too complex for me to understand if this is what I need to pursue). I am new to ROS, and from what I saw online I am not the first person who is very new to ROS to seek this objective. Could you please provide some guidance on how to do this? Does a package exist for this purpose? And if not, which packages do I need to learn to achieve my objective? Thanks in advance! |
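The underlying idea the question is reaching for, and what packages like laser_assembler automate, is projecting each planar scan into 3D using the servo angle at which it was taken. A minimal sketch of that math, assuming the servo pitches the scan plane about the sensor's y axis (on a real robot the frames should come from tf, not hand-rolled trigonometry):

```python
import math

def scan_to_3d(ranges, angle_min, angle_increment, tilt):
    """Project one planar lidar scan into 3D points.

    ranges: list of range readings (metres)
    angle_min, angle_increment: bearing of ranges[0] and step between beams (rad)
    tilt: servo pitch angle of the scan plane (rad)
    Assumes rotation about the sensor's y axis; purely illustrative frames.
    """
    points = []
    for i, r in enumerate(ranges):
        theta = angle_min + i * angle_increment
        # point in the untilted scan plane (z = 0)
        x, y, z = r * math.cos(theta), r * math.sin(theta), 0.0
        # rotate about the y axis by the tilt angle
        points.append((x * math.cos(tilt) + z * math.sin(tilt),
                       y,
                       -x * math.sin(tilt) + z * math.cos(tilt)))
    return points
```

Accumulating the output of many such scans (one per servo step) into a single sensor_msgs/PointCloud2 is exactly what the laser_assembler package does, so that is usually the better route than writing this by hand.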
2021-11-08 18:12:11 -0500 | received badge | ● Nice Question (source) |
2021-10-28 01:16:45 -0500 | received badge | ● Great Question (source) |
2021-10-21 11:10:08 -0500 | commented question | advice for autodock vs. custom node @gvdhoorn I have a feeling you are very aware of the robotic landscape, and I have a question. Are you aware of any robot co
2021-10-21 10:53:38 -0500 | commented question | advice for autodock vs. custom node @gvdhoorn Thanks for your perspective on the package. I was already leaning towards writing my own code, and was looking
2021-10-20 15:55:00 -0500 | asked a question | advice for autodock vs. custom node Hello. I built a custom mobile robot that runs Noetic on Ubuntu 20.04; and the ROS N
2021-10-07 02:02:00 -0500 | received badge | ● Famous Question (source) |
2021-07-25 20:34:00 -0500 | marked best answer | ERROR: cannot launch node of type [...]: can't locate node [...] in package [...] This system is running Ubuntu 16.04 64bit with Kinetic. The project builds properly with The project structure is pretty standard: the launch file is very simple: To figure out what files ROS can "see" I ran the rosrun command and pressed TAB-TAB, and all the files in the directory are visible: Everything is in C++ so I don't think it's related to file permissions (output of All the questions I looked through did not solve my issue:
Here is the CMakeLists.txt in case the issue is there: in case the entire output of
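A frequent cause of this particular error (beyond forgetting to re-source the workspace's setup.bash) is that the launch file's `type` attribute names the source file rather than the CMake target. A sketch with hypothetical package and node names:

```xml
<!-- "type" must match the add_executable(...) target name in CMakeLists.txt,
     i.e. "my_node", not "my_node.cpp" -->
<launch>
  <node pkg="my_package" type="my_node" name="my_node" output="screen"/>
</launch>
```

If `rosrun my_package <TAB><TAB>` lists the executable but roslaunch cannot find it, comparing the `type` value against the exact target name is the first thing to check.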
2021-04-20 11:22:56 -0500 | received badge | ● Good Question (source) |
2021-03-02 03:36:10 -0500 | marked best answer | How to resize meshes in URDF? I am currently drawing out my robot using URDF. I am at the end of the first URDF tutorial (see http://wiki.ros.org/urdf/Tutorials/Bu... ). My problem: once I added my mesh, it is very disproportionate to the rest of the drawing, and I have no idea how to resize the mesh. The tutorial states that: "Meshes can also be sized using relative scaling parameters or a bounding box size" Where do I find an example? Where is the documentation? And if no documentation or examples exist, could you please provide an example of code for a "bounding box" or "relative scaling"? More detail: image before meshes are added in urdf file: image after meshes are added in urdf file: the code importing the meshes: I am building a URDF because it was suggested here: https://answers.ros.org/question/2718...
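For anyone landing here with the same question: the URDF `<mesh>` element accepts a `scale` attribute with per-axis multipliers. A sketch (the filename is hypothetical; a factor of 0.001 is the common fix for meshes modelled in millimetres, since URDF units are metres):

```xml
<visual>
  <geometry>
    <!-- scale multiplies the mesh along x y z -->
    <mesh filename="package://my_robot/meshes/base.dae" scale="0.001 0.001 0.001"/>
  </geometry>
</visual>
```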
2021-02-27 11:48:50 -0500 | received badge | ● Favorite Question (source) |
2020-12-12 16:01:00 -0500 | commented question | Intel Realsens on Ubuntu 20.04 + ROS noetic (installation desription) @thodor I used a vanilla 20.04 that I downloaded on Oct 30; and I'm pretty sure the kernel was a 5.4.0, as that is the k |
2020-11-22 05:21:07 -0500 | marked best answer | What are the steps to use diff_drive_controller in a physical robot? What steps need to be followed to use I need a bridge between The code to send messages to my stepper drives is finished (the two robot wheels are turned via stepper motors that are controlled by Ethernet stepper drives). All I need is a way to translate
I am running Ubuntu 16.04 with Kinetic. Some background on the question: For this robot I did all the mechanical, electrical and coding (both design and implementation) myself. As you can imagine, I am not very strong at any one of those 3 things (I am a very, very persistent generalist). At this point I find myself in programmer territory, and I am getting to a point where the documentation is getting very hard to digest or I can't find any. I have looked through the
The problem is that all those things that I understand do not get me any closer to utilizing these wonderful packages, as I have no idea where to start, where to continue, and what the finished code looks like. I am not the first person to build a robot with two drive wheels (as it's the easiest robot to build), who insists on using ROS (thank you so much to all you ROS people, you are beautiful), who has many talents, but is not a programmer. Could you provide a list of steps that people like me can follow to convert geometry_msgs/Twist to low level code our robot understands? Could you please make each step small enough for a beginner to follow? EDIT #1
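The Twist-to-wheel translation the question asks about is standard differential-drive inverse kinematics, the same math diff_drive_controller applies internally (the proper integration route is implementing a hardware_interface::RobotHW, but the core conversion is small). A sketch with illustrative parameter names, not the controller's API:

```python
def twist_to_wheel_speeds(linear_x, angular_z, wheel_separation, wheel_radius):
    """Differential-drive inverse kinematics: geometry_msgs/Twist -> wheel rad/s.

    linear_x: forward velocity (m/s), angular_z: yaw rate (rad/s),
    wheel_separation: distance between wheel centers (m),
    wheel_radius: wheel radius (m). All names are illustrative.
    """
    v_left = linear_x - angular_z * wheel_separation / 2.0   # rim speed, m/s
    v_right = linear_x + angular_z * wheel_separation / 2.0  # rim speed, m/s
    # convert rim speed to wheel angular velocity for the stepper drives
    return v_left / wheel_radius, v_right / wheel_radius

# e.g. driving straight at 0.2 m/s with a 0.5 m track and 0.05 m wheels:
# twist_to_wheel_speeds(0.2, 0.0, 0.5, 0.05) -> (4.0, 4.0)
```

Feeding the two returned angular velocities to the left and right stepper drives (converted to steps/s using the drive's steps-per-revolution) closes the gap between a /cmd_vel subscriber and the finished motor code.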
2020-11-22 05:21:04 -0500 | received badge | ● Good Question (source) |
2020-10-31 18:06:54 -0500 | edited answer | PointCloud2 "Large" Points It looks like this issue is a bug with RViz. See: https://github.com/ros-visualization/rviz/issues/1508 However, in th |
2020-10-31 18:04:43 -0500 | answered a question | rviz crashes when point cloud style is changed to Points I had the same issue. When changing the style to Points, the points are very large and make the PC work extra hard. If other opti
2020-10-31 18:03:25 -0500 | edited answer | PointCloud2 "Large" Points It looks like this issue is a bug with RViz. See: https://github.com/ros-visualization/rviz/issues/1508 However, in th |
2020-10-31 17:18:47 -0500 | answered a question | PointCloud2 "Large" Points It looks like this issue is a bug with RViz. See: https://github.com/ros-visualization/rviz/issues/1508 However, in th |
2020-10-31 15:30:12 -0500 | commented question | PointCloud2 "Large" Points @eschoof I have the same issue. Did you find a work around or solution? Looks like it's an RViz bug: https://github.com/ |
2020-10-31 14:28:02 -0500 | commented question | PointCloud2 "Large" Points @eschoof I have the same issue. Did you find a work around or solution? |
2020-10-30 22:59:46 -0500 | commented question | Intel Realsens on Ubuntu 20.04 + ROS noetic (installation desription) Thanks a ton! This really helped me! |
2020-08-20 16:00:48 -0500 | received badge | ● Nice Question (source) |
2020-08-11 06:34:15 -0500 | received badge | ● Famous Question (source) |
2020-07-28 16:37:35 -0500 | commented answer | advice for SLAM with 3D lidar @Dragonslayer THANK YOU!!! I didn't even think of the 60cm of dead space. That makes a HUGE impact on me! Thanks for poi |
2020-07-27 08:50:53 -0500 | commented answer | advice for SLAM with 3D lidar @Dragonslayer The lidar used to be in that position, and it's not ideal, as other issues start to happen (mapping algori
2020-07-27 07:33:37 -0500 | marked best answer | advice for SLAM with 3D lidar Hello. The project is running Ubuntu 16.04, with Kinetic on an Intel PC. Some background: I designed and built a robot, and was at the SLAM phase. The turtlebot tutorials (https://learn.turtlebot.com/) are a great guide to SLAM for a person like me. Then, I experienced a real kick in the pants - it turns out that the current offering of SLAM packages is geared towards horizontal (planar) lidar, and not vertical lidar like the one I built (see: https://answers.ros.org/question/3466...). Well, life is a learning experience so I built a new horizontal 3D lidar system: At present the new horizontal/planar lidar system is hanging onto the robot with zip ties, and needs to be mounted onto the robot: Before I rip out the vertical 3D lidar system, and replace it with the horizontal system I need to decide the height at which to place the new horizontal lidar. I have a group of related questions that I am hoping will guide the placement:
I am pre-emptively asking these questions because I don't want to rebuild my robot only to later learn I positioned the lidar in a way that does not work optimally with the current SLAM offerings. Thanks a ton for your time! Mike Edit #1 - I appreciate all suggestions for packages that prevent bumping into the top of the table as I navigate around the table legs; also I don't want to run over things laying on the floor. On closer analysis, it looks like the PR2 has a pitching lidar AND a fixed lidar, AND rgbd cameras. I am really hoping to get some guidance on package selection for building a map, localizing, and navigating to points on the map (map of my apartment - small area). I would prefer to do it all via lidar, but if more "cheap" hardware will really help, I am very open to those suggestions. Any full working solutions are very, very appreciated. Thanks again.
2020-07-26 13:13:16 -0500 | commented answer | advice for SLAM with 3D lidar @stevemacenski Hi Steve, I implemented your answer in my physical robot. During the lidar motion when lidar is horizont |