
BuilderMike's profile - activity

2022-09-19 06:02:02 -0500 received badge  Popular Question (source)
2022-08-15 12:44:13 -0500 marked best answer custom message fails to build; no such file or directory

I followed the custom message documentation to the letter, and searched all the related questions on here, and unfortunately I am still stuck. Could you please have a look?

I am getting the same error building in Ubuntu 14.04 and 16.04, both running Kinetic.

All my packages that have custom messages fail to build (catkin_make exits with an error). The custom message definitions are part of the package, and are not from another package. The error is the same for every package with custom messages (from my limited understanding, the .msg file is not found for some reason): fatal error: ... : No such file or directory

[screenshot of the catkin_make error output]

The package organization follows the documentation:

[screenshot of the package directory layout]

where the custom message file is called my_msg.msg and is inside the msg directory. The contents of my_msg.msg are not suspect either:

int32 my_int
float64 my_float

The package.xml does contain the needed lines:

<build_depend>message_generation</build_depend>
<run_depend>message_runtime</run_depend>

The CMakeLists.txt contains everything the documentation mentions, and I even ran catkin_create_pkg test to make sure all the items are arranged in the correct order. Here is the file:

cmake_minimum_required(VERSION 2.8.3)

project(g_custom_messages)

find_package(catkin REQUIRED COMPONENTS 
    roscpp 
    rospy
    std_msgs 
    message_generation
)

add_message_files(
  FILES
  my_msg.msg
)

generate_messages(DEPENDENCIES std_msgs )

catkin_package(CATKIN_DEPENDS message_runtime)

include_directories(include ${catkin_INCLUDE_DIRS})

add_executable(custom_msg_subscriber src/custom_msg_subscriber.cpp)

target_link_libraries(custom_msg_subscriber ${catkin_LIBRARIES})

and in the source code (the file is named custom_msg_subscriber.cpp and lives in the src directory), I am referencing the custom message as #include <g_custom_messages/my_msg.h>, and accessing the data via:

void poseMessageReceived(const g_custom_messages::my_msg &msg) 
{
    storeInt = msg.my_int;
    storeFloat = msg.my_float;
}

All the web-search gymnastics I did have not helped. Could you please let me know why I am getting this error?
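
For reference, a frequent cause of this exact error is that catkin_make starts compiling custom_msg_subscriber.cpp before the my_msg.h header has been generated, because the executable target never declares a dependency on message generation. A minimal sketch of the missing piece, using the target and package names above:

add_executable(custom_msg_subscriber src/custom_msg_subscriber.cpp)

# Generate g_custom_messages/my_msg.h before compiling the subscriber;
# without this the build can fail with "No such file or directory".
add_dependencies(custom_msg_subscriber
  ${${PROJECT_NAME}_EXPORTED_TARGETS}
  ${catkin_EXPORTED_TARGETS}
)

target_link_libraries(custom_msg_subscriber ${catkin_LIBRARIES})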

2022-08-15 12:44:13 -0500 received badge  Good Answer (source)
2022-08-15 12:44:13 -0500 received badge  Enlightened (source)
2022-06-24 14:49:02 -0500 marked best answer How to utilize *.so file in CMakeLists.txt?

I am very new to ROS and C++, and I apologize if my question is obvious.

I have looked over very many questions and answers and tried the code they offered; I looked over the documentation at http://wiki.ros.org/catkin/CMakeLists... ; I moved files from /usr/local/lib to /opt/ros/indigo/lib; and I tried many different lines in CMakeLists.txt, and I still cannot make my code compile.

I am using Ubuntu 14.04, with Indigo and Catkin as the build tool.

The Error:

jimmy_state_publisher.cpp:(.text+0x310): undefined reference to `LJUSB_OpenDevice'
jimmy_state_publisher.cpp:(.text+0x81b): undefined reference to `LJUSB_Write'
jimmy_state_publisher.cpp:(.text+0x851): undefined reference to `LJUSB_CloseDevice'
jimmy_state_publisher.cpp:(.text+0x876): undefined reference to `LJUSB_Read'
lots more of the same type of error....

I am using a device called 'Labjack U3'. The driver for this device is called exodriver https://labjack.com/support/software/... . I downloaded the driver and installed it. It generated a liblabjackusb.so file in /usr/local/lib. All the undefined reference errors are methods/functions implemented in the driver source file. How do I modify my CMakeLists.txt to allow my only executable (jimmy_state_publisher.cpp) in the ROS package to use the liblabjackusb.so file? What modifications in the CMakeLists.txt file do I need to make for the package to compile?

CMakeLists.txt:

cmake_minimum_required(VERSION 2.8.3)

project(jimmy_v2)

find_package(catkin REQUIRED COMPONENTS roscpp rospy sensor_msgs std_msgs tf)

catkin_package()

include_directories(include ${catkin_INCLUDE_DIRS})

add_executable(jimmy_state_publisher src/jimmy_state_publisher.cpp)

target_link_libraries(jimmy_state_publisher ${catkin_LIBRARIES})

I did #include "labjackusb.h" in the executable file that I created.

How do I modify the CMakeLists.txt file to take the liblabjackusb.so file into consideration? Or, if that is the wrong question, how do I build my project successfully?

If I have omitted any information you need, I apologize; simply let me know what else I need to include.

Thank you in advance.
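
For reference, the undefined-reference errors mean the source compiles but the executable is never linked against liblabjackusb.so. A minimal sketch of the relevant CMakeLists.txt lines, assuming the library is in /usr/local/lib and labjackusb.h is in /usr/local/include:

# Locate the exodriver shared library (liblabjackusb.so).
find_library(LABJACKUSB_LIBRARY labjackusb PATHS /usr/local/lib)

include_directories(include ${catkin_INCLUDE_DIRS} /usr/local/include)

add_executable(jimmy_state_publisher src/jimmy_state_publisher.cpp)

# Link against both catkin and the exodriver library.
target_link_libraries(jimmy_state_publisher ${catkin_LIBRARIES} ${LABJACKUSB_LIBRARY})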

2022-03-21 23:38:55 -0500 received badge  Nice Question (source)
2022-01-04 08:42:03 -0500 marked best answer How to build a 3D model in RViz using data from a Sick LMS200?

Hello. I have a Sick LMS200-30106 (sitting on top of a servo) and it is connected to my PC via an RS422-to-USB adapter. Here is what it looks like:

[photo of the Sick LMS200 mounted on the servo]

I am using Ubuntu 14.04 with ROS indigo. I am using the sicktoolbox_wrapper package to pass the scan data to RViz (via the /scan topic). I am able to see a single scan line at a time in RViz:

[RViz screenshot showing a single scan line]

I am trying to create a 3D model like this gentleman: https://www.youtube.com/watch?v=-Tpqw... . In the comments he alludes to a SickLms package, but I have not been able to find it in the ROS packages in the software section. I played around a lot with all the settings of the LaserScan display type in RViz, and read http://wiki.ros.org/rviz/DisplayTypes... , and I could not figure out how to make the data appear in 3D. I also searched all over the web, but could not find any instructions on how to achieve this objective (except this - http://library.isr.ist.utl.pt/docs/ro... , but it is too complex for me to understand if this is what I need to pursue).

I am new to ROS, and from what I saw online I am not the first ROS beginner to pursue this objective. Could you please provide some guidance on how to do this? Does a package exist for this purpose? And if not, which packages do I need to learn to achieve my objective?

Thanks in advance!
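
For reference, one common approach here is the laser_assembler package: its laser_scan_assembler node buffers incoming sensor_msgs/LaserScan messages, transforms each one through tf (so the servo angle has to be published as a tf frame), and returns the accumulated points as one cloud on request. A minimal client sketch following the laser_assembler tutorial pattern; the assembled_cloud topic name is an assumption:

#include <ros/ros.h>
#include <sensor_msgs/PointCloud.h>
#include <laser_assembler/AssembleScans.h>

int main(int argc, char **argv)
{
  ros::init(argc, argv, "assemble_scans_client");
  ros::NodeHandle nh;

  // Wait for the laser_scan_assembler node to come up.
  ros::service::waitForService("assemble_scans");
  ros::ServiceClient client =
      nh.serviceClient<laser_assembler::AssembleScans>("assemble_scans");
  ros::Publisher pub =
      nh.advertise<sensor_msgs::PointCloud>("assembled_cloud", 1, true);

  // Ask for everything currently in the assembler's rolling buffer.
  laser_assembler::AssembleScans srv;
  srv.request.begin = ros::Time(0);
  srv.request.end = ros::Time::now();

  if (client.call(srv))
    pub.publish(srv.response.cloud);  // view this topic in RViz as a PointCloud
  else
    ROS_ERROR("Service call to assemble_scans failed");

  ros::spin();
  return 0;
}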

2021-11-08 18:12:11 -0500 received badge  Nice Question (source)
2021-10-28 01:16:45 -0500 received badge  Great Question (source)
2021-10-21 11:10:08 -0500 commented question advice for autodock vs. custom node

@gvdhoorn I have a feeling you're very aware of the robotics landscape, and I have a question. Are you aware of any robot co

2021-10-21 10:53:38 -0500 commented question advice for autodock vs. custom node

@gvdhoorn Thanks for your perspective on the package. I was already leaning towards writing my own code, and was looking

2021-10-20 15:55:00 -0500 asked a question advice for autodock vs. custom node

advice for autodock vs. custom node Hello. I built a custom mobile robot that runs Noetic on Ubuntu 20.04; and the ROS N

2021-10-07 02:02:00 -0500 received badge  Famous Question (source)
2021-07-25 20:34:00 -0500 marked best answer ERROR: cannot launch node of type [...]: can't locate node [...] in package [...]

This system is running Ubuntu 16.04 64-bit with Kinetic. The project builds properly with catkin_make, and source devel/setup.bash is run right after in the same terminal. As soon as the project is launched via roslaunch jimmy_lidar_motor_control lidar_motor_control.launch, the following error appears:

ERROR: cannot launch node of type [jimmy_lidar_motor_control/lidar_motor_control]: can't locate node [lidar_motor_control] in package [jimmy_lidar_motor_control]

The project structure is pretty standard:

└── src
    ├── jimmycpp
    │   ├── //unrelated files (I think)
    │   ├── //unrelated files (I think)
    │   └── //unrelated files (I think)
    └── jimmy_lidar_motor_control
        ├── CMakeLists.txt
        ├── include
        │   └── jimmy_lidar_motor_control
        │       └── lidar_motor_control.h
        ├── launch
        │   └── lidar_motor_control.launch
        ├── msg
        │   └── motion_command_to_execute.msg
        ├── package.xml
        └── src
            └── lidar_motor_control.cpp

The launch file is very simple:

<?xml version="1.0" encoding="UTF-8"?>
<launch>
    <node name="jimmy_lidar_motor_control_mynode" 
          pkg="jimmy_lidar_motor_control" 
          type="lidar_motor_control"
          output="screen"/>
 </launch>

To figure out which files ROS can "see", I ran the rosrun command and pressed TAB-TAB; all the files in the directory are visible:

mo@Home-W530:~/Desktop/workspace$ rosrun jimmy_lidar_motor_control 
CMakeLists.txt                  lidar_motor_control.launch
jimmy_lidar_motor_control_node  motion_command_to_execute.msg
lidar_motor_control.cpp         package.xml
lidar_motor_control.h

Everything is in C++ (compiled), so I don't think it's related to file permissions. Output of ls -la:

-rwxrwxrwx 1 mo mo 7287 Jan 19 12:32 lidar_motor_control.cpp

None of the questions I looked through solved my issue.

Here is the CMakeLists.txt in case the issue is there:

# What version of CMake is needed?
cmake_minimum_required(VERSION 2.8.3)

project(jimmy_lidar_motor_control)

## Compile as C++11, supported in ROS Kinetic and newer
add_compile_options(-std=c++11)

find_package(catkin REQUIRED COMPONENTS
  roscpp
  rospy
  std_msgs
  jimmycpp
  message_generation
)

#declare any custom files that I have created
add_message_files(
  FILES
  motion_command_to_execute.msg
)

#the custom message I am declaring relies on data types found here
generate_messages(DEPENDENCIES std_msgs )

catkin_package( 
  INCLUDE_DIRS include
#  LIBRARIES my_robot_base
  CATKIN_DEPENDS roscpp message_runtime
#  DEPENDS system_lib
)

# Specify locations of header files.
include_directories(include ${catkin_INCLUDE_DIRS})

#  NOTE: executable name and source file name should be the same!!!!
add_executable(${PROJECT_NAME}_node src/lidar_motor_control.cpp)

target_link_libraries(${PROJECT_NAME}_node serial ${catkin_LIBRARIES})

# this line is to be used anytime I am using custom messages. It prevents build
# errors when I run `catkin_make`. The first argument is the name of the
# executable target. The second argument is the name of the package plus
# “_generate_messages_cpp”. This solution to my problem is described on
# http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber(c%2B%2B)
add_dependencies(${PROJECT_NAME}_node
  ${${PROJECT_NAME}_EXPORTED_TARGETS}
  ${catkin_EXPORTED_TARGETS}
  jimmy_lidar_motor_control_generate_messages_cpp
)

In case the entire output of roslaunch jimmy_lidar_motor_control lidar_motor_control.launch (which causes this problem) is useful, here it is:

mo@Home-W530:~/Desktop/workspace$ roslaunch jimmy_lidar_motor_control lidar_motor_control.launch 
... logging to /home/mo/.ros/log/749beb5c-3ae5-11ea-9996-e09d3128cfa4/roslaunch-Home-W530-6823.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://Home-W530:37099/

SUMMARY
========

PARAMETERS
 * /rosdistro: kinetic
 * /rosversion: 1.12.14

NODES
  /
    jimmy_lidar_motor_control_mynode (jimmy_lidar_motor_control/lidar_motor_control)

auto-starting new ...
(more)
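
For reference, the rosrun TAB listing above already shows the name of the built executable: jimmy_lidar_motor_control_node (from add_executable(${PROJECT_NAME}_node ...)), while the launch file asks for type="lidar_motor_control". The type attribute has to match the executable name, not the source file name, so a likely fix (assuming nothing else is wrong) is:

<node name="jimmy_lidar_motor_control_mynode"
      pkg="jimmy_lidar_motor_control"
      type="jimmy_lidar_motor_control_node"
      output="screen"/>
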
2021-04-20 11:22:56 -0500 received badge  Good Question (source)
2021-03-02 03:36:10 -0500 marked best answer How to resize meshes in URDF?

I am currently drawing out my robot using URDF. I am at the end of the first URDF tutorial (see http://wiki.ros.org/urdf/Tutorials/Bu... ). My problem: once I added my mesh, it is very disproportionate to the rest of the drawing, and I have no idea how to resize the mesh. The tutorial states that: "Meshes can also be sized using relative scaling parameters or a bounding box size"

Where do I find an example? Where is the documentation? And if no documentation or examples exist, could you please provide an example of code for a "bounding box" or "relative scaling"?

More detail:

image before meshes are added in urdf file:

[screenshot: robot model before the mesh is added]

image after meshes are added in urdf file:

[screenshot: robot model after the mesh is added; it is badly out of proportion]

The code importing the meshes:

   <!-- *********************(21) LMS LINK - MESHES****************************-->
  <!-- a "link" is the part that we are creating. All the internal tags describe
       this part. This is the body of Jimmy (both square levels) -->
  <link name="LMS_link">
    <visual>
      <geometry>
        <!--box dimensions is Meters. L X W X H where the L X H is a rectangle, 
            and the H extrudes it upwards -->
        <mesh filename="package://urdf_tutorial/meshes/LMS-200-30106.dae"/>
      </geometry>
      <origin rpy="0 0 0" xyz="0 0 0"/>
      <material name="white"/>
    </visual>
  </link>

    <!-- *********************(22) LMS to LMS PLATE JOINT **********************-->
    <!-- a joint allows us to create a relationship between two links (parts)-->
      <joint name="LMS_to_LMS_plate_joint" type="fixed">
        <parent link="LMS_plate_link"/>
        <child link="LMS_link"/>
        <!-- this is the point at which the two parts attach to one another-->
        <origin xyz="0 0 0"/>
      </joint>

I am building a URDF because it was suggested here: https://answers.ros.org/question/2718...
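
For reference, the "relative scaling" the tutorial mentions is the scale attribute on the mesh tag; it multiplies the mesh geometry along x, y, and z. A minimal sketch; the 0.001 factor is only an assumption that the mesh was exported in millimetres rather than meters:

<mesh filename="package://urdf_tutorial/meshes/LMS-200-30106.dae" scale="0.001 0.001 0.001"/>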

2021-02-27 11:48:50 -0500 received badge  Favorite Question (source)
2020-12-12 16:01:00 -0500 commented question Intel Realsens on Ubuntu 20.04 + ROS noetic (installation desription)

@thodor I used a vanilla 20.04 that I downloaded on Oct 30; and I'm pretty sure the kernel was a 5.4.0, as that is the k

2020-11-22 05:21:07 -0500 marked best answer What are the steps to use diff_drive_controller in a physical robot?

What steps need to be followed to use diff_drive_controller on a physical robot that uses differential drive (including all the packages diff_drive_controller needs)?

I need a bridge between geometry_msgs/Twist messages and my physical robot.

[photo of my robot]

The code to send messages to my stepper drives is finished (the two robot wheels are turned by stepper motors that are controlled by Ethernet stepper drives). All I need is a way to translate geometry_msgs/Twist to my code that tells the stepper drives how to move (number of steps, accel, decel, velocity, etc....). This will allow me to:

  1. Test the robot, by publishing to the topic manually (and getting robot motion)
  2. Convert Navigation stack messages to robot motion.

I am running Ubuntu 16.04 with Kinetic.

Some background on the question: For this robot I did all the mechanical, electrical and coding (both design and implementation) myself. As you can imagine, I am not very strong at any one of those 3 things (I am a very, very persistent generalist). At this point I find myself in programmer territory, and I am getting to a point where the documentation is getting very hard to digest or I can't find any.

I have looked through the diff_drive_controller source code, and the package overall. I have looked over the diff_drive_controller wiki (including the so-called diff_drive_controller website). I have looked over all 25 FAQs. None of those places have tutorials, or anything I can use to get going. What I know is what I learned from the 25 FAQ questions and answers for diff_drive_controller:

  • I need to create a hardware_interface to abstract my hardware for the diff_drive_controller. I found a sample here: https://github.com/eborghi10/my_ROS_m... (see also the sketch after this question)
  • I have to describe my robot using robot_description.
  • I understand that ros_control is somehow related to ros_controllers and that diff_drive_controller is part of ros_controllers.
  • I now understand that this package is suitable for “real-time”, but I don't need that functionality.
  • I understand that diff_drive_controller is not a typical node, and I need a controller_manager.
  • the video https://vimeo.com/107507546 provided by Adolfo Rodriguez offered in one of the answers does a great job of explaining interaction between packages (as a high level overview).

The problem is that all those things that I understand do not get me any closer to utilizing these wonderful packages, as I have no idea where to start, where to continue, or what the finished code looks like. I am not the first person to build a robot with two drive wheels (it's the easiest robot to build) who insists on using ROS (thank you so much to all you ROS people, you are beautiful), who has many talents, but is not a programmer. Could you provide a list of steps that people like me can follow to convert geometry_msgs/Twist to the low-level code our robot understands? Could you please make each step small enough for a beginner to follow?

EDIT #1

The ros_control packages are all ...

(more)
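
For reference, the first concrete piece of code to write is the hardware_interface: a class derived from hardware_interface::RobotHW that exposes the two wheel joints, plus a main loop in which controller_manager runs diff_drive_controller on top of it. A minimal sketch, assuming C++11, joint names left_wheel_joint/right_wheel_joint, and a 50 Hz loop; read() and write() are where the Ethernet stepper-drive code would plug in:

#include <ros/ros.h>
#include <controller_manager/controller_manager.h>
#include <hardware_interface/joint_command_interface.h>
#include <hardware_interface/joint_state_interface.h>
#include <hardware_interface/robot_hw.h>

class MyRobotHW : public hardware_interface::RobotHW
{
public:
  MyRobotHW()
  {
    // Expose joint state (position/velocity/effort) for both wheels.
    state_interface_.registerHandle(hardware_interface::JointStateHandle(
        "left_wheel_joint", &pos_[0], &vel_[0], &eff_[0]));
    state_interface_.registerHandle(hardware_interface::JointStateHandle(
        "right_wheel_joint", &pos_[1], &vel_[1], &eff_[1]));
    registerInterface(&state_interface_);

    // Expose velocity commands, which diff_drive_controller writes to.
    vel_interface_.registerHandle(hardware_interface::JointHandle(
        state_interface_.getHandle("left_wheel_joint"), &cmd_[0]));
    vel_interface_.registerHandle(hardware_interface::JointHandle(
        state_interface_.getHandle("right_wheel_joint"), &cmd_[1]));
    registerInterface(&vel_interface_);
  }

  void read()  { /* update pos_[] and vel_[] from stepper-drive feedback */ }
  void write() { /* send cmd_[] (wheel velocities, rad/s) to the stepper drives */ }

private:
  hardware_interface::JointStateInterface state_interface_;
  hardware_interface::VelocityJointInterface vel_interface_;
  double cmd_[2] = {0, 0}, pos_[2] = {0, 0}, vel_[2] = {0, 0}, eff_[2] = {0, 0};
};

int main(int argc, char **argv)
{
  ros::init(argc, argv, "my_robot_hw");
  ros::NodeHandle nh;

  MyRobotHW robot;
  controller_manager::ControllerManager cm(&robot, nh);

  // The controller manager's service callbacks need their own spinner thread.
  ros::AsyncSpinner spinner(1);
  spinner.start();

  ros::Rate rate(50);
  ros::Time last = ros::Time::now();
  while (ros::ok())
  {
    ros::Time now = ros::Time::now();
    robot.read();
    cm.update(now, now - last);
    robot.write();
    last = now;
    rate.sleep();
  }
  return 0;
}

diff_drive_controller itself is then configured through a YAML file and spawned via controller_manager; it subscribes to geometry_msgs/Twist on its cmd_vel topic and converts each Twist into the two wheel velocity commands above.
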
2020-11-22 05:21:04 -0500 received badge  Good Question (source)
2020-10-31 18:06:54 -0500 edited answer PointCloud2 "Large" Points

It looks like this issue is a bug with RViz. See: https://github.com/ros-visualization/rviz/issues/1508 However, in th

2020-10-31 18:04:43 -0500 answered a question rviz crashes when point cloud style is changed to Points

I had the same issue. When changing to the Points, they are very large, and make the PC work extra hard. If other opti

2020-10-31 17:18:47 -0500 answered a question PointCloud2 "Large" Points

It looks like this issue is a bug with RViz. See: https://github.com/ros-visualization/rviz/issues/1508 However, in th

2020-10-31 15:30:12 -0500 commented question PointCloud2 "Large" Points

@eschoof I have the same issue. Did you find a work around or solution? Looks like it's an RViz bug: https://github.com/

2020-10-31 14:28:02 -0500 commented question PointCloud2 "Large" Points

@eschoof I have the same issue. Did you find a work around or solution?

2020-10-30 22:59:46 -0500 commented question Intel Realsens on Ubuntu 20.04 + ROS noetic (installation desription)

Thanks a ton! This really helped me!

2020-08-20 16:00:48 -0500 received badge  Nice Question (source)
2020-08-11 06:34:15 -0500 received badge  Famous Question (source)
2020-07-28 16:37:35 -0500 commented answer advice for SLAM with 3D lidar

@Dragonslayer THANK YOU!!! I didn't even think of the 60cm of dead space. That makes a HUGE impact on me! Thanks for poi

2020-07-27 08:50:53 -0500 commented answer advice for SLAM with 3D lidar

@Dragonslayer The lidar used to be in that position, and it's not ideal, as other issues start to happen (mapping algori

2020-07-27 07:33:37 -0500 marked best answer advice for SLAM with 3D lidar

Hello. The project is running Ubuntu 16.04, with Kinetic on an Intel PC.

Some background: I designed and built a robot, and was at the SLAM phase. The turtlebot tutorials (https://learn.turtlebot.com/) are a great guide to SLAM for a person like me. Then, I experienced a real kick in the pants - it turns out that the current offering of SLAM packages is geared towards horizontal (planar) lidar, and not vertical lidar like the one I built (see: https://answers.ros.org/question/3466...). Well, life is a learning experience so I built a new horizontal 3D lidar system:

[photo of the new horizontal 3D lidar system]

At present the new horizontal/planar lidar system is hanging onto the robot with zip ties, and needs to be mounted onto the robot:

[photo of the lidar system zip-tied to the robot]

Before I rip out the vertical 3D lidar system and replace it with the horizontal system, I need to decide the height at which to place the new horizontal lidar. I have a group of related questions that I hope will guide the placement:

  • It is my understanding that gmapping is the recommended mapping engine for the Navigation stack (https://wiki.ros.org/navigation/MapBu...). Is gmapping still the best tool for creating a 2D map (given the 3D lidar)?
  • I want to create a 2D map, but avoid obstacles using the full 3D lidar data (exactly like the video on the navigation stack home page: https://wiki.ros.org/navigation). Using the navigation stack with gmapping and amcl, will I be able to reach this objective?
  • Can you please recommend package combinations that will allow the robot to build a 2D map, localize in the map, and navigate to points on the map while avoiding obstacles using the full horizontal 3D lidar data?

I am pre-emptively asking these questions because I don't want to rebuild my robot only to learn later that I positioned the lidar in a way that does not work optimally with the current SLAM offerings.

Thanks a ton for your time!

Mike


Edit #1 - I appreciate all suggestions for packages that prevent bumping into the top of the table as I navigate around the table legs; I also don't want to run over things lying on the floor. On closer analysis, it looks like the PR2 has a pitching lidar AND a fixed lidar, AND RGB-D cameras. I am really hoping to get some guidance on package selection for building a map, localizing, and navigating to points on the map (a map of my apartment - small area). I would prefer to do it all via lidar, but if more "cheap" hardware will really help, I am very open to those suggestions. Any full working solutions are very, very appreciated. Thanks again.
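
For reference, a common pattern with a horizontal 3D lidar is to flatten the cloud into a planar scan for gmapping and amcl, while feeding the full cloud to the navigation stack's voxel-based obstacle layer for 3D obstacle avoidance. A minimal launch sketch; the /lidar_points topic and the frame names are assumptions for this robot:

<launch>
  <!-- Flatten the 3D cloud into a 2D scan that gmapping and amcl can consume. -->
  <node pkg="pointcloud_to_laserscan" type="pointcloud_to_laserscan_node" name="cloud_to_scan">
    <remap from="cloud_in" to="/lidar_points"/>
  </node>

  <!-- Build the 2D map from the flattened scan. -->
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping">
    <param name="base_frame" value="base_link"/>
    <param name="odom_frame" value="odom"/>
  </node>
</launch>

move_base would then list the raw cloud as a PointCloud2 observation source in its costmap configuration, so obstacles like the table top appear in the costmap even though they never intersect the flattened scan.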

2020-07-26 13:13:16 -0500 commented answer advice for SLAM with 3D lidar

@stevemacenski Hi Steve, I implemented your answer in my physical robot. During the lidar motion when lidar is horizont
