Hi @jimc91
I am going to try to answer your question. I am sure there are community members here who know more than me about this particular topic, but I will try, so here we go:
- `sensor_node_pkg` refers to the sensor drivers you need to produce sensor readings; for your particular setup, a lidar and a camera.
- `sensor_node_type` refers to the ROS wrapper node that takes the sensor readings and converts them into usable ROS msgs.
- `sensor_node_name` is self-explanatory and refers to a unique name identifying each sensor used.
- `sensor_param` refers to all the parameters needed for your sensor configuration, which you may find in your sensor's official repository, or even in already-made ROS wrappers for those sensors. Since you have a camera, you will also need to determine its intrinsic and extrinsic calibration parameters.
- `odom_node_pkg` refers to any package or utility able to provide some sort of odometry information. You have multiple approaches for this.
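To make this concrete, here is a minimal sketch of how those placeholders map onto a ROS launch file. The specific package, node, and parameter names below (e.g. `urg_node`, `laser_link`) are just illustrative assumptions for a lidar, not something taken from your setup:

```xml
<launch>
  <!-- sensor_node_pkg / sensor_node_type / sensor_node_name -->
  <node pkg="urg_node" type="urg_node" name="laser_scanner" output="screen">
    <!-- sensor_param: driver-specific configuration -->
    <param name="frame_id" value="laser_link" />
  </node>
</launch>
```

You would repeat a `<node>` block like this for each sensor (camera, lidar, ...), each with its own unique name and parameters.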
Finally, `transform_configuration_pkg` refers to having a proper `tf_tree` relating all the robot frames you may have. If you have any doubts about tfs and frames, you can check this. The usual (and standard) tf tree is needed to know the relation between points in space in any robot frame in which you can work. So, for a basic mobile robot you will have something like:
World --> Map --> Odom --> Base_footprint --> Base_link --> wheels, sensors, ...
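To see why this chain matters, here is a small self-contained Python sketch (plain 2D math, no ROS required) of how a point known in the robot's base frame is carried up the tree into the `map` frame by composing transforms. All the numbers are made up for illustration:

```python
import math

def make_tf(x, y, theta):
    """2D homogeneous transform of a child frame in its parent frame."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x],
            [s,  c, y],
            [0,  0, 1]]

def compose(a, b):
    """Chain two transforms: parent_T_mid * mid_T_child."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(tf, px, py):
    """Express a point given in the child frame in the parent frame."""
    return (tf[0][0] * px + tf[0][1] * py + tf[0][2],
            tf[1][0] * px + tf[1][1] * py + tf[1][2])

# Made-up transforms along the standard chain:
map_T_odom  = make_tf(5.0, 2.0, 0.0)          # e.g. published by AMCL
odom_T_base = make_tf(1.0, 0.0, math.pi / 2)  # e.g. from wheel odometry

# A point 1 m in front of the robot (base frame),
# expressed in the map frame: approximately (6.0, 3.0)
map_T_base = compose(map_T_odom, odom_T_base)
print(apply(map_T_base, 1.0, 0.0))
```

This composition is exactly what tf does for you automatically once every link in the tree has a publisher.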
And this is used to operate properly in the frame in which you are navigating with your robot. From `Base_footprint` onward the tfs are usually static, so they can be loaded from the `robot_description`; the rest, however, are not static and are usually produced by your odometry/localization nodes. For instance, AMCL is able to produce the transformation between `map` and `odom`, allowing you to localize in the `map` frame; `robot_localization` can be configured to produce several transformations as well; and even your own odometry node can have a transform broadcaster that publishes the `Odom --> Base_footprint` transform.
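For the static part of the tree, if you are not yet loading a full `robot_description`, a quick way to publish a fixed sensor transform is tf's `static_transform_publisher`. The frame names and offsets here are just an illustrative assumption:

```xml
<!-- base_link -> laser_link: args are x y z yaw pitch roll parent child period_ms -->
<node pkg="tf" type="static_transform_publisher" name="base_to_laser"
      args="0.2 0.0 0.1 0 0 0 base_link laser_link 100" />
```

Once the static links are published and your localization nodes provide the dynamic ones, the whole chain from `map` down to your sensors resolves.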
So, to sum up: you will need sensor drivers to produce proper readings, ROS wrappers to use those sensors with ROS utilities, a good source of odometry information, and a proper tf tree to be able to localize yourself in the environment.
Hope that helps you understand these things. If anyone is willing to add more information about this, I will be glad to discuss it here.
Regards.