Step-by-Step Guide to Integrating Hardware
Hello, I have a robot with 4 wheels, 2 DC motors with encoders, 2 DC motors without encoders, one Arduino Mega, 1 IMU sensor, and 1 Intel RealSense D415. I want to navigate the robot autonomously. I searched a lot but did not find relevant content. Please guide me by listing the steps; I will follow all of them to achieve my goal. Second thing: since I just want to navigate in the environment, is it necessary to create a URDF file and TF files? Please briefly guide me, as I am very new to ROS. Thank you.
Asked by Soleman on 2022-01-22 13:47:26 UTC
Answers
The standard approach to solving this problem is:
- Use the `ros_control` package to interface with your motor driver code; this is what translates your requested joint positions/velocities/efforts into actual motor movement.
- To perform sensor integration, you'll need to create a URDF of your robot. There are plenty of tutorials available for that; it's too lengthy to go into here. ROS will use this when performing various sensor/joint calculations for you.
- Once you can move the platform appropriately, you'll want to integrate your sensors to provide odometry data. The most common way to do this is via a Kalman filter; there are several packages for that kind of thing.
- With the ability to move the robot and detect where you're moving, you'll need to integrate your obstacle detection method. You've mentioned having the D415; that's a depth camera, so in your case you'll probably want to use something like `depthimage_to_laserscan` to produce something on the `/scan` topic for the rest of the stack.
- You should now have a robot that you can drive around, that has a basic understanding of where it is, and that can perceive obstacles. You're now ready to integrate SLAM (the actual mapping/autonomy part). So you'll next need to integrate the `move_base` package, which handles connecting all of your pieces together into a path planner.
- Next, you'll need to be able to generate a map. Your best bet here is `gmapping`; there are alternatives, but this is the most frequently used, so it will have the most helpful documentation for you to make use of.
- Finally, you'll need to be able to localize. `AMCL` is probably your best bet here, for the same reason as `gmapping`.
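To give a sense of what the URDF step involves, here is a minimal sketch of a skid-steer base. All link names, dimensions, and wheel placements below are placeholder assumptions you would adapt to your own robot's measurements:

```xml
<?xml version="1.0"?>
<robot name="my_robot">
  <!-- Base chassis; box dimensions are placeholder assumptions -->
  <link name="base_link">
    <visual>
      <geometry><box size="0.4 0.3 0.1"/></geometry>
    </visual>
  </link>
  <!-- One of four wheels; repeat with mirrored origins for the other three -->
  <link name="front_left_wheel">
    <visual>
      <geometry><cylinder radius="0.05" length="0.03"/></geometry>
    </visual>
  </link>
  <joint name="front_left_wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="front_left_wheel"/>
    <origin xyz="0.15 0.17 -0.05" rpy="-1.5708 0 0"/>
    <axis xyz="0 0 1"/>
  </joint>
</robot>
```

Tools like `robot_state_publisher` consume this file to broadcast the TF tree, which is why navigation effectively requires it even for a simple platform.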
That's the full formula to getting complete SLAM autonomy in ROS, best of luck!
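As a rough sketch of how the pieces above get wired together, a bringup launch file might look like the following. The camera topic remaps reflect common RealSense defaults, and the `my_robot` package name and config file path are assumptions to replace with your own:

```xml
<launch>
  <!-- Convert the D415 depth image into a 2D scan on /scan -->
  <node pkg="depthimage_to_laserscan" type="depthimage_to_laserscan"
        name="depthimage_to_laserscan">
    <remap from="image" to="/camera/depth/image_rect_raw"/>
    <remap from="camera_info" to="/camera/depth/camera_info"/>
  </node>

  <!-- Build the map from /scan plus your odometry -->
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping">
    <param name="base_frame" value="base_link"/>
    <param name="odom_frame" value="odom"/>
  </node>

  <!-- Path planning and goal execution -->
  <node pkg="move_base" type="move_base" name="move_base">
    <!-- Costmap/planner configs are assumed files you must author -->
    <rosparam file="$(find my_robot)/config/costmap_common.yaml" command="load"/>
  </node>
</launch>
```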
Answered by cst0 on 2022-01-23 23:35:26 UTC
Comments
Hey @cst0, thanks for helping me out. I am able to get a map using gmapping, but due to the unavailability of odometry, gmapping only maps around a fixed frame, meaning the map updates itself around a fixed location. How do I handle this? Please verify whether this approach is right or not. The second thing: I have an Arduino Mega, DC motors with encoders, and an IMU; how do I get odometry information using these resources? Please provide me the code that publishes odometry msgs and pose msgs using these resources.
Asked by Soleman on 2022-02-04 05:29:12 UTC
Like I mentioned, use ros_control. This will handle your encoders and will provide wheel odometry. The rest is the sensor fusion I mentioned. You need to get this working before you can think about mapping, because the lack of sensor fusion means that you do not have a correct odometry source for gmapping to use.
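To make the wheel-odometry part concrete: `diff_drive_controller` in ros_control computes this for you, but the underlying dead-reckoning math is worth understanding. Below is a simplified plain-Python sketch (not ros_control's actual implementation); the wheel radius, track width, and ticks-per-revolution are assumed values you must replace with your robot's measurements:

```python
import math

# Assumed robot parameters -- replace with your own measurements
WHEEL_RADIUS = 0.05      # meters
TRACK_WIDTH = 0.30       # meters, distance between left and right wheels
TICKS_PER_REV = 1440     # encoder ticks per wheel revolution

def ticks_to_distance(ticks):
    """Convert an encoder tick delta to linear wheel travel in meters."""
    return (ticks / TICKS_PER_REV) * 2.0 * math.pi * WHEEL_RADIUS

def update_odometry(x, y, theta, left_ticks, right_ticks):
    """Dead-reckon a new (x, y, theta) pose from per-wheel encoder deltas."""
    d_left = ticks_to_distance(left_ticks)
    d_right = ticks_to_distance(right_ticks)
    d_center = (d_left + d_right) / 2.0          # forward travel of the base
    d_theta = (d_right - d_left) / TRACK_WIDTH   # change in heading
    # Integrate at the midpoint heading for better accuracy over one step
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Driving straight: equal ticks on both wheels -> pure forward translation
x, y, theta = update_odometry(0.0, 0.0, 0.0, 1440, 1440)
print(x, y, theta)
```

In a real node you would publish this pose as a `nav_msgs/Odometry` message plus the `odom` → `base_link` TF, then fuse it with the IMU using a Kalman-filter package such as `robot_localization`, which is the sensor fusion step mentioned above.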
Asked by cst0 on 2022-02-04 10:36:09 UTC
Hey @cst0, how can I integrate the Arduino with ros_control? I have already integrated the Arduino using the rosserial_arduino library, but how will ros_control give commands to the Arduino? Please guide me.
Asked by Soleman on 2022-02-07 05:13:03 UTC