
Step-by-step guide to integrating hardware

asked 2022-01-22 12:47:26 -0600

Soleman

Hello, I have a robot with 4 wheels: 2 DC motors with encoders and 2 DC motors without encoders. I also have one Arduino Mega, 1 IMU sensor, and 1 Intel RealSense D415. I want the robot to navigate autonomously. I have searched a lot but have not found relevant content. Please guide me by listing the steps; I will follow them all to achieve my goal. Second, since I just want to navigate in the environment, is it necessary to create a URDF file and TF files? Please briefly guide me, as I am very new to ROS. Thank you.


1 Answer


answered 2022-01-23 22:35:26 -0600

cst0

The standard approach to solving this problem is:

  • Use the ros_control package to interface with your motor driver code; this is what translates your requested joint positions/velocities/efforts into actual motor movement.
  • To perform sensor integration, you'll need to create a URDF of your robot. There are plenty of tutorials available for that; it's too lengthy to go into here. ROS will use this when performing various sensor/joint calculations for you.
  • Once you can move the platform appropriately, you'll want to integrate your sensors to provide odometry data. The most common way to do this is via a Kalman filter; there are several packages for that kind of thing.
  • With the ability to move the robot and detect where you're moving, you'll need to integrate your obstacle detection method. You've mentioned having the D415; that's a depth camera, so in your case you'll probably want to use something like depthimage_to_laserscan to produce something on the /scan topic for the rest of the stack.
  • You should now have a robot that you can drive around, that has a basic understanding of where it is, and that can perceive obstacles. You're now ready to integrate SLAM (the actual mapping/autonomy part). So you'll next need to integrate the move_base package, which handles connecting all of your pieces into a path planner.
  • Next, you'll need to be able to generate a map. Your best bet here is gmapping; there are alternatives, but this is the most frequently used, so it will have the most helpful documentation for you to make use of.
  • Finally, you'll need to be able to localize. AMCL is probably your best bet here, for the same reason as gmapping.

That's the full recipe for achieving complete SLAM autonomy in ROS. Best of luck!
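To make the odometry step above concrete, here is a minimal, ROS-free sketch of differential-drive dead reckoning: the arithmetic that turns encoder ticks into the pose a wheel-odometry source publishes. The wheel radius, separation, and ticks-per-revolution below are hypothetical example values; substitute your robot's measured parameters.

```python
import math

# Hypothetical robot parameters -- replace with your measured values.
WHEEL_RADIUS = 0.05      # meters
WHEEL_SEPARATION = 0.30  # meters, between the two encoder wheels
TICKS_PER_REV = 360      # encoder ticks per full wheel revolution

def ticks_to_distance(ticks):
    """Convert encoder ticks to linear wheel travel in meters."""
    return 2.0 * math.pi * WHEEL_RADIUS * ticks / TICKS_PER_REV

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Dead-reckon a new (x, y, theta) from this interval's encoder ticks."""
    d_left = ticks_to_distance(left_ticks)
    d_right = ticks_to_distance(right_ticks)
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / WHEEL_SEPARATION
    # Integrate using the heading at the middle of the interval.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    # Wrap the heading to (-pi, pi].
    theta = (theta + d_theta + math.pi) % (2.0 * math.pi) - math.pi
    return x, y, theta

# Example: both wheels advance one full revolution, so the robot drives
# straight ahead by roughly one wheel circumference (~0.314 m).
print(update_pose(0.0, 0.0, 0.0, 360, 360))
```

In ROS this pose would be stamped and published as a `nav_msgs/Odometry` message plus an `odom -> base_link` transform, which is exactly what ros_control's diff_drive_controller does for you once the encoders are wired in.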


Comments

Hey @cst0, thanks for helping me out. I am able to get a map using gmapping, but due to the unavailability of odometry, gmapping only maps around a fixed frame, meaning the map updates itself around a fixed location. How do I address this? Please verify whether this approach is right or not. Second, I have an Arduino Mega, DC motors with encoders, and an IMU; how do I get odometry information using these resources? Please provide me with code that publishes odometry and pose messages using these resources.

Soleman ( 2022-02-04 04:29:12 -0600 )

Like I mentioned, use ros_control. This will handle your encoders and will provide wheel odometry. The rest is the sensor fusion I mentioned. You need to get this working before you can think about mapping, because the lack of sensor fusion means that you do not have a correct odometry source for gmapping to use.

cst0 ( 2022-02-04 09:36:09 -0600 )
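The sensor fusion cst0 refers to is usually handled by an EKF package such as robot_localization, but the underlying idea can be sketched with a toy complementary filter that blends a fast-but-drifting gyro integration with a slow-but-bounded wheel-odometry heading. Everything below (the ALPHA gain, the simulated rates) is a made-up illustration, not a drop-in ROS node:

```python
# Toy complementary filter: a crude stand-in for the EKF fusion that
# packages like robot_localization provide. ALPHA is a made-up gain
# controlling how much we trust the gyro between odometry corrections.
ALPHA = 0.98

def fuse_heading(theta_est, gyro_rate, dt, theta_odom):
    """One filter step: integrate the gyro, then blend in odometry."""
    predicted = theta_est + gyro_rate * dt
    return ALPHA * predicted + (1.0 - ALPHA) * theta_odom

theta = 0.0
# Simulate 100 steps of 0.1 s: the gyro reports a constant 0.1 rad/s
# turn, and wheel odometry agrees on the accumulated heading, so the
# fused estimate tracks the true heading (1.0 rad after 10 s).
for step in range(1, 101):
    theta = fuse_heading(theta, 0.1, 0.1, 0.1 * 0.1 * step)
print(round(theta, 3))
```

In practice the two sources disagree (gyro bias, wheel slip), and that disagreement is exactly what the fusion smooths out; an EKF additionally weights each source by its covariance instead of a fixed gain.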

Hey @cst0, how can I integrate the Arduino with ros_control? I have already integrated the Arduino using the rosserial_arduino library, but how will ros_control give commands to the Arduino? Please guide me.

Soleman ( 2022-02-07 04:13:03 -0600 )
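For context on what ros_control computes before any command reaches the Arduino: diff_drive_controller splits the requested body velocity (`cmd_vel`) into per-wheel angular velocities, and your hardware interface then forwards those targets to the motors (e.g. over rosserial). A rough sketch of that split, with hypothetical example values for wheel radius and separation:

```python
# Conceptual sketch of the inverse differential-drive kinematics that
# diff_drive_controller applies to a cmd_vel before the hardware
# interface (and ultimately the Arduino) sees any wheel target.
# Parameters are made-up example values.
WHEEL_RADIUS = 0.05      # meters
WHEEL_SEPARATION = 0.30  # meters

def cmd_vel_to_wheel_speeds(linear, angular):
    """Return (left, right) wheel angular velocities in rad/s for a
    requested body linear velocity (m/s) and yaw rate (rad/s)."""
    v_left = linear - angular * WHEEL_SEPARATION / 2.0
    v_right = linear + angular * WHEEL_SEPARATION / 2.0
    return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS

# Drive straight at 0.5 m/s: both wheels spin at ~10 rad/s.
print(cmd_vel_to_wheel_speeds(0.5, 0.0))
# Turn in place at 1 rad/s: the wheels spin in opposite directions.
print(cmd_vel_to_wheel_speeds(0.0, 1.0))
```

The Arduino's job is then just the last mile: take those wheel targets, run a PID loop against the encoder feedback, and drive the motor PWM accordingly.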


Stats

Asked: 2022-01-22 12:47:26 -0600

Seen: 326 times

Last updated: Jan 23 '22