Robot distance/angle positioning with sensors
Hello guys,
In my project, which I successfully simulated in Gazebo/ROS, my robot needs to position itself exactly perpendicular to a metallic surface: specifically, 3 meters from the side door of a car, so that the front of the robot points in the same direction as the car.
I successfully solved the issue in Gazebo ROS using a lidar sensor.
The algorithm works the following way: I calculate the distance and angle to the target surface using three different ranges from the lidar scan. With a few trigonometric functions I compute the angle correction that must be applied to the robot's base_link so that it is perfectly perpendicular to the target.
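For reference, the trigonometry can be sketched roughly like this (a minimal version, assuming the two side beams sit at a known angular offset on either side of the forward beam; the function and parameter names are illustrative, not from the original code):

```python
import math

def alignment_error(r_left, r_center, r_right, beam_offset):
    """Estimate the angle (rad) between the robot's forward axis and the
    surface normal, from three lidar ranges.

    r_left / r_right: ranges of the beams at -beam_offset / +beam_offset rad
    r_center: range of the forward beam (useful as the standoff distance)
    beam_offset: angular offset of the side beams from the forward beam (rad)
    """
    # Convert the two side returns to Cartesian points in the lidar frame
    # (x forward, y to the left).
    xl, yl = r_left * math.cos(-beam_offset), r_left * math.sin(-beam_offset)
    xr, yr = r_right * math.cos(beam_offset), r_right * math.sin(beam_offset)
    # The surface is the line through the two points. The robot is
    # perpendicular to it when that line is parallel to the y-axis,
    # i.e. when the x-difference between the two points is zero.
    return math.atan2(xr - xl, yr - yl)
```

When the two side ranges are equal the function returns 0 (already perpendicular); otherwise the sign tells you which way to rotate base_link. In practice you would average several beams per side to smooth out scan noise.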
So everything is perfect in simulation!
The pain now comes in the real world...
I tested my align-base function in the real world using an RPLidar (RoboPeak lidar). I noticed that the lidar scan output on a black metal surface is very unreliable. Note that black metal is very common on a car body (I haven't tested other colors yet).
Now my question is: did I use the wrong type of sensor for this sort of application, or is there any workaround you can suggest to solve the issue?
Can't wait to hear your opinions!
Thanks!
Hello! I want to create a specific car model to use in ROS, and I can't find any tutorials or examples of creating cars from scratch (the cars should look good, like yours) and simulating them in RViz. Do you have any ideas about which programs or packages I should use? I know my question is off-topic, but I can't send you a private message :) Thanks a lot!