LIDAR sensing backwards
I am using an RPLIDAR A1M8 with ROS 2 and Navigation 2. I was getting very buggy and odd movement, and during SLAM testing I realized that the lidar is detecting backwards.
When I put my hand in front of the robot/LIDAR, the scan in RViz updates behind the robot, and left/right is also mirrored.
I can't find much of anything online about this issue. Has anyone seen this?
REP-103 (Standard Units of Measure and Coordinate Conventions), in its Coordinate Frame Conventions section, dictates that the "forward direction" is along X+.
Is your lidar (or its TF frame) mounted with its X+ axis pointing in the forward direction in your URDF?
If not, that could be the cause of what you describe.
And to be pedantic, your lidar is not "sensing backwards" (it can't do that). More likely, the data is being interpreted incorrectly due to a modelling error/mistake.
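One quick way to check, assuming your lidar's TF frame is named laser (substitute whatever frame name your driver and URDF actually use), is to echo the transform between base_link and the lidar frame:

ros2 run tf2_ros tf2_echo base_link laser

That shows how the lidar frame is currently modelled; if the reported yaw doesn't match how the unit is physically mounted, that would explain the mirrored scans.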
When I visualize the URDF model, the LIDAR is on the front of the robot and pointing forward along X+.
I looked into the URDF .xacro file and changed the last value of 'rpy' (the yaw) to 3.14, and that worked.
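For reference, the change was in the lidar joint's origin; a rough sketch of what that looks like (the joint/link names and xyz offset here are placeholders for whatever your xacro actually uses):

<joint name="laser_joint" type="fixed">
  <parent link="base_link"/>
  <child link="laser"/>
  <!-- rpy is roll pitch yaw; a yaw of 3.14 (~pi) rotates the laser frame 180 degrees about Z -->
  <origin xyz="0.10 0 0.15" rpy="0 0 3.14"/>
</joint>

Xacro also lets you write rpy="0 0 ${pi}" to avoid the small rounding error of 3.14.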
Now it senses properly and my localization is working. However, I am having two major issues I am trying to figure out:
1) I can use SLAM and save a map, but when I launch navigation it loses the map and starts in unknown space. Picture 1
2) I believe I have odometry issues. When I set the initial pose, the RViz map starts to vibrate, the planned path is super choppy, and the robot takes forever to complete even the simplest route ...(more)
@philmurp It's ok for you to answer your own question if you figured it out.
Regarding your new questions: this site is organized as one question per post. After you make an effort to search the site for answers to your new questions, feel free to ask a new question if you can't find a good answer. I'll point out that there are many existing answers related to configuring SLAM.
Good input, I should not have layered them onto this same question.
Actually, I think this is still giving me trouble. Although my laser scans make sense when I put objects up to the LIDAR, as you can see in this Picture, the frame of the laser is still opposite from the rest of the robot base. So when navigating, the arrow comes up pointing backwards while I move forwards; I can imagine this is causing a lot of my issues with obstacle avoidance etc.
I tried flipping all the parameters, but I just can't seem to figure out how to stop the incoming laser scans from flipping. Also, in that picture you can see the map and odom frames disconnected from the base, which I'm sure isn't good either.
There is nothing wrong with a sensor's transform frame not being aligned with the base_link frame. It happens all the time. If it bothers you, then change how the lidar is mounted on the robot (and make the corresponding change to the URDF). You have not provided enough information for me to know what "the arrow comes up pointing backwards" means.
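As long as the joint in your URDF matches how the lidar is physically mounted, TF rotates the scan into base_link/map for you, so the laser frame's axes pointing backwards in RViz is expected and harmless in that case. If you want to sanity-check that the whole tree (map -> odom -> base_link -> laser, assuming those are your frame names) is actually connected, you can generate a diagram of it with:

ros2 run tf2_tools view_frames

(on older distros the executable is view_frames.py), which writes out a frames.pdf you can inspect for broken links between odom, base_link, and the laser frame.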
When I change rpy to 3.14 as above, the laser scans behave normally: I put an obstacle in front of the robot and the scan shows it in front (whereas previously, when I put an obstacle in front of the robot, the scan behind the robot would adjust).
But the laser axes are opposite to the rest of the base: the laser's X-axis points backwards when I visualize the URDF / SLAM with axes shown. So when I run navigation, give a goal, and the robot starts moving towards that spot, RViz draws a "circle" around the robot and an arrow of movement, and that arrow points backwards as the robot moves forward.
I am not sure, but I think this might be why the robot doesn't avoid obstacles. I can usually send the robot to a location, but if I put something in front of ...(more)