So, how should I interpret the sensor_msgs/LaserScan.angle_min?

Hello,

I am using the RPLidar A2 on my ROSbot 2.0 for environment detection.

I am currently trying to implement a side-distance measurer that reads the distance to four sides, call them NORTH, EAST, SOUTH and WEST. However, rather than taking these directions relative to the robot body, I want to measure them at fixed global angles of 0, 90, 180 and 270 degrees, taking the robot's initial theta orientation as 0.

To do this, I implemented a function that takes a global angle and, based on the scan attributes and the current yaw, returns the index into ranges at which that angle is found.

angle_to_index = lambda angle: int( ((angle - (yaw+scan.angle_min)) % (2*math.pi))/scan.angle_increment )
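For reference, here is the same lookup spelled out as a full function. The scan attributes below are hypothetical stand-ins for a real sensor_msgs/LaserScan message (the angle_increment and number of beams are assumed, not taken from the actual driver):

```python
import math

class Scan:
    """Hypothetical stand-in for a sensor_msgs/LaserScan message."""
    angle_min = -3.12                     # radians, in the sensor frame
    angle_increment = 2 * math.pi / 720   # assumed ~0.5 degree resolution
    ranges = [1.0] * 720                  # dummy readings

scan = Scan()
yaw = 0.0  # current robot yaw in the global frame (radians)

def angle_to_index(angle):
    """Map a global angle (radians) to an index into scan.ranges.

    Beam i points at angle_min + i * angle_increment in the sensor
    frame; adding yaw converts that to the global frame. Inverting:
    subtract (yaw + angle_min), wrap into [0, 2*pi), then divide by
    the increment.
    """
    rel = (angle - (yaw + scan.angle_min)) % (2 * math.pi)
    idx = int(rel / scan.angle_increment)
    # Guard against floating-point rounding past the last beam.
    return min(idx, len(scan.ranges) - 1)
```

With yaw = 0, requesting the global angle equal to scan.angle_min maps to index 0, which is where the question about the meaning of angle_min comes in.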

Let's assume the initial orientation is the robot facing NORTH. When I request the readings at degree 0 with angle_to_index(0), the readings match the SOUTH direction instead.

That makes me think: if scan.angle_min is -3.12, which I assume is relative to the front of the robot, then scan.ranges[0] should hold a reading from the back of the vehicle, right? But in practice the readings around scan.angle_min correspond to the front of the robot, which confuses me.

So, how should I interpret the sensor_msgs/LaserScan.angle_min?