Robotics StackExchange | Archived questions

Use a position sensor for Localization

I have a sensor that can give me absolute position in space with respect to a point that I choose, as well as rotation about the z axis. I have two questions:

1) I created a struct in C++ named POSE that holds four floats: pose.x, pose.y, pose.z and pose.angle. This struct will contain my sensor data. I want to publish this struct over a topic and then write a subscriber to read it. This is what I have done so far:

int main (int argc, char** argv){
    ros::init(argc, argv, "example_node");
    ros::NodeHandle nh;
    ros::Publisher read_pub = nh.advertise<POSE>("read", 1000);
    ros::Rate loop_rate(1);

    while(ros::ok()){
        read_pub.publish(pose);
        ros::spinOnce();
        loop_rate.sleep();
    }
    return 0;
}

and the listener is:

void chatterCallback(const POSE& pose)
{
  cout << "I heard\t" << pose.x << endl;
  cout << "I heard\t" << pose.y << endl;
  cout << "I heard\t" << pose.z << endl;
  cout << "I heard\t" << pose.angle << endl;
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "listener");
  ros::NodeHandle n;
  ros::Subscriber sub = n.subscribe("read", 1000, chatterCallback);
  ros::spin();
  return 0;
}

with all the necessary includes and the definition of the struct POSE, but it doesn't work. How can I fix it?
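For reference, roscpp can only serialize message types generated from .msg files (or types with hand-written serialization traits), not arbitrary C++ structs, which is why `advertise<POSE>` fails. A minimal sketch of an equivalent custom message definition (the package name `my_robot_msgs` and file name are hypothetical):

```
# my_robot_msgs/msg/Pose2DStamped.msg  (hypothetical package and file)
std_msgs/Header header   # timestamp and frame id
float32 x
float32 y
float32 z
float32 angle            # rotation about z, radians
```

After catkin generates the headers, the publisher would use `advertise<my_robot_msgs::Pose2DStamped>`. Alternatively, the existing geometry_msgs/PoseStamped already covers these fields, with the orientation expressed as a quaternion.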

2) I want to use my sensor data to localize and drive my robot. Which package should I use? And how can I convert my data into a standard ROS message for localization?

Asked by Filippo on 2020-04-16 08:39:19 UTC

Comments

The standard Odometry message might be ideal (link text); it is very often used as the basis of navigation, path planning etc. It is often the result of a sensor fusion node that makes it more accurate. It seems in your case the data is accurate anyway, so I would say it's the way to go. May I ask what kind of system you are using to get this data? For IMUs there is another standard message (link text).

Asked by Dragonslayer on 2020-04-16 09:32:24 UTC

@Dragonslayer thanks for your help. But what is PoseTwistCovariance and how do I get it? I'm using a sensor based on UWB with a magnetometer inside that gives me the rotation.

Asked by Filippo on 2020-04-16 09:44:35 UTC

The good old covariance matrix, a love/hate topic. If your position is "good" you shouldn't bother too much (have a read: link text). As I understand it, it's used to account for "noisy"/imperfect data. Your pose data is most likely already conditioned.

Asked by Dragonslayer on 2020-04-16 10:34:32 UTC

@Dragonslayer you suggest using nav_msgs/Odometry, but I don't have any sensor that can give me velocity; I can only receive absolute position x, y, z and a rotation angle. What do you think about geometry_msgs/PoseStamped (http://docs.ros.org/melodic/api/geometry_msgs/html/msg/PoseStamped.html)? And what package should I use for localization?

Asked by Filippo on 2020-04-17 01:33:24 UTC

From position and time one can calculate/approximate velocity. I think the ekf_localization node has this function; I think it's called "differential" there, and there should be code for this. The reason for suggesting odometry is that it's used by all localization, navigation and mapping packages I have encountered. Not having an odometry topic might give you lots of problems down the line when those nodes demand it.

Asked by Dragonslayer on 2020-04-17 09:18:24 UTC

@Dragonslayer all clear! I'll look at the ekf_localization node, thanks. But which package do you suggest I start with for localization and navigation?

Asked by Filippo on 2020-04-17 09:33:38 UTC

As my suggested "fake" odometry message is already corrected, there is no need for additional localization, since that is used for correcting odometry drift/error. Serious navigation includes obstacle avoidance, but for this you need a lidar, a 3D camera, or at minimum ultrasonic sensors or similar; at this point your robot has no clue about the world around it. You might try a drawn map and the amcl node connected to move_base with its path planning nodes. However, my first question would be how to control the motors. You will get cmd_vel commands out of move_base and you have to interpret them for your motor controls and get them out of your PC to the robot hardware = HardwareInterface. If this is up and running you can look into ros_control and all its available controllers. Is your robot diff drive?

Asked by Dragonslayer on 2020-04-17 11:16:39 UTC

@Dragonslayer but I have a question. I have only one sensor (which gives me x, y, z position and a quaternion). Is it correct to use the ekf_localization node? What I learned is that an EKF uses more than one input to obtain better odometry.

Asked by Filippo on 2020-04-20 09:02:11 UTC

It's a little difficult, as all ROS nodes are made for a specific purpose, and repurposing them might not work depending on the actual programming. The best way of finding a solution in ROS, as I have learned, is to start with a running system, a TurtleBot for example, and then go from there. There are so many variables that can play tricks on you that you never really know what the actual problem is.

When it comes to ekf_localization, everywhere I read "odometry", so without an odometry message published somewhere it is likely to fail. But odometry is just x, y and angular z information (2D), and you seem to have that information. Then again, if you pack that into an odometry message, I don't see where ekf_localization would give you any additional benefit at this time, without an IMU etc. There is a base_controller.py in the Tork-A/Spur directory on GitHub where an odometry message is created; it's just about 20 lines of code without the complex trigonometry. (GitHub seems to be down for me at the moment.)

Asked by Dragonslayer on 2020-04-21 11:23:53 UTC

Answers