Use a position sensor for Localization

asked 2020-04-16 08:39:19 -0600

Filippo

updated 2020-04-16 08:42:00 -0600

I have a sensor that can give me the absolute position in space with respect to a point that I can choose, plus the rotation about the z axis. I have two questions:

1) I created a C++ struct named POSE that has four float members: pose.x, pose.y, pose.z, pose.angle. This struct will contain my sensor data. I want to publish this struct over a topic and then write a subscriber to read it. This is what I have done so far:

int main(int argc, char** argv) {
  ros::init(argc, argv, "example_node");
  ros::NodeHandle nh;
  ros::Subscriber write_sub = nh.subscribe("write", 1000, write_callback);
  ros::Publisher read_pub = nh.advertise<POSE>("read", 1000);
  ros::Rate loop_rate(1);

  while (ros::ok()) {
    read_pub.publish(pose);
    ros::spinOnce();
    loop_rate.sleep();
  }
  return 0;
}

and the listener is:

void chatterCallback(const POSE& pose)
{
  cout << "I heard\t" << pose.x << "\n";
  cout << "I heard\t" << pose.y << "\n";
  cout << "I heard\t" << pose.z << "\n";
  cout << "I heard\t" << pose.angle << endl;
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "listener");
  ros::NodeHandle n;
  ros::Subscriber sub = n.subscribe("chatter", 1000, chatterCallback);
  ros::spin();
  return 0;
}

with all the required includes and the definition of the POSE struct, but it doesn't work. How can I fix it?

2) I want to use my sensor data to localize and drive my robot. Which package should I use? And how can I convert my data into a standard ROS message for localization?


Comments

The standard nav_msgs/Odometry message might be ideal (link text); it is very often used as the basis for navigation, path planning, etc. It is often the result of a sensor fusion node that makes it more accurate. It seems that in your case the data is accurate anyway, so I would say it's the way to go. May I ask what kind of system you are using to get this data? For IMUs there is another standard message (link text).

Dragonslayer (2020-04-16 09:32:24 -0600)

@Dragonslayer thanks for your help. But what is PoseTwistCovariance and how do I get it? I'm using a UWB-based sensor with a magnetometer inside that gives me the rotation.

Filippo (2020-04-16 09:44:35 -0600)

Good old covariance matrix, it's a love-hate topic. If your position is "good" you shouldn't bother too much (have a read: link text). As I understand it, it's used to account for "noisy"/imperfect data. Your pose data is most likely already conditioned.

Dragonslayer (2020-04-16 10:34:32 -0600)

@Dragonslayer you suggest using nav_msgs/Odometry, but I don't have any sensor that can give me velocity; I can only receive the absolute position x, y, z and a rotation angle. What do you think about geometry_msgs/PoseStamped (http://docs.ros.org/melodic/api/geome...)? And which package should I use for localization?

Filippo (2020-04-17 01:33:24 -0600)
From position and time one can calculate/approximate velocity. I think the ekf_localization node has this function; I think it's called "differential" there, and there should be code for this. The reason for suggesting Odometry is that it's used by all the localization, navigation, and mapping packages I have encountered. Not having an odometry topic might give you lots of problems down the line when other nodes demand it.

Dragonslayer (2020-04-17 09:18:24 -0600)

@Dragonslayer all clear! I'll look at the ekf_localization node, thanks. But which package do you suggest I start with for localization and navigation?

Filippo (2020-04-17 09:33:38 -0600)

As my suggested "fake" odometry message is already correct, there is no need for additional localization, since localization is used for correcting odometry drift/error. Serious navigation includes obstacle avoidance, but for that you need a lidar, a 3D camera, or at minimum ultrasonic sensors or similar; at this point your robot has no clue about the world around it. You might try a drawn map and the amcl node connected to move_base with its path-planning nodes. However, my first question would be how to control the motors. You will get cmd_vel commands out of move_base, and you have to interpret them for your motor controls and get them out of your PC to the robot hardware (a HardwareInterface). If this is up and running, you can look into ros_control and all its available controllers. Is your robot diff-drive?

Dragonslayer (2020-04-17 11:16:39 -0600)

@Dragonslayer but I have one more question. I only have one sensor (which gives me x, y, z position and a quaternion). Is it correct to use the ekf_localization node? What I learnt is that an EKF uses more than one input to obtain better odometry.

Filippo (2020-04-20 09:02:11 -0600)