Ask Your Question

Aquas's profile - activity

2021-05-26 15:36:55 -0500 marked best answer Robot localization fusion

Hello!!

I have to fuse two sources: IMU data (accelerometer and gyroscope) and another source that provides the difference between the previous pose and the current one, given as a translation vector and a quaternion.

IMU data is in the base_link frame and the other source is in the reference frame (I selected odom). I thought of using a PoseWithCovarianceStamped for the second source, but I'm not sure if robot_localization works with variations in pose, so the questions are:

Is there any way I can fuse them with robot_localization? Since the second source does not give a complete pose but a change in pose, do I have to modify this data? Should I add up all the previous estimations plus the current one to get the "global" pose (instead of the variation) and send that to the EKF along with the IMU?
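If you do integrate the per-frame deltas yourself before sending a global pose to the EKF, the composition is p_new = p_old + R(q_old)·Δp and q_new = q_old ⊗ Δq. A minimal sketch in plain C++ (no ROS types; the struct and function names here are made up for illustration, not robot_localization API):

```cpp
#include <cmath>

// Minimal quaternion (w, x, y, z) -- just enough to compose rotations.
struct Quat { double w, x, y, z; };

// Hamilton product a * b.
Quat qmul(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// Rotate vector v by unit quaternion q: v' = q * v * q^-1.
void qrotate(const Quat& q, const double v[3], double out[3]) {
    Quat p{0.0, v[0], v[1], v[2]};
    Quat qc{q.w, -q.x, -q.y, -q.z};
    Quat r = qmul(qmul(q, p), qc);
    out[0] = r.x; out[1] = r.y; out[2] = r.z;
}

struct Pose { double p[3]; Quat q; };

// Fold one incremental measurement (dp, dq), expressed in the previous
// body frame, into the accumulated global pose.
void accumulate(Pose& pose, const double dp[3], const Quat& dq) {
    double world_dp[3];
    qrotate(pose.q, dp, world_dp);           // rotate delta into the world frame
    for (int i = 0; i < 3; ++i) pose.p[i] += world_dp[i];
    pose.q = qmul(pose.q, dq);               // compose orientations
}
```

Note that integrating deltas like this accumulates drift and makes the covariance you report to the EKF grow without bound, which is one reason the twist-based approach discussed below is usually preferred.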

Thank you.

EDIT:

I was trying to fuse just my estimation to see how it performed, this is the yaml code:

frequency: 5
sensor_timeout: 2

two_d_mode: false

transform_time_offset: 0.0
transform_timeout: 0.0

print_diagnostics: false
debug: true
debug_out_file: /home/javier/pruebas/MasPruebas/Data/file.txt
publish_tf: true
publish_acceleration: false

map_frame: map              # Defaults to "map" if unspecified
odom_frame: odom            # Defaults to "odom" if unspecified
base_link_frame: erlecopter/imu_link  # Defaults to "base_link" if unspecified
world_frame: odom           # Defaults to the value of odom_frame if unspecified

odom0: /Camera_estimation
odom0_config: [false, false, false,
               false, false, false,
               true,  true,  true,
               true,  true,  true,
               false, false, false]
odom0_differential: false
odom0_relative: true

odom0_queue_size: 2


#imu0: /erlecopter/imu
#imu0_config:  [false, false, false,
#               false, false, false,
#               false, false, false,
#               true,  true,  true,
#               true, true, true]
#imu0_relative: true
#imu0_queue_size: 100


process_noise_covariance:  [diag(small values)]


initial_estimate_covariance: [diag(1e-6)]

And the main code is something like this:

void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
   /* Estimate displacements (omitted) */

      double interval = (stamp - Prev_time).toSec();

      nav_msgs::Odometry pub_msg;
      pub_msg.header.seq = seq;
      pub_msg.header.stamp = stamp;
      pub_msg.header.frame_id = "odom";

      pub_msg.child_frame_id = "odom";

      // Convert per-frame displacements into velocities
      pub_msg.twist.twist.linear.x = tras.at<double>(0) / interval;
      pub_msg.twist.twist.linear.y = tras.at<double>(1) / interval;
      pub_msg.twist.twist.linear.z = tras.at<double>(2) / interval;

      pub_msg.twist.twist.angular.x = (angles.roll - Prev_angles.roll) / interval;
      pub_msg.twist.twist.angular.y = (angles.pitch - Prev_angles.pitch) / interval;
      pub_msg.twist.twist.angular.z = (angles.yaw - Prev_angles.yaw) / interval;

      pub_msg.twist.covariance = Covariance_default;

      Pub_estimation.publish(pub_msg);

      Prev_time = stamp;
      Prev_angles = angles; // keep the reference for the next finite difference

}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "image2angles");
  ros::NodeHandle nh;
  image_transport::ImageTransport it(nh);
  image_transport::Subscriber sub_img = it.subscribe("/erlecopter/bottom/image_raw", 1, imageCallback);
  ros::Subscriber sub_h = nh.subscribe("/erlecopter/imu", 1, imuCallback);
  ros::Subscriber sub_h2 = nh.subscribe("/odometry/filtered", 1, ScaleCallback);
  Pub_estimation = nh.advertise<nav_msgs::Odometry>("Camera_estimation", 1);
  ros::spin();
  return 0;
}

I've checked the Camera_estimation topic and the messages are being published, but /odometry/filtered isn't publishing anything.
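One thing worth checking in the angular-velocity computation above: plain differencing of Euler angles breaks when an angle wraps around at ±π between frames (e.g. yaw jumping from +3.1 to −3.1 rad produces a huge spurious rate). A wrap-safe difference, as a small standalone sketch:

```cpp
#include <cmath>

// Shortest signed difference between two angles in radians,
// result normalized into (-pi, pi]. Using (curr - prev) directly
// is wrong whenever the angle crosses the +/-pi boundary.
double angle_diff(double curr, double prev) {
    double d = std::fmod(curr - prev, 2.0 * M_PI);
    if (d > M_PI)   d -= 2.0 * M_PI;
    if (d <= -M_PI) d += 2.0 * M_PI;
    return d;
}
```

The finite-difference rates in the callback could then use `angle_diff(angles.yaw, Prev_angles.yaw) / interval` and so on (names taken from the question's code).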

2021-02-02 08:05:03 -0500 received badge  Student (source)
2020-12-27 20:29:50 -0500 received badge  Famous Question (source)
2020-12-21 22:56:21 -0500 received badge  Notable Question (source)
2020-11-11 14:03:52 -0500 received badge  Famous Question (source)
2020-10-18 15:18:26 -0500 received badge  Famous Question (source)
2020-10-02 21:24:33 -0500 received badge  Famous Question (source)
2020-08-31 03:46:59 -0500 marked best answer Robot_localization - Could not transform measurement into base_link

Hello!!!

I'm using robot_localization for 3D positioning and I'm giving it a try with one source. In this case, it provides a TwistWithCovarianceStamped in the odom frame and I'm trying the EKF just to see how it works, but it returns a straight line. The debug file says

Could not transform measurement into erlecopter/imu_link. Ignoring...

  • I renamed base_link as erlecopter/imu_link

Does anyone have any idea how to solve this problem? The TwistWithCovarianceStamped message is being published properly.

I'll add the params file:

frequency: 5
sensor_timeout: 2

two_d_mode: false


transform_time_offset: 0.0
transform_timeout: 0.0

print_diagnostics: false
debug: true
debug_out_file: /home/javier/pruebas/MasPruebas/Data/file.txt
publish_tf: true
publish_acceleration: false

map_frame: map              # Defaults to "map" if unspecified
odom_frame: odom            # Defaults to "odom" if unspecified
base_link_frame: erlecopter/imu_link  # Defaults to "base_link" if unspecified
world_frame: odom           # Defaults to the value of odom_frame if unspecified


twist0: /Camera_estimation
twist0_config: [false, false, false,
                false, false, false,
                true,  true,  true,
                true,  true,  true,
                false, false, false]
twist0_queue_size: 3

process_noise_covariance: ...
initial_estimate_covariance: ...
2020-08-29 10:47:34 -0500 answered a question Robot_localization - Could not transform measurement into base_link

I finally opted for not simulating the camera alone and mixed it with the IMU, then it worked (better than IMU-only). Don

2020-08-29 10:42:57 -0500 received badge  Famous Question (source)
2020-08-25 11:55:34 -0500 received badge  Notable Question (source)
2020-08-06 11:28:31 -0500 received badge  Popular Question (source)
2020-07-01 09:15:40 -0500 received badge  Notable Question (source)
2020-06-27 12:07:18 -0500 commented question Robot_localization - Could not transform measurement into base_link

Try to change it, and see what happens. -> It returns a completely wrong output. I have not touched the urdf file nor

2020-06-27 02:09:40 -0500 commented question Robot_localization - Could not transform measurement into base_link

No, I used odom because the information given was in odom frame of reference.

2020-06-26 11:33:31 -0500 commented question Robot_localization - Could not transform measurement into base_link

It just says no tf data received. Shouldn't robot_localization provide this odom to base_link transform from the given d

2020-06-26 11:21:42 -0500 received badge  Popular Question (source)
2020-06-26 05:07:22 -0500 asked a question Robot_localization - Could not transform measurement into base_link

Robot_localization - Could not transform measurement into base_link Hello!!! I'm using robot_localization for 3D positi

2020-06-26 01:22:28 -0500 commented answer Robot localization fusion

I'll take a look at that link. The odometry data is nowhere near those extreme output values that I get through EKF. (I

2020-06-25 13:49:34 -0500 marked best answer Error when launching ekf_localization_node

I'm trying to use robot_localization package to fuse IMU and a visual estimation but, when I try to run the .launch file, it returns an error.

Terminal output:

SUMMARY
========

CLEAR PARAMETERS
 * /ekf_se/

PARAMETERS
 * /ekf_se/base_link_frame: erlecopter/base_link
 * /ekf_se/control_config: [False, False, Fa...
 * /ekf_se/control_timeout: 0.2
 * /ekf_se/debug: False
 * /ekf_se/debug_out_file: /home/javier/prue...
 * /ekf_se/frequency: 30
 * /ekf_se/imu0: /erlecopter/imu
 * /ekf_se/imu0_config: [False, False, Fa...
 * /ekf_se/imu0_differential: False
 * /ekf_se/imu0_nodelay: False
 * /ekf_se/imu0_queue_size: 5
 * /ekf_se/imu0_relative: False
 * /ekf_se/imu0_remove_gravitational_acceleration: True
 * /ekf_se/initial_estimate_covariance: ['1e-9', 0, 0, 0,...
 * /ekf_se/map_frame: map
 * /ekf_se/odom_frame: odom
 * /ekf_se/pose0: Camera_estimation
 * /ekf_se/pose0_config: [True, True, True...
 * /ekf_se/pose0_differential: True
 * /ekf_se/pose0_nodelay: False
 * /ekf_se/pose0_queue_size: 1
 * /ekf_se/pose0_relative: False
 * /ekf_se/print_diagnostics: True
 * /ekf_se/process_noise_covariance: [0.05, 0, 0, 0, 0...
 * /ekf_se/publish_acceleration: False
 * /ekf_se/publish_tf: True
 * /ekf_se/sensor_timeout: 0.6
 * /ekf_se/stamped_control: False
 * /ekf_se/transform_time_offset: 0.0
 * /ekf_se/transform_timeout: 0.0
 * /ekf_se/two_d_mode: False
 * /ekf_se/use_control: False
 * /ekf_se/world_frame: odom
 * /rosdistro: indigo
 * /rosversion: 1.11.21

NODES
  /
    ekf_se (robot_localization/ekf_localization_node)

ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
ERROR: cannot launch node of type [robot_localization/ekf_localization_node]: robot_localization
ROS path [0]=/opt/ros/indigo/share/ros
ROS path [1]=/home/javier/pruebas/MasPruebas/useful_info/catkin_builts/src
ROS path [2]=/home/javier/simulation/ros_catkin_WS/src/catkin_simple
// More paths
ROS path [40]=/opt/ros/indigo/share
ROS path [41]=/opt/ros/indigo/stacks
No processes to monitor
shutting down processing monitor...

This is the launch file, and it refers to a yaml (ekf_template.yaml) file with all the parameters:

<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_se" clear_params="true">
    <rosparam command="load" file="$(find useful_info)/ekf_template.yaml" />

   <!--  Placeholder for output topic remapping
    <remap from="odometry/filtered" to=""/>
    <remap from="accel/filtered" to=""/>
    -->

  </node>
</launch>

Yaml file:

frequency: 30

sensor_timeout: 0.6

two_d_mode: false

transform_time_offset: 0.0
transform_timeout: 0.0

print_diagnostics: true

debug: false
debug_out_file: /home/javier/pruebas/MasPruebas/Data/Fallos.txt

publish_tf: true
publish_acceleration: false


map_frame: map              # Defaults to "map" if unspecified
odom_frame: odom            # Defaults to "odom" if unspecified
base_link_frame: erlecopter/base_link  # Defaults to "base_link" if unspecified
world_frame: odom           # Defaults to the value of odom_frame if unspecified


pose0: Camera_estimation
pose0_config: [true,  true,  true,
               true,  true,  true,
               false, false, false,
               false, false, false,
               false, false, false]
pose0_differential: true
pose0_relative: false
pose0_queue_size: 1
pose0_nodelay: false

imu0: /erlecopter/imu
imu0_config: [false, false, false,
               false, false, false,
               false, false, false,
               true,  true,  true,
               true,  true,  true]
imu0_differential: false
imu0_relative: false
imu0_queue_size: 5
imu0_nodelay: false
imu0_remove_gravitational_acceleration: true

use_control: false
stamped_control: false

process_noise_covariance: [numbers_matrix]

initial_estimate_covariance: [numbers_matrix]

Ubuntu 14.04, ROS Indigo

2020-06-25 13:47:59 -0500 received badge  Notable Question (source)
2020-06-24 02:12:25 -0500 received badge  Popular Question (source)
2020-06-22 09:24:53 -0500 commented answer Robot localization fusion

If I use yours (taking into account that mine is not a 2D movement) it does work, but it returns an output whose displac

2020-06-22 09:24:53 -0500 received badge  Commentator
2020-06-22 02:37:33 -0500 received badge  Popular Question (source)
2020-06-22 02:18:45 -0500 commented answer Robot localization fusion

I've added the relevant parts of the code to the main question

2020-06-22 02:17:56 -0500 edited question Robot localization fusion

Robot localization fusion Hello!! I have to fuse two sources: IMU data (accelerometer and gyroscope) and another source

2020-06-21 09:36:00 -0500 commented answer Robot localization fusion

I have tried it and looking at the debugging file it says that it cannot transform into base_link frame. The velocities

2020-06-20 04:06:13 -0500 commented answer Robot localization fusion

Then, I should divide the translations/rotations by the time between the measurements and save those values in the TwistW

2020-06-19 12:33:27 -0500 asked a question Robot localization fusion

Robot localization fusion Hello!! I have to fuse two sources: IMU data (accelerometer and gyroscope) and another source

2020-06-19 09:29:59 -0500 received badge  Famous Question (source)
2020-05-17 12:58:20 -0500 asked a question How to write my covariance matrix to PoseWithCovarianceStamped

How to write my covariance matrix to PoseWithCovarianceStamped Hello! I want to write the covariance I have calculated i

2020-05-17 12:44:11 -0500 marked best answer No odometry/filtered output from robot_localization

Hello!

I'm trying to fuse IMU and a camera estimation but I receive no output through /odometry/filtered. If I try with the IMU alone, it does send information through that topic, but when I try with the camera estimation, I get no response from the EKF node.

The camera sends a PoseWithCovarianceStamped message that tells the displacement between the previous frame and the current one in the UAV coordinate frame (called erlecopter/imu_link):

  geometry_msgs::PoseWithCovarianceStamped pub_msg;

  pub_msg.header.seq=seq;
  pub_msg.header.stamp=msg->header.stamp;
  pub_msg.header.frame_id="erlecopter/imu_link";

  pub_msg.pose.pose.position.x=tras.at<double>(0);
  pub_msg.pose.pose.position.y=tras.at<double>(1);
  pub_msg.pose.pose.position.z=tras.at<double>(2);

  pub_msg.pose.pose.orientation=Euler2Quat(angles);

  pub_msg.pose.covariance=Covariance_default;

  Pub_estimation.publish(pub_msg);

where the publisher has been defined like this:

ros::Publisher Pub_estimation;
Pub_estimation=nh.advertise<geometry_msgs::PoseWithCovarianceStamped>("Camera_estimation",1);

This message is being published properly, as I have seen thanks to rostopic echo.
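Euler2Quat above is the question author's own helper; its implementation is not shown. A plausible ZYX (roll-pitch-yaw) version, as a hedged sketch in plain C++ with the geometry_msgs type replaced by a minimal struct:

```cpp
#include <cmath>

// Stand-in for geometry_msgs::Quaternion (w, x, y, z).
struct Quaternion { double w, x, y, z; };

// Roll-pitch-yaw (intrinsic ZYX convention, radians) to quaternion.
Quaternion Euler2Quat(double roll, double pitch, double yaw) {
    double cr = std::cos(roll * 0.5),  sr = std::sin(roll * 0.5);
    double cp = std::cos(pitch * 0.5), sp = std::sin(pitch * 0.5);
    double cy = std::cos(yaw * 0.5),   sy = std::sin(yaw * 0.5);
    return { cr*cp*cy + sr*sp*sy,    // w
             sr*cp*cy - cr*sp*sy,    // x
             cr*sp*cy + sr*cp*sy,    // y
             cr*cp*sy - sr*sp*cy };  // z
}
```

In a real ROS node one would more likely use `tf2::Quaternion::setRPY` and `tf2::toMsg` rather than hand-rolling this, and the angle convention must match whatever the image pipeline produces.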

My parameters file for the EKF node is like this (for the camera only fusion):

frequency: 1
sensor_timeout: 2
two_d_mode: false
transform_time_offset: 0.0
transform_timeout: 0.0
print_diagnostics: true
debug: false
publish_tf: true
publish_acceleration: false

map_frame: map             
odom_frame: odom           
base_link_frame: erlecopter/imu_link  
world_frame: odom 


pose0: Camera_estimation
pose0_config:  [true, true, true,
               true, true, true,
               false, false, false,
               false,  false,  false,
               false,  false,  false]

process_noise_covariance: [0.05, 0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0.05, 0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0.06, 0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0.03, 0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0.03, 0,    0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0.06, 0,     0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0.025, 0,     0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0.025, 0,    0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0.04, 0,    0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0.01, 0,    0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0.01, 0,    0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0.02, 0,    0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0.01, 0,    0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0.01, 0,
                           0,    0,    0,    0,    0,    0,    0,     0,     0,    0,    0,    0,    0,    0,    0.015]
initial_estimate_covariance: [1e-9, 0,    0,    0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                              0,    1e-9, 0,    0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                              0,    0,    1e-9, 0,    0,    0,    0,    0 ...
(more)
2020-05-13 08:32:46 -0500 commented answer No odometry/filtered output from robot_localization

I'll have to look further into how to create that TF tree, because I have no idea

2020-05-11 17:24:58 -0500 received badge  Notable Question (source)
2020-05-11 06:23:13 -0500 commented answer No odometry/filtered output from robot_localization

Failure at 1589196125.549205710 Exception thrown:"odom" passed to lookupTransform argument target_frame does not exist.

2020-05-11 05:04:45 -0500 commented question No odometry/filtered output from robot_localization

rqt_tf_tree says no data received. I have tried with rqt_graph and this is what I get: https://drive.google.com/file/d/1

2020-05-10 12:25:39 -0500 received badge  Popular Question (source)
2020-05-10 05:39:55 -0500 edited question No odometry/filtered output from robot_localization

No odometry/filtered output from robot_localization Hello! I'm trying to fuse IMU and a camera estimation but I receive

2020-05-09 11:36:36 -0500 commented question No odometry/filtered output from robot_localization

It is not the pose of the camera but the displacement of the UAV between the previous measure and the current one (it is

2020-05-08 08:33:16 -0500 commented question No odometry/filtered output from robot_localization

Yes, Camera_estimation is the topic name, and I have modified its data so that instead of being with respect to the came

2020-05-08 05:26:17 -0500 received badge  Notable Question (source)
2020-05-08 04:51:18 -0500 asked a question No odometry/filtered output from robot_localization

No odometry/filtered output from robot_localization Hello! I'm trying to fuse IMU and a camera estimation but I receive

2020-04-29 05:37:25 -0500 edited question Error when launching ekf_localization_node

Error when launching ekf_localization_node I'm trying to use robot_localization package to fuse IMU and a visual estimat

2020-04-29 05:37:16 -0500 edited question Error when launching ekf_localization_node

Error when launching ekf_localization_node I'm trying to use robot_localization package to fuse IMU and a visual estimat

2020-04-29 05:37:16 -0500 received badge  Editor (source)
2020-04-29 05:31:44 -0500 asked a question Error when launching ekf_localization_node

Error when launching ekf_localization_node I'm trying to use robot_localization package to fuse IMU and a visual estimat

2020-04-28 09:43:48 -0500 received badge  Enthusiast
2020-04-15 14:21:19 -0500 received badge  Popular Question (source)
2020-04-14 08:26:09 -0500 marked best answer Reference frames in robot_localization

Hello! I've got a couple of questions:

  1. I've been reading robot_localization documentation (http://docs.ros.org/melodic/api/robot...) and I've seen that they only use North-East-Up references:

As of this writing, robot_localization assumes an ENU frame for all IMU data, and does not work with NED frame data. This may change in the future, but for now, users should ensure that data is transformed to the ENU frame before using it with any node in robot_localization.

Nonetheless, on that same page, if you look at the IMU information, they say:

The state estimation nodes in robot_localization assume that an IMU that is placed in its neutral right-side-up position on a flat surface will: Measure +9.81 meters per second squared for the Z axis.

In that case, if gravity is positive, the Z axis is pointing down. Isn't this the opposite of what was stated above? Should my IMU gravity component be positive (NED) or negative (ENU)?
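For context on the frame names in this question: for world-referenced vectors (e.g. velocities or positions expressed in the world frame), NED and ENU are related by swapping the x/y axes and negating z. A minimal sketch of that remap (body-frame IMU readings and full orientation quaternions need more care than this):

```cpp
// Remap a world-frame NED vector into ENU: swap x/y, negate z.
void ned_to_enu(const double ned[3], double enu[3]) {
    enu[0] = ned[1];   // East  = NED y (North->East swap)
    enu[1] = ned[0];   // North = NED x
    enu[2] = -ned[2];  // Up    = -Down
}
```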

  2. I am measuring the variation of the angles and the movement from images taken by the bottom camera of a UAV and fusing them with the IMU. The IMU updates its values every 2.5 ms and the information from the image is extracted every ~100 ms, so the pose information from the camera is given with respect to the previous frame (~100 ms ago). Does robot_localization take into account that the pose values are relative to the previous measurement from that sensor, or should I change something?

Thank you in advance.