2021-05-26 15:36:55 -0500 | marked best answer | Robot localization fusion Hello!! I have to fuse two sources: IMU data (accelerometer and gyroscope) and another source that provides the difference between the previous position and the current one by giving a translation vector and a quaternion. The IMU data is in the base_link frame and the other source is in the reference frame (I selected odom). I thought of using a PoseWithCovarianceStamped for the second source, but I'm not sure if robot_localization works with variations in pose, so the questions are: Is there any way I can fuse them with robot_localization? As the second source does not give a complete pose but a change in pose, do I have to modify this data? Should I add all the previous estimations plus the current one to get the "global" pose (instead of the variation) and send that one to the EKF along with the IMU? Thank you. EDIT: I was trying to fuse just my estimation to see how it performed; this is the yaml code: And the main code is something like this: I've checked the Camera_estimation topic and the messages are being published, but odometry/filtered isn't publishing anything. |
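The approach discussed in the comments below (dividing each per-measurement translation and rotation by the elapsed time and feeding the result to the filter as a body-frame twist) can be sketched roughly as follows. This is only an illustration under assumptions: the topic names, the fixed frame_id, and the reuse of the pose covariance are placeholders, and it assumes the delta is already expressed in the robot's body frame.

```python
#!/usr/bin/env python
# Hypothetical sketch: convert per-measurement pose deltas into body-frame
# twists so robot_localization can fuse them as velocities. Topic names, the
# frame_id and the covariance handling are assumptions, not the original code.
import rospy
from geometry_msgs.msg import PoseWithCovarianceStamped, TwistWithCovarianceStamped
from tf.transformations import euler_from_quaternion

last_stamp = None

def delta_cb(msg):
    global last_stamp
    stamp = msg.header.stamp
    if last_stamp is None:
        last_stamp = stamp
        return
    dt = (stamp - last_stamp).to_sec()
    last_stamp = stamp
    if dt <= 0.0:
        return

    # Express the rotation delta as roll/pitch/yaw so it can become angular rates.
    q = msg.pose.pose.orientation
    roll, pitch, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])

    out = TwistWithCovarianceStamped()
    out.header.stamp = stamp
    out.header.frame_id = 'base_link'            # body-frame velocities (assumed)
    out.twist.twist.linear.x = msg.pose.pose.position.x / dt
    out.twist.twist.linear.y = msg.pose.pose.position.y / dt
    out.twist.twist.linear.z = msg.pose.pose.position.z / dt
    out.twist.twist.angular.x = roll / dt
    out.twist.twist.angular.y = pitch / dt
    out.twist.twist.angular.z = yaw / dt
    out.twist.covariance = msg.pose.covariance   # placeholder: reuse the pose covariance
    pub.publish(out)

rospy.init_node('delta_pose_to_twist')
pub = rospy.Publisher('camera_twist', TwistWithCovarianceStamped, queue_size=10)
rospy.Subscriber('Camera_estimation', PoseWithCovarianceStamped, delta_cb)
rospy.spin()
```

The alternative raised in the question (accumulating the deltas into a global pose) can also be fed to the EKF, but one common argument against it is that the accumulated pose carries unbounded drift while being reported as an absolute measurement, which is why relative sources are usually fused as velocities instead.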
2021-02-02 08:05:03 -0500 | received badge | ● Student (source) |
2020-12-27 20:29:50 -0500 | received badge | ● Famous Question (source) |
2020-12-21 22:56:21 -0500 | received badge | ● Notable Question (source) |
2020-11-11 14:03:52 -0500 | received badge | ● Famous Question (source) |
2020-10-18 15:18:26 -0500 | received badge | ● Famous Question (source) |
2020-10-02 21:24:33 -0500 | received badge | ● Famous Question (source) |
2020-08-31 03:46:59 -0500 | marked best answer | Robot_localization - Could not transform measurement into base_link Hello!!! I'm using robot_localization for 3D positioning and I'm giving it a try with one source. In this case, it is providing a TwistWithCovarianceStamped in the odom frame and I'm trying to use the EKF just to see how it works, but it returns a straight line. In the debug file it says
I'll add the params file: |
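For reference, a minimal robot_localization params file for a single TwistWithCovarianceStamped input might look like the sketch below; the topic name, frequency, and the exact entries in the config vector are assumptions, not the actual file from this question.

```yaml
# Hypothetical ekf params sketch for one TwistWithCovarianceStamped input.
# Topic name and frequency are placeholders.
frequency: 30
two_d_mode: false

map_frame: map
odom_frame: odom
base_link_frame: base_link
world_frame: odom            # only continuous (non-global) data is being fused

twist0: camera_twist
twist0_config: [false, false, false,    # x, y, z
                false, false, false,    # roll, pitch, yaw
                true,  true,  true,     # vx, vy, vz
                true,  true,  true,     # vroll, vpitch, vyaw
                false, false, false]    # ax, ay, az
twist0_differential: false
twist0_queue_size: 5
```

Note that the filter tries to transform each twist from its header.frame_id into base_link_frame, so that transform has to be available (or the message can simply be stamped in the base_link frame), which is what the "Could not transform measurement into base_link" warning refers to.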
2020-08-29 10:47:34 -0500 | answered a question | Robot_localization - Could not transform measurement into base_link I finally opted not to simulate the camera alone and mixed it with the IMU; then it worked (better than IMU-only). Don
2020-08-29 10:42:57 -0500 | received badge | ● Famous Question (source) |
2020-08-25 11:55:34 -0500 | received badge | ● Notable Question (source) |
2020-08-06 11:28:31 -0500 | received badge | ● Popular Question (source) |
2020-07-01 09:15:40 -0500 | received badge | ● Notable Question (source) |
2020-06-27 12:07:18 -0500 | commented question | Robot_localization - Could not transform measurement into base_link Try to change it, and see what happens. -> It returns a completely wrong output. I have not touched the urdf file nor |
2020-06-27 02:09:40 -0500 | commented question | Robot_localization - Could not transform measurement into base_link No, I used odom because the information given was in odom frame of reference. |
2020-06-26 11:33:31 -0500 | commented question | Robot_localization - Could not transform measurement into base_link It just says no tf data received. Shouldn't robot_localization provide this odom to base_link transform from the given d |
2020-06-26 11:21:42 -0500 | received badge | ● Popular Question (source) |
2020-06-26 05:07:22 -0500 | asked a question | Robot_localization - Could not transform measurement into base_link Robot_localization - Could not transform measurement into base_link Hello!!! I'm using robot_localization for 3D positi |
2020-06-26 01:22:28 -0500 | commented answer | Robot localization fusion I'll take a look at that link. The odometry data is nowhere near those extreme output values that I get through the EKF. (I
2020-06-25 13:49:34 -0500 | marked best answer | Error when launching ekf_localization_node I'm trying to use the robot_localization package to fuse IMU and a visual estimation but, when I try to run the .launch file, it returns an error. Terminal output: This is the launch file, and it refers to a yaml (ekf_template.yaml) file with all the parameters: Yaml file: Ubuntu 14.04, ROS Indigo
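A typical launch file for this kind of setup looks roughly like the sketch below; the node name and the path to the yaml are placeholders, not the actual file from this question. A frequent source of launch-time errors is the rosparam load line pointing at a yaml path that does not resolve, or the yaml itself failing to parse.

```xml
<!-- Hypothetical sketch of an ekf_localization_node launch file.
     The package-relative path to the yaml is a placeholder. -->
<launch>
  <node pkg="robot_localization" type="ekf_localization_node"
        name="ekf_se" clear_params="true">
    <rosparam command="load"
              file="$(find robot_localization)/params/ekf_template.yaml" />
  </node>
</launch>
```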
2020-06-25 13:47:59 -0500 | received badge | ● Notable Question (source) |
2020-06-24 02:12:25 -0500 | received badge | ● Popular Question (source) |
2020-06-22 09:24:53 -0500 | commented answer | Robot localization fusion If I use yours (taking into account that mine is not a 2D movement) it does work, but it returns an output whose displac |
2020-06-22 09:24:53 -0500 | received badge | ● Commentator |
2020-06-22 02:37:33 -0500 | received badge | ● Popular Question (source) |
2020-06-22 02:18:45 -0500 | commented answer | Robot localization fusion I've added the relevant parts of the code to the main question |
2020-06-22 02:17:56 -0500 | edited question | Robot localization fusion Robot localization fusion Hello!! I have to fuse two sources: IMU data (accelerometer and gyroscope) and another source |
2020-06-21 09:36:00 -0500 | commented answer | Robot localization fusion I have tried it, and looking at the debugging file it says that it cannot transform into the base_link frame. The velocities
2020-06-20 04:06:13 -0500 | commented answer | Robot localization fusion Then, I should divide the translations/rotations by the time between the measurements and save those values in the TwistW
2020-06-19 12:33:27 -0500 | asked a question | Robot localization fusion Robot localization fusion Hello!! I have to fuse two sources: IMU data (accelerometer and gyroscope) and another source |
2020-06-19 09:29:59 -0500 | received badge | ● Famous Question (source) |
2020-05-17 12:58:20 -0500 | asked a question | How to write my covariance matrix to PoseWithCovarianceStamped How to write my covariance matrix to PoseWithCovarianceStamped Hello! I want to write the covariance I have calculated i |
2020-05-17 12:44:11 -0500 | marked best answer | No odometry/filtered output from robot_localization Hello! I'm trying to fuse IMU and a camera estimation but I receive no output through /odometry/filtered. If I try with the IMU alone, it does send information through that topic, but when I try with the camera estimation, I have no response from the EKF node. The camera sends a PoseWithCovarianceStamped message that tells the displacement between the previous frame and the current one in the UAV coordinate frame (called erlecopter/imu_link): where the publisher node has been defined like this: This message is being published properly, as I have seen thanks to rostopic echo. My parameters file for the EKF node is like this (for the camera-only fusion):
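Since the camera measurement is stamped in erlecopter/imu_link, robot_localization needs a transform between that frame and base_link before it can use the data (the odom->base_link transform is then published by the filter itself once it produces output). A minimal sketch of publishing such a static transform is below; the identity offset is an assumption, and the same thing can be done from a launch file with the tf2_ros static_transform_publisher tool.

```python
#!/usr/bin/env python
# Hypothetical sketch: publish a static base_link -> erlecopter/imu_link transform
# so measurements stamped in the camera/IMU frame can be rotated into base_link.
# The identity rotation and zero translation are assumptions.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node('imu_link_static_tf')

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = 'base_link'
t.child_frame_id = 'erlecopter/imu_link'
t.transform.rotation.w = 1.0   # identity rotation, zero translation (assumed)

broadcaster = tf2_ros.StaticTransformBroadcaster()
broadcaster.sendTransform(t)
rospy.spin()
```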
2020-05-13 08:32:46 -0500 | commented answer | No odometry/filtered output from robot_localization I'll have to look further into how to create that TF tree, because I have no idea |
2020-05-11 17:24:58 -0500 | received badge | ● Notable Question (source) |
2020-05-11 06:23:13 -0500 | commented answer | No odometry/filtered output from robot_localization Failure at 1589196125.549205710 Exception thrown:"odom" passed to lookupTransform argument target_frame does not exist. |
2020-05-11 05:04:45 -0500 | commented question | No odometry/filtered output from robot_localization rqt_tf_tree says no data received. I have tried with rqt_graph and this is what I get: https://drive.google.com/file/d/1 |
2020-05-10 12:25:39 -0500 | received badge | ● Popular Question (source) |
2020-05-10 05:39:55 -0500 | edited question | No odometry/filtered output from robot_localization No odometry/filtered output from robot_localization Hello! I'm trying to fuse IMU and a camera estimation but I receive |
2020-05-09 11:36:36 -0500 | commented question | No odometry/filtered output from robot_localization It is not the pose of the camera but the displacement of the UAV between the previous measurement and the current one (it is
2020-05-08 08:33:16 -0500 | commented question | No odometry/filtered output from robot_localization Yes, Camera_estimation is the topic name, and I have modified its data so that instead of being with respect to the came |
2020-05-08 05:26:17 -0500 | received badge | ● Notable Question (source) |
2020-05-08 04:51:18 -0500 | asked a question | No odometry/filtered output from robot_localization No odometry/filtered output from robot_localization Hello! I'm trying to fuse IMU and a camera estimation but I receive |
2020-04-29 05:37:25 -0500 | edited question | Error when launching ekf_localization_node Error when launching ekf_localization_node I'm trying to use robot_localization package to fuse IMU and a visual estimat |
2020-04-29 05:37:16 -0500 | edited question | Error when launching ekf_localization_node Error when launching ekf_localization_node I'm trying to use robot_localization package to fuse IMU and a visual estimat |
2020-04-29 05:37:16 -0500 | received badge | ● Editor (source) |
2020-04-29 05:31:44 -0500 | asked a question | Error when launching ekf_localization_node Error when launching ekf_localization_node I'm trying to use robot_localization package to fuse IMU and a visual estimat |
2020-04-28 09:43:48 -0500 | received badge | ● Enthusiast |
2020-04-15 14:21:19 -0500 | received badge | ● Popular Question (source) |
2020-04-14 08:26:09 -0500 | marked best answer | Reference frames in robot_localization Hello! I've got a couple of questions:
Nonetheless, on that same page, if you look at the IMU information, they say:
In that case, if gravity is positive, the Z vector is pointing down. Isn't this the opposite of what was stated above? Should my IMU gravity component be positive (NED) or negative (ENU)? 2. I am measuring the variation of the angles and the movement from images taken from the bottom camera of a UAV and fusing them with the IMU. The IMU updates values every 2.5 ms and the information from the image is extracted every ~100 ms, so the pose information from the camera is given with respect to the previous frame (~100 ms ago). Does robot_localization take into account that pose values are given with respect to the previous measurement from that sensor, or should I change something? Thank you in advance. |
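For the first question, one empirical check (a hypothetical sketch; the topic name is an assumption) is to look at the accelerometer reading while the vehicle sits level and at rest: an IMU reporting in the ROS-conventional x-forward/y-left/z-up body frame should show roughly +9.8 m/s^2 on the z axis, because the accelerometer measures the reaction to gravity, and that is the sign robot_localization expects when removing gravitational acceleration.

```python
#!/usr/bin/env python
# Hypothetical sanity check: with the vehicle level and at rest, an IMU that
# follows the ROS body-frame convention (x forward, y left, z up) should report
# roughly +9.8 m/s^2 on linear_acceleration.z, since the accelerometer measures
# the reaction to gravity. The topic name is an assumption.
import rospy
from sensor_msgs.msg import Imu

def imu_cb(msg):
    az = msg.linear_acceleration.z
    if az > 8.0:
        rospy.loginfo('z accel = %+.2f m/s^2 -> looks like a z-up (ENU-style) body frame', az)
    elif az < -8.0:
        rospy.loginfo('z accel = %+.2f m/s^2 -> looks like a z-down (NED-style) body frame', az)
    else:
        rospy.loginfo('z accel = %+.2f m/s^2 -> vehicle moving or axes not aligned with gravity', az)

rospy.init_node('imu_gravity_check')
rospy.Subscriber('imu/data', Imu, imu_cb)
rospy.spin()
```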