el_lobo's profile - activity

2014-05-29 02:40:55 -0500 received badge  Student (source)
2013-09-19 23:16:56 -0500 received badge  Famous Question (source)
2013-09-02 11:13:30 -0500 received badge  Notable Question (source)
2013-08-15 22:59:37 -0500 received badge  Famous Question (source)
2013-08-08 15:15:12 -0500 received badge  Favorite Question (source)
2013-08-06 19:23:24 -0500 received badge  Notable Question (source)
2013-08-06 09:27:27 -0500 received badge  Popular Question (source)
2013-07-15 23:27:47 -0500 asked a question robot_pose_ekf problems, timestamp, frames transformation

Hello,

I am using robot_pose_ekf to fuse data from an IMU, wheel encoders, and GPS. The filter is performing really badly; the results are far from precise.

While the node is running, I get several errors:

[ERROR] [1373966320.259468440]: Timestamps of odometry and imu are 1.463487 seconds apart.
[ERROR] [1373966703.475481395]: Could not transform imu message from base_link to base_footprint
[ERROR] [1373966703.750433325]: Timestamps of odometry and imu are 4.054829 seconds apart.
[ INFO] [1373966703.750547445]: Odom sensor not active any more
[ INFO] [1373966703.750629928]: GPS sensor not active any more
[ERROR] [1373966703.750895464]: TF_NAN_INPUT: Ignoring transform for child_frame_id "/base_footprint" from authority "default_authority" because of a nan value in the transform (409490.661049 5476131.598444 nan) (0.024942 -0.090772 0.019537 0.995368)
[ERROR] [1373966703.751012029]: Cannot get transform at time 0.000000
[ERROR] [1373966703.751104568]: Cannot get transform at time 0.000000
[ WARN] [1373966703.751197178]: TF to MSG: Quaternion Not Properly Normalized
[ WARN] [1373966703.751528436]: TF_OLD_DATA ignoring data from the past for frame /base_footprint at time 0 according to authority /robot_pose_ekf_cp
Possible reasons are listed at

Also, when I am using the GPS sensor, the filter performs really badly, with large delays.

If anyone has an idea how to fix the timestamps, for a start, or has got robot_pose_ekf running properly (I see there are a lot of issues with this package), please post here.
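
One workaround I am considering, purely as a sketch and not something I have verified yet, is a small relay node that overwrites the IMU timestamp with the local receive time before robot_pose_ekf subscribes to it (the topic names imu_data and imu_data_restamped are just placeholders for my setup):

    #include <ros/ros.h>
    #include <sensor_msgs/Imu.h>

    // Restamp incoming IMU messages with the receive time so that the EKF
    // compares the IMU and the odometry against the same clock.
    ros::Publisher pub;

    void imuCallback(const sensor_msgs::Imu::ConstPtr& msg)
    {
      sensor_msgs::Imu restamped = *msg;
      restamped.header.stamp = ros::Time::now();  // replace the stale stamp
      pub.publish(restamped);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "imu_restamp_relay");
      ros::NodeHandle nh;
      pub = nh.advertise<sensor_msgs::Imu>("imu_data_restamped", 10);
      ros::Subscriber sub = nh.subscribe("imu_data", 10, imuCallback);
      ros::spin();
      return 0;
    }

I realize restamping only hides the real latency rather than fixing it, so I would prefer to synchronize the clocks properly if someone can point me in the right direction.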

2013-07-10 04:52:11 -0500 received badge  Popular Question (source)
2013-06-19 20:11:20 -0500 answered a question Husky A200, robot_pose_ekf

I've tested it on different surfaces and the output of robot_pose_ekf is not precise enough. With just the encoder information there is a big slip when turning. With the combination of IMU and encoders, better precision can be achieved (by changing the covariance matrices), but only up to a certain level. It confuses me that the covariance matrix of the IMU is constant and that element (3,3) (counting from 1) has the value 10^-6 while the rest are 10^6. If element (2,2) of the covariance matrix is set to 10^-6 the results are good, but there is still an error.

Also, it might be that there is only a certain level of precision achievable with these sensors alone, so I may simply have reached that maximum, but I'm unsure. Any other experiences?
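
For completeness, this is roughly how I did that experiment: a small republisher that overrides the orientation covariance of the IMU messages before they reach robot_pose_ekf. The topic names and the 10^-6 value are simply what I happened to try, and whether the filter actually picks this up depends on where the constant covariance I mentioned comes from.

    #include <ros/ros.h>
    #include <sensor_msgs/Imu.h>

    // orientation_covariance is a row-major 3x3 matrix over (roll, pitch, yaw),
    // so the (2,2) entry (counting from 1) is index 4 and (3,3) is index 8.
    ros::Publisher pub;

    void imuCallback(const sensor_msgs::Imu::ConstPtr& msg)
    {
      sensor_msgs::Imu adjusted = *msg;
      adjusted.orientation_covariance[4] = 1e-6;  // the (2,2) entry mentioned above
      pub.publish(adjusted);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "imu_covariance_override");
      ros::NodeHandle nh;
      pub = nh.advertise<sensor_msgs::Imu>("imu_data_adjusted", 10);
      ros::Subscriber sub = nh.subscribe("imu_data", 10, imuCallback);
      ros::spin();
      return 0;
    }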

2013-06-18 04:23:00 -0500 received badge  Editor (source)
2013-06-18 04:21:57 -0500 asked a question Husky A200, robot_pose_ekf

Hello,

I am working with a Clearpath Husky. I would like to fuse all the sensors (encoders and IMU) with robot_pose_ekf. However, when I drive the robot, the output from the filter is not correct. It is mostly okay for the x axis, but the estimate along the y axis is not precise at all (turning the robot creates a mess).

Does anyone here have experience with the Husky and, if so, could you please recommend how to configure the EKF? I have tried different covariances, but the results are confusing.
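
One of the covariance experiments I tried, with the topic names and the scale factor chosen arbitrarily, was to inflate the yaw variance of the wheel odometry before robot_pose_ekf sees it, hoping the filter would lean on the IMU heading during turns:

    #include <ros/ros.h>
    #include <nav_msgs/Odometry.h>

    // pose.covariance is a row-major 6x6 matrix over (x, y, z, roll, pitch, yaw),
    // so the yaw variance sits at index 6*5 + 5 = 35.
    ros::Publisher pub;

    void odomCallback(const nav_msgs::Odometry::ConstPtr& msg)
    {
      nav_msgs::Odometry adjusted = *msg;
      adjusted.pose.covariance[35] *= 100.0;  // trust the encoder yaw much less
      pub.publish(adjusted);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "odom_covariance_adjust");
      ros::NodeHandle nh;
      pub = nh.advertise<nav_msgs::Odometry>("odom_adjusted", 10);
      ros::Subscriber sub = nh.subscribe("odom", 10, odomCallback);
      ros::spin();
      return 0;
    }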

2013-05-29 09:46:20 -0500 received badge  Famous Question (source)
2013-05-23 21:08:48 -0500 commented answer robot_pose_ekf, sensor priority

P.S. Also, do you perhaps know whether multiplying the odom and IMU covariances by the same number is equivalent to not multiplying them at all? This is basically a question about the logic behind the EKF: are the covariances from the different sources used relative to each other, or not?
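
To make the question concrete, here is a tiny one-dimensional toy model of the fusion step as I understand it (my own sketch in inverse-variance form, not code from robot_pose_ekf). Scaling both measurement covariances by the same number keeps their weight relative to each other, but changes their weight relative to the prediction, so the output still changes:

    #include <cstdio>

    // Combine a prediction and two measurements of the same scalar state in
    // inverse-variance (information) form, as a Kalman-style update would.
    double fuse(double x_pred, double p_pred,
                double z_odom, double r_odom,
                double z_imu,  double r_imu)
    {
      double info = 1.0 / p_pred + 1.0 / r_odom + 1.0 / r_imu;
      return (x_pred / p_pred + z_odom / r_odom + z_imu / r_imu) / info;
    }

    int main()
    {
      // Prediction 0.0, odometry 0.2, IMU 0.1 (all in the same units).
      double a = fuse(0.0, 0.01, 0.2, 0.05, 0.1, 0.005);
      // Same inputs, but both measurement covariances multiplied by 10:
      // the estimate is pulled back towards the prediction.
      double b = fuse(0.0, 0.01, 0.2, 0.50, 0.1, 0.050);
      std::printf("original: %.4f  scaled: %.4f\n", a, b);
      return 0;
    }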

2013-05-23 21:06:36 -0500 commented answer robot_pose_ekf, sensor priority

Thank you a lot. This really made my start with this node much easier. I would just like to clarify: is this the way to adjust the covariances: imu_covariance_(i+1, j+1) = weight * imu->orientation_covariance[3*i+j]; ?

2013-05-22 05:12:34 -0500 received badge  Notable Question (source)
2013-05-22 04:35:52 -0500 commented answer robot_pose_ekf, sensor priority

Hmm, it seems to me that imu_covariance as well as orientation_covariance are 3x3 matrices. Can you please explain the (4,4), (5,5), and (6,6) entries?

2013-05-22 01:13:56 -0500 received badge  Popular Question (source)
2013-05-22 00:04:38 -0500 commented answer robot_pose_ekf, sensor priority

Thank you for the answer. However, I do not understand how to do it in practice. Here is my assumption, if for example I want to weight the IMU: float weight = 1.2; imu_covariance_(i+1, j+1) = weight * imu->orientation_covariance[3*i+j]; Would this be the best way to do it?
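
Spelled out, the loop I have in mind looks roughly like this (a standalone sketch with a plain array standing in for the node's imu_covariance_, which I believe uses 1-based indexing, hence the i+1, j+1 above). A weight above 1 inflates the IMU covariance, so the filter trusts it less; a weight below 1 does the opposite:

    #include <sensor_msgs/Imu.h>

    // Scale the 3x3 orientation covariance from the IMU message by a constant
    // factor. orientation_covariance is row-major, so element (i, j) is [3*i + j].
    void weightedImuCovariance(const sensor_msgs::Imu& imu, double weight,
                               double cov[3][3])
    {
      for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
          cov[i][j] = weight * imu.orientation_covariance[3 * i + j];
    }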

2013-05-21 06:09:40 -0500 asked a question robot_pose_ekf, sensor priority

Hello,

I am using robot_pose_ekf to fuse data from the encoders and the IMU. I was wondering where in the code I can find the section where the data is actually fused and, if someone is familiar with the process, could you please describe here how it happens?

Also, is there a way to specify the priority of a data source in the fusion process? For example, what if I want to trust the IMU data more than the encoders?

Best regards.