
manuzagra's profile - activity

2015-10-06 08:37:50 -0500 received badge  Famous Question (source)
2015-10-02 04:55:09 -0500 commented answer GPS robot_localization point-to-point navigation

Thank you very much, I did exactly that and it is very unstable (in altitude; I don't know why, but the utm frame is going some kilometres down into the earth) and not very precise (map drifts far from odom even when the robot is static). I think the problem is my GPS receivers: I am using the GPS of a mobile phone and a tablet.

2015-10-01 14:45:37 -0500 received badge  Notable Question (source)
2015-09-29 16:52:59 -0500 received badge  Popular Question (source)
2015-09-28 09:50:20 -0500 asked a question GPS robot_localization point-to-point navigation

I started working with the robot_localization package for the odometry of my robot, and currently the local odometry is working well (the transform "odom -> base_link" has low error). I need to do point-to-point navigation, and I would like to use the same package to get the global position using a GPS.

I am uploading the launch file that I use and a bag file. The route recorded in the bag file is a straight line facing roughly south; the robot did a round trip and finished in the same place it started, and the return leg was driven in reverse (I didn't want to turn because my odometry model of the robot is not appropriate for this terrain). I would be very thankful if somebody could take a look at it and tell me whether this configuration and its output are correct. (launch and bag )

My problem is that I do not quite understand the meaning of each frame. For example, if I want to move the robot to a desired position (latitude and longitude), I do not know the position of that goal relative to my robot. I think I could transform the goal to UTM coordinates, but even then I cannot work out from which frame to which frame I should make the transform. My main idea is to transform the goal position to UTM, create a frame whose parent is the UTM frame published by the robot_localization package, and work with that, but this frame is so far away and (I think) so imprecise that I do not know if this is the correct way to use the package; a sketch of the idea is below.
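Something like the following Python sketch is what I have in mind (not tested; it assumes the utm -> map transform is actually being broadcast, e.g. by navsat_transform_node with broadcast_utm_transform enabled, and it uses the third-party utm Python package for the lat/lon conversion; the goal coordinates are placeholders):

#!/usr/bin/env python
# Sketch: express a lat/lon goal in the map frame by going through the utm frame.
import rospy
import tf
import utm  # third-party lat/lon <-> UTM conversion package (assumption)
from geometry_msgs.msg import PointStamped

def goal_in_map(lat, lon, listener):
    # Convert the geographic goal to UTM easting/northing
    easting, northing, _, _ = utm.from_latlon(lat, lon)
    goal = PointStamped()
    goal.header.frame_id = "utm"
    goal.header.stamp = rospy.Time(0)  # use the latest available transform
    goal.point.x = easting
    goal.point.y = northing
    # Transform the goal from the utm frame into the map frame
    listener.waitForTransform("map", "utm", rospy.Time(0), rospy.Duration(2.0))
    return listener.transformPoint("map", goal)

if __name__ == "__main__":
    rospy.init_node("gps_goal_sketch")
    listener = tf.TransformListener()
    goal_map = goal_in_map(43.0, -1.5, listener)  # placeholder coordinates
    rospy.loginfo("goal in map frame: x=%.2f y=%.2f", goal_map.point.x, goal_map.point.y)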

Could somebody explain the frames of this package and how I could use them to navigate with my GPS? I would need either the position of the goal relative to the robot, or the global position of both the robot and the goal.

Thank you very much in advance.

2015-06-25 03:24:00 -0500 commented answer robot_localization

I am not sure if this should go here or in an update to the question...

My IMU is rotated (it isn't in its neutral orientation), and I am using a static transform to model this.
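For reference, a minimal Python sketch of how such a static transform could be broadcast (the yaw angle is a placeholder, not my real mounting rotation; a static_transform_publisher in the launch file would do the same job):

#!/usr/bin/env python
# Sketch: broadcast a fixed base_link -> imu transform for a rotated IMU mount.
import rospy
import tf2_ros
from tf.transformations import quaternion_from_euler
from geometry_msgs.msg import TransformStamped

if __name__ == "__main__":
    rospy.init_node("imu_static_tf_sketch")
    broadcaster = tf2_ros.StaticTransformBroadcaster()
    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "base_link"
    t.child_frame_id = "imu"
    # Placeholder mounting rotation: 90 degrees of yaw, no translation
    q = quaternion_from_euler(0.0, 0.0, 1.5708)
    t.transform.rotation.x, t.transform.rotation.y, t.transform.rotation.z, t.transform.rotation.w = q
    t.transform.translation.x = 0.0
    t.transform.translation.y = 0.0
    t.transform.translation.z = 0.0
    broadcaster.sendTransform(t)
    rospy.spin()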

2015-06-24 06:43:02 -0500 received badge  Scholar (source)
2015-06-24 06:34:31 -0500 received badge  Famous Question (source)
2015-06-18 04:52:44 -0500 received badge  Editor (source)
2015-06-18 02:59:24 -0500 received badge  Notable Question (source)
2015-06-17 13:04:01 -0500 received badge  Popular Question (source)
2015-06-17 09:26:15 -0500 asked a question robot_localization

I am trying to use the robot_localization package to estimate the position of my robot. First I am going to explain what I have.

Sensors:

IMU publishing sensor_msgs::Imu on the /imu_rectificada topic. The IMU frame is ENU: 0 degrees when Y points north, and Z is up. I will fill in the covariances once the node is working.

---
header: 
  seq: 191036
  stamp: 
    secs: 1434547057
    nsecs: 229000000
  frame_id: /imu
orientation: 
  x: -0.628702991434
  y: -0.315484655538
  z: 0.321512012361
  w: 0.633902097503
orientation_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
angular_velocity: 
  x: -0.0138241518289
  y: -0.0111814923584
  z: 0.0322032161057
angular_velocity_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
linear_acceleration: 
  x: -0.0442927330732
  y: -9.97963142395
  z: 0.102352119982
linear_acceleration_covariance: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
---

Wheel odometry publishing nav_msgs::Odometry on the /odom topic. It starts at the zero position.

---
header: 
  seq: 110736
  stamp: 
    secs: 1434546465
    nsecs: 469210289
  frame_id: /odom
child_frame_id: /base_link
pose: 
  pose: 
    position: 
      x: 276.840007856
      y: 0.0
      z: 0.0
    orientation: 
      x: 0.0
      y: 0.0
      z: 0.0
      w: 1.0
  covariance: [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0]
twist: 
  twist: 
    linear: 
      x: 0.1
      y: 0.0
      z: 0.0
    angular: 
      x: 0.0
      y: 0.0
      z: 0.0
  covariance: [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0]
---

tf:

I publish the transform /odom -> /base_link using only the odometry information, so it does not contain any global information. I could change this using the information from the IMU, but at the moment I am not interested in global positioning.

Here is my first question: where should I set the transform for the IMU? I am trying a transform /base_link -> /imu where the imu frame is just the base_link frame rotated by the orientation reported by the IMU.
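A rough Python sketch of what I mean (not tested); it just re-broadcasts the orientation reported on /imu_rectificada as the /base_link -> /imu transform:

#!/usr/bin/env python
# Sketch: publish base_link -> imu using the orientation reported by the IMU.
import rospy
import tf
from sensor_msgs.msg import Imu

def imu_callback(msg, broadcaster):
    q = msg.orientation
    broadcaster.sendTransform(
        (0.0, 0.0, 0.0),           # no translation between base_link and imu
        (q.x, q.y, q.z, q.w),      # rotation reported by the IMU
        msg.header.stamp,
        "imu",                     # child frame
        "base_link")               # parent frame

if __name__ == "__main__":
    rospy.init_node("imu_tf_sketch")
    broadcaster = tf.TransformBroadcaster()
    rospy.Subscriber("/imu_rectificada", Imu, imu_callback, callback_args=broadcaster)
    rospy.spin()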

Q1: Is the transform I am publishing with the odometry correct? Where should I set the transform for the IMU?

My launch file is:

<launch>
    <node pkg="robot_localization" type="ukf_localization_node" name="ukf_localization" clear_params="true">

      <!-- My odometry publish at this frequency -->
      <param name="frequency" value="10"/>
      <param name="sensor_timeout" value="0.15 ...
(more)
2015-06-17 08:06:24 -0500 received badge  Supporter (source)
2015-06-16 09:54:58 -0500 received badge  Enthusiast