
Jim Rothrock's profile - activity

2021-01-03 12:19:28 -0500 received badge  Nice Question (source)
2020-07-02 20:07:32 -0500 edited answer Mysterious state inputs for ekf_localization_node

I could not get ekf_localization_node to repeat this behavior, so I am deleting this issue.

2020-07-02 20:05:13 -0500 answered a question Mysterious state inputs for ekf_localization_node

I could not get ekf_localization_node to repeat this behavior, so I am closing this issue.

2020-07-02 20:05:13 -0500 received badge  Rapid Responder (source)
2020-07-01 19:52:17 -0500 asked a question Mysterious state inputs for ekf_localization_node

Mysterious state inputs for ekf_localization_node The ekf_localization_node is created in my launch file like this: …

2020-03-20 13:11:06 -0500 received badge  Good Answer (source)
2019-12-11 17:44:23 -0500 received badge  Good Question (source)
2019-10-03 14:46:56 -0500 received badge  Famous Question (source)
2018-07-16 22:07:40 -0500 received badge  Famous Question (source)
2018-07-16 22:02:26 -0500 received badge  Good Question (source)
2017-10-29 12:09:43 -0500 received badge  Good Answer (source)
2017-08-08 09:08:25 -0500 received badge  Nice Answer (source)
2016-03-14 03:28:48 -0500 received badge  Nice Question (source)
2016-03-14 03:26:43 -0500 received badge  Notable Question (source)
2016-03-13 01:53:17 -0500 received badge  Notable Question (source)
2016-03-10 14:14:07 -0500 received badge  Great Answer (source)
2015-11-19 04:00:00 -0500 received badge  Popular Question (source)
2015-11-12 19:27:11 -0500 received badge  Nice Answer (source)
2015-11-03 14:25:45 -0500 answered a question Using wheel encoders and IMU with robot_localization

The robot_localization path diverges from the encoder path because the robot_localization path includes data from the IMU. The robot_localization path might actually be closer to the robot's true path than the encoder path. Generally, the odometry path generated by wheel encoders will diverge from a robot's true path by some amount, and the path generated from IMU measurements will diverge by some other amount. robot_localization merges the data from the encoders and the IMU, generating a path that, ideally, is closer to the true path than the encoder or IMU paths alone.
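The weighting idea behind that fusion can be illustrated with a toy blend of two yaw estimates (a crude complementary filter in plain Python; the real node runs a full EKF with per-sensor covariances, so this is only a sketch, and `alpha` is an assumed blending weight):

```python
import math

def fuse_yaw(encoder_yaw, imu_yaw, alpha=0.7):
    """Blend two yaw estimates (radians); alpha weights the IMU.

    Blending is done on the unit circle so headings near +/-pi do not
    wrap incorrectly.
    """
    x = alpha * math.cos(imu_yaw) + (1.0 - alpha) * math.cos(encoder_yaw)
    y = alpha * math.sin(imu_yaw) + (1.0 - alpha) * math.sin(encoder_yaw)
    return math.atan2(y, x)
```

With alpha between 0 and 1, the fused heading lands between the two inputs, closer to whichever source is trusted more.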

2015-10-30 02:59:40 -0500 received badge  Popular Question (source)
2015-10-29 19:46:13 -0500 commented question Using wheel encoders and IMU with robot_localization

"...they seem off under the settings shown in the launch file..."

By "off," are you referring to the fact that the paths diverge?

2015-10-29 19:32:00 -0500 asked a question What is the purpose of the architecture_independent tag?

I've read the description in REP 140, but I'm still not clear about the meaning of architecture_independent. My interpretation was that the package contains no files that require a particular CPU architecture. An example of a non-architecture-independent package would be one that contains C++ source code that accesses data in a manner that requires a little-endian CPU architecture. Since none of my C++ code is that architecture-specific, I've been specifying architecture_independent in my package.xml files. After reading some discussions about architecture_independent, though, I now suspect that it means "This package generates no architecture-specific object files at build time." I'd be grateful if someone could provide clarification.

2015-10-12 09:30:12 -0500 received badge  Famous Question (source)
2015-09-28 01:24:12 -0500 received badge  Notable Question (source)
2015-09-21 16:33:16 -0500 received badge  Popular Question (source)
2015-09-20 13:11:39 -0500 asked a question Problem with running Gazebo in a Docker container

I want to be able to test my robot code with Gazebo on both ROS Indigo and Jade on the same machine. Indigo and Jade use versions of Gazebo that cannot be installed simultaneously, so I'm trying to do my Jade testing in a Docker container. I've installed the Jade image like this:

docker pull osrf/ros:jade-desktop-full

I then run a Bash shell in a Docker container:

docker run -it osrf/ros:jade-desktop-full bash

In the shell, I launch Gazebo:

roslaunch gazebo_ros empty_world.launch

I get this error:

/opt/ros/jade/lib/gazebo_ros/gzserver: 22: .: /opt/ros/jade/lib/gazebo_ros/gzclient: 17: .: Can't open /share/gazebo//setup.sh

This problem seems to be caused by the lack of a gazebo.pc file in the container's /usr/lib/pkgconfig directory. Is there some step that I missed regarding running Gazebo in a Docker container? I'm aware that I'll have to do more set-up in order to allow Gazebo to access the X server, but right now I just want to solve the gazebo.pc problem.

2015-09-07 19:06:05 -0500 answered a question Planner for Car-Like Robot

I suggest writing global and local planners that support an Ackermann vehicle. You'll have to extend move_base so that it can publish AckermannDriveStamped messages to ackermann_cmd instead of publishing Twist messages to cmd_vel. You might want to write some recovery behaviors that will work with an Ackermann vehicle.
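For illustration, the core of such a Twist-to-Ackermann conversion might look like this (plain Python with a hypothetical function name; `wheelbase` is an assumed vehicle parameter, and the bicycle-model relation steering = atan(wheelbase * omega / v) is used):

```python
import math

def twist_to_ackermann(v, omega, wheelbase):
    """Convert a Twist-style command (linear x velocity v, angular z
    velocity omega) to (speed, steering_angle) for a simple bicycle
    model with the given wheelbase in meters."""
    if abs(v) < 1e-6:
        # An Ackermann vehicle cannot turn in place, so a pure rotation
        # command is dropped and the wheels are held straight.
        return 0.0, 0.0
    steering_angle = math.atan(wheelbase * omega / v)
    return v, steering_angle
```

A move_base extension would apply something like this before publishing AckermannDriveStamped messages; clamping the steering angle to the vehicle's physical limit would also be needed in practice.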

2015-08-18 08:41:46 -0500 marked best answer latch=True in rospy.Publisher() appears to have no effect

I have a joint which I want to move to a position when the robot is initialized. The joint is never moved again. Therefore, I do this:

self._pub = rospy.Publisher("joint/command", Float64, latch=True)
self._pub.publish(45.0)

However, the subscriber (a gazebo_ros_control joint position controller) does not move the joint unless I add a rospy.sleep(1) call between the two statements above. It is my understanding that latch=True should result in the 45.0 message being stored, ready for any future subscribers to joint/command. Is this a bug, or is there something about latching that I don't understand? I'm using ROS Hydro on Ubuntu 12.04 LTS.
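For reference, here is a toy model of what latching is intended to do (plain Python, not the rospy API): the last published message is stored and handed to any subscriber that connects later. My understanding is that the sleep in the question matters because publish() can run before the subscriber's connection is established; latching only helps once the connection exists.

```python
class LatchedPublisher:
    """Toy model of topic latching, mimicking latch=True semantics."""

    def __init__(self):
        self._last = None
        self._subscribers = []

    def publish(self, msg):
        # Store the message so late subscribers can still receive it.
        self._last = msg
        for callback in self._subscribers:
            callback(msg)

    def subscribe(self, callback):
        self._subscribers.append(callback)
        if self._last is not None:
            # A late joiner immediately gets the latched message.
            callback(self._last)

received = []
pub = LatchedPublisher()
pub.publish(45.0)              # published before anyone subscribes
pub.subscribe(received.append) # late subscriber still sees 45.0
```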

2015-08-11 07:54:39 -0500 received badge  Nice Answer (source)
2015-08-10 13:21:21 -0500 answered a question How to use robot_localization with ar_track_alvar?

But how to come up with realistic covariance information

Just set the covariance matrix diagonal values to some small value, such as 0.0001, and set the non-diagonal values to 0.0. Alternatively, you can set all the matrix values to 0.0, and ekf_localization_node will automatically set them to a small, but nonzero, value.
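A quick sketch of building such a matrix (plain Python with a hypothetical helper name; PoseWithCovariance stores the 6x6 covariance as a flat row-major array of 36 values):

```python
def diagonal_covariance(variance=1e-4, dim=6):
    """Build a flat row-major dim x dim covariance matrix with
    `variance` on the diagonal and zeros everywhere else."""
    # Diagonal entries of a flattened square matrix fall at indices
    # that are multiples of (dim + 1): 0, 7, 14, ... for a 6x6.
    return [variance if i % (dim + 1) == 0 else 0.0
            for i in range(dim * dim)]
```

The resulting list can be assigned directly to a message's covariance field.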

2015-08-05 11:58:23 -0500 answered a question Where to find good Dynamixel models (license)?

I don't know how they are licensed, but CAD files are available at http://en.robotis.com/BlueAD/board.ph... .

2015-08-05 11:41:32 -0500 commented answer ackermann_vehicle_description does not contain the URDF file

I'd also suggest adding the architecture_independent tag to the export

Only the metapackage package.xml doesn't have such a tag. I didn't think it was intended for metapackages, but http://www.ros.org/reps/rep-0140.html... says otherwise. I'll add the tag.

2015-08-05 11:19:39 -0500 commented answer ackermann_vehicle_description does not contain the URDF file

Looking at the history of the CMakeLists.txt in ackerman_vehicle_description that seems unlikely

I think what has been happening is that users have been forking the Github repository, so I've never noticed the CMakeLists.txt problem.

2015-07-28 01:22:53 -0500 asked a question ackermann_vehicle_description does not contain the URDF file

I distributed the ros-indigo-ackermann-vehicle, ros-indigo-ackermann-vehicle-description, and ros-indigo-ackermann-vehicle-gazebo Ubuntu packages months ago. It's been brought to my attention that when these packages are installed, most of their files are missing. For example, /opt/ros/indigo/share/ackermann_vehicle_description contains only package.xml and a cmake directory. I think that distribution of ackermann_vehicle as Ubuntu packages used to work. Here is a link to my ackermann_vehicle and ackermann_vehicle-release Git repositories: https://github.com/wunderkammer-labor... . I'd appreciate it if someone could let me know why correct Ubuntu packages are not being built.

2015-06-17 05:24:15 -0500 received badge  Famous Question (source)
2015-05-02 14:00:24 -0500 marked best answer Can the camera position be specified in an SDF file?

I am converting a Gazebo 0.x world file into Gazebo 1.x SDF format. In the original file, I was able to specify the initial camera position like this:

<rendering:gui>
  <frames>
    <row height="100%">
      <camera width="100%">
        <xyz>85.52 3.6 1.92</xyz>
        <rpy>0 23.8 69.9</rpy>
      </camera>
    </row>
  </frames>
</rendering:gui>

Does SDF provide similar functionality? I can't find any in the SDF documentation.

2015-04-21 06:21:44 -0500 received badge  Notable Question (source)
2015-04-20 21:05:30 -0500 answered a question ekf_localization_node exhibits unexpected behavior

none of your sensor data has any covariance values filled out

Each sensor now provides variance values on the diagonal of its covariance matrix. The off-diagonal entries are zero. This had no noticeable effect.

you don't have anything measuring Z or Z velocity, which is a big problem.

The wheel encoder odometry now provides the z velocity, which is always zero. This had no noticeable effect.

if it still misbehaves, try feeding some absolute orientation data to your odom instance of ekf_localization_node

This resulted in the same translation discontinuity that occurred when the absolute orientation data was provided by the localization instance of ekf_localization_node.

I fixed the problem by replacing the localization ekf_localization_node instance with a nodelet that subscribes to odometry/filtered and the pose provided by the solar compass. The nodelet accumulates the twist data from odometry/filtered, replaces the orientation component of the estimated pose when a pose message arrives, and publishes the map-to-odom transform.
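A 2-D sketch of that accumulate-and-replace scheme (plain Python, hypothetical class name; the real nodelet additionally publishes the map-to-odom transform):

```python
import math

class PoseAccumulator:
    """Integrate twist velocities into a pose, and overwrite only the
    heading when an absolute orientation measurement arrives."""

    def __init__(self):
        self.x = self.y = self.yaw = 0.0

    def on_twist(self, vx, wz, dt):
        # Integrate the body-frame forward velocity into the map frame.
        self.x += vx * math.cos(self.yaw) * dt
        self.y += vx * math.sin(self.yaw) * dt
        self.yaw += wz * dt

    def on_absolute_yaw(self, yaw):
        # Replace only the orientation; the accumulated translation is
        # left untouched, so past path segments are never rotated.
        self.yaw = yaw
```

After an absolute heading update, only future motion is integrated along the new heading; the path already traveled stays where it was.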

2015-04-13 21:24:20 -0500 answered a question Why are my comments on ROS Answers disappearing?

The same thing is happening to me.

2015-04-13 21:11:52 -0500 received badge  Popular Question (source)
2015-04-10 14:30:50 -0500 asked a question ekf_localization_node exhibits unexpected behavior

I have two instances of ekf_localization_node. The first, used for odometry, reads the x velocity (in the base_link frame) from the wheel encoder odometry topic and publishes the odom-to-base_link transform. The second node, used for absolute orientation, reads the x velocity in the same manner as the first node, but also reads the roll, pitch, and yaw angles from the solar compass topic (a PoseWithCovarianceStamped message). The absolute orientation node publishes the map-to-odom transform. Furthermore, the absolute orientation node does not receive poses from the solar compass when the compass is in shadow.

What I am doing:

  1. Drive the robot forward until it is in shadow.
  2. Make a 90-degree right turn.
  3. Drive the robot forward until it is out of shadow.

I am using RViz to display odometry/filtered (published by the odometry ekf_localization node), with the fixed frame set to "map".

What I expect:

  1. The odometry display draws arrows that correspond to the robot's path as it moves into shadow.
  2. The 90-degree right turn adds no arrows to the odometry path, because the robot is not translating and, as far as the absolute orientation node can tell, not rotating. Since the solar compass is in shadow, it does not publish any poses that would be received by the absolute orientation node.
  3. When I resume driving the robot forward, arrows should continue to be added to the odometry path, as if the right turn never took place. When the robot moves out of shadow, arrows added after leaving the shadow should be oriented 90 degrees right, relative to the earlier arrows. The new arrows should form a straight line pointing right.

From above, the result should look like this:

X>>>>>>>>>>
^
^
^
R->->->->->->
^
^
E
^
^
^

E is the point where the robot enters the shadow, R is where the right turn is made, and X is the point where the robot leaves the shadow. The "->" symbols show the actual path of the robot after the right turn.

What actually happens:

X
^
^
^
R->->->->->->
^
^
E______Y>>>>>>>>>>
^
^
^

The underscores are supposed to be spaces, but I couldn't get non-breaking spaces to work.

When the robot exits the shadow at point X, it is then translated to point Y. It appears that the robot's path between E and X is rotated, instead of just the robot itself. I suspect that the sum of all the position changes caused by x velocity messages since the last pose (orientation) message are being rotated around the position that was current when the last pose message was received. Is this expected behavior? I am using the robot_localization package from the ROS Indigo binary release.
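A quick numeric check of that suspicion (plain Python, illustrative coordinates only): rotating the whole accumulated displacement about E, rather than just the robot's heading, reproduces the observed jump from X to Y.

```python
import math

def rotate_about(px, py, cx, cy, angle):
    """Rotate point (px, py) about pivot (cx, cy) by `angle` radians."""
    dx, dy = px - cx, py - cy
    c, s = math.cos(angle), math.sin(angle)
    return cx + c * dx - s * dy, cy + s * dx + c * dy

# Suppose the robot enters shadow at E = (0, 0) and drives 3 m "north"
# (+y) to X = (0, 3).  If the new absolute heading (90 degrees right,
# i.e. -pi/2 in an x-east/y-north frame) is applied by rotating the
# whole E-to-X displacement about E, the robot lands at (3, 0),
# matching the observed jump from X to Y.
yx, yy = rotate_about(0.0, 3.0, 0.0, 0.0, -math.pi / 2)
```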

2015-03-30 12:34:07 -0500 answered a question Cannot find controller (following PR2/Gazebo/Quick Start recipe)

Here is a working example from my robot. This is the arm section of the robot's ctrlr_params.yaml file:

arm_ctrlr:
  type: position_controllers/JointTrajectoryController
  joints:
    - arm_base_joint
    - shoulder_joint
    - elbow_joint
    - wrist_roll_joint
    - wrist_pitch_joint
    - wrist_yaw_joint
  stop_trajectory_duration: 0.1
  state_publish_rate: 1.0

Here is the transmission definition for arm_base_joint from the robot's URDF file:

  <transmission name="arm_base_trans">
    <type>transmission_interface/SimpleTransmission</type>
    <joint name="arm_base_joint">
      <hardwareInterface>PositionJointInterface</hardwareInterface>
    </joint>
    <actuator name="arm_base_act">
      <mechanicalReduction>1</mechanicalReduction>
    </actuator>
  </transmission>