
rbtying's profile - activity

2012-09-04 22:51:29 -0500 received badge  Famous Question (source)
2012-08-20 07:11:10 -0500 received badge  Famous Question (source)
2012-03-28 19:55:30 -0500 received badge  Notable Question (source)
2012-02-15 18:55:50 -0500 received badge  Notable Question (source)
2011-10-20 08:36:15 -0500 received badge  Popular Question (source)
2011-07-30 21:16:24 -0500 received badge  Popular Question (source)
2011-05-19 13:34:48 -0500 answered a question Calibrating Kinect with surroundings

Try making a URDF model and a joint state publisher for your robot arm; the tf library will then take care of whatever transformations you need.

URDF tutorials
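
For a concrete starting point, here is a minimal sketch of such a launch file. The package name, URDF path, and node names are placeholders, and in Diamondback-era releases the robot_state_publisher executable is called state_publisher; adjust for your setup:

    <launch>
      <!-- placeholder package and file names: substitute your own arm description -->
      <param name="robot_description" textfile="$(find my_arm_description)/urdf/arm.urdf" />

      <!-- publishes sensor_msgs/JointState for the joints defined in the URDF -->
      <node pkg="joint_state_publisher" type="joint_state_publisher" name="joint_state_publisher" />

      <!-- combines robot_description and the joint states into tf frames -->
      <node pkg="robot_state_publisher" type="state_publisher" name="robot_state_publisher" />
    </launch>

Once the frames are on tf, you can query the transform between any two frames with a tf listener or with rosrun tf tf_echo.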

2011-05-12 01:34:29 -0500 answered a question roslaunch using find packages on different machines looks for packages on local machine

This is expected behavior: roslaunch evaluates substitution arguments such as $(find ...) on the local computer before the resolved launch configuration is sent over SSH to the remote machine and executed there.
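
To illustrate (with made-up machine and package names): the $(find ...) below is expanded by the roslaunch process you run locally, so it resolves against the local package tree even though the node itself starts on the remote machine.

    <launch>
      <!-- "remote-host", "my_pkg", and "my_node" are placeholders;
           ros-root / ros-package-path describe the remote install -->
      <machine name="remote" address="remote-host" user="ros"
               ros-root="/opt/ros/diamondback/ros"
               ros-package-path="/opt/ros/diamondback/stacks" />

      <!-- $(find my_pkg) is resolved on the machine running roslaunch,
           not on "remote", so the path below is a local path -->
      <node machine="remote" pkg="my_pkg" type="my_node" name="my_node"
            args="$(find my_pkg)/config/params.yaml" />
    </launch>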

2011-04-26 06:04:05 -0500 marked best answer How do you configure uvc_camera?

It looks like you are setting the image size to something different from what you calibrated with: lifecam.ini was calibrated at 640x480, but you have the camera configured for 320x240. The calibration size needs to match the image size; if I recall correctly, camera_info_manager doesn't handle the current image size differing from the calibrated size.

Try setting the camera to 640x480 and see if the problem still occurs. If that fixes it and you still want 320x240, set the resolution to 320x240, recalibrate, and use the new file. A shortcut could be to just edit the width and height values in your calibration file, but there is no guarantee the calibration will still be correct at a different image size.
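
To try the first suggestion, forcing the camera back to the calibrated 640x480 would look something like this in the launch file (a sketch; the exact parameter names depend on your uvc_camera version, so double-check them against the package documentation):

    <group ns="lifecam">
      <node name="lifecam" pkg="uvc_camera" type="camera_node" output="screen">
        <!-- match the resolution the calibration file was produced at -->
        <param name="width" type="int" value="640" />
        <param name="height" type="int" value="480" />
      </node>
    </group>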

2011-04-24 15:47:41 -0500 commented answer How install SwissRanger?
Since I don't actually have a SwissRanger camera, I can't tell you for sure, but I think you should try connecting it over USB first, and then figure out how to migrate it to Ethernet.
2011-04-23 06:35:33 -0500 received badge  Nice Answer (source)
2011-04-21 14:38:45 -0500 received badge  Teacher (source)
2011-04-21 04:03:35 -0500 answered a question How install SwissRanger?

Take a look at the Tutorial; it goes through all the steps of setting up the driver.

As a quick note, most ROS stacks don't have a download link - you're expected to download the source (via svn, hg, or git) and compile (or cross-compile) it for your particular system.

2011-04-21 03:53:04 -0500 commented answer How do you configure uvc_camera?
Thanks! I edited my question above to show the newer data - while your suggestions definitely make sense, they don't appear to have solved the problem. Are there any other debug files I should be looking at?
2011-04-20 15:01:10 -0500 commented answer How do you configure uvc_camera?
It's probably worth noting that it did accept the calibration from cameracalibrator.py's 'commit'; it just didn't keep it after the instance was killed. Also, NONE of the params are being set - changing the frame doesn't work, nor the dimensions, nor the framerate.
2011-04-20 14:59:41 -0500 commented answer How do you configure uvc_camera?
I tried setting width/height to 640x480, as well as changing the ini file's parameters, but there's no change: the camera_info still says 640x480 (regardless of what I set height/width to in both the launch and ini files), and the other values are all zero.
2011-04-20 14:38:06 -0500 commented question How do you configure uvc_camera?
Yes, I did - see above for the lifecam.ini
2011-04-20 12:12:25 -0500 asked a question How do you configure uvc_camera?

I'm trying to get my camera_node working, and it doesn't seem to be taking the given parameters into account at all.

My launch file (excerpt):

    
  <group ns="lifecam">
    <node name="lifecam" pkg="uvc_camera" type="camera_node" output="screen">
    </node>
  </group>

Output from rostopic echo /lifecam/camera_info

header: 
  seq: 463
  stamp: 
    secs: 1303400952
    nsecs: 715715842
  frame_id: camera
height: 480
width: 640
distortion_model: ''
D: []
K: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
R: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
P: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
binning_x: 0
binning_y: 0
roi: 
  x_offset: 0
  y_offset: 0
  height: 0
  width: 0
  do_rectify: False
---

Clearly the calibration data is not being read (thus a bunch of zeros in the camera_info message), and the width/height parameters are the defaults, not the values I set them to.

I'm not sure what I'm doing wrong here: I can view the image stream in rviz, but rviz will complain about NaNs and infs in the calibration data (since it's not being read). Help would be greatly appreciated.

lifecam.yaml

$ cat rover/lifecam.yaml 
---
image_width: 640
image_height: 480
camera_name: camera
camera_matrix:
  rows: 3
  cols: 3
  data: [661.285, 0, 391.756, 0, 650.523, 205.652, 0, 0, 1]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [0.125743, 0.237875, -0.0083126, 0.0506531, 0]
rectification_matrix:
  rows: 3
  cols: 3
  data: [1, 0, 0, 0, 1, 0, 0, 0, 1]
projection_matrix:
  rows: 3
  cols: 4
  data: [661.285, 0, 391.756, 0, 0, 650.523, 205.652, 0, 0, 0, 1, 0]

I just updated the above data as per Markus Bader's recommendations - it doesn't seem to have worked. The camera was recalibrated, and it is clearly not reading the params I set: when set_camera_info is called, it saves the data into /tmp/calibration.yaml, not the file set in the launch file. Thanks for all the help!

-- EDIT -- This seems to be some sort of weird rosparam problem - I just tried setting the parameters by editing the defaults in /opt/ros/diamondback/stacks/camera_umd/uvc_camera/src/camera.cpp, and the calibration data and everything else work from there.

2011-03-13 18:44:15 -0500 marked best answer rviz odometry is doubling angle?

I think that you may be confusing your datatypes.

rviz takes in a pose with an (x,y,z) position and an (x,y,z,w) orientation. The orientation is represented by a quaternion. For a quick refresher on quaternions, check out the Wikipedia page.

Probably what is happening is that you are incorrectly publishing your quaternions.

If you have a given set of rotations in roll/pitch/yaw, you can convert them to a quaternion using the tf.transformations.quaternion_from_euler function. By default it takes the Euler angles in 'sxyz' order (roll, pitch, then yaw).

For a bit of example code (in Python):

import tf
from nav_msgs.msg import Odometry

# theta is the yaw (rotation about z) computed from the drive kinematics
quat = tf.transformations.quaternion_from_euler(0, 0, theta)

### Insert the quaternion into the Odom msg so it can be published
odom_msg = Odometry()
odom_msg.pose.pose.orientation.x = quat[0]
odom_msg.pose.pose.orientation.y = quat[1]
odom_msg.pose.pose.orientation.z = quat[2]
odom_msg.pose.pose.orientation.w = quat[3]

This is for a differential drive robot, where there is no roll and pitch, and the kinematics are calculated shortly before this snippet of code.

So as an example:

 >>> tf.transformations.quaternion_from_euler(0,0,0)
 array([ 0.,  0.,  0.,  1.])
 >>> tf.transformations.quaternion_from_euler(0,0,math.pi/2)
 array([ 0.        ,  0.        ,  0.70710678,  0.70710678])
 >>> tf.transformations.quaternion_from_euler(0,0,2*math.pi)
 array([ -0.00000000e+00,   0.00000000e+00,   1.22460635e-16, -1.00000000e+00])

So, just keep in mind that all parts of the quaternion (x,y,z,w) have to be filled out, not just orientation.z.

2011-03-13 18:44:06 -0500 marked best answer rviz odometry is doubling angle?

Your code looks fine. The problem here is your interpretation of the "orientation", which is in fact a quaternion, not Euler angles. Therefore, orientation.z is not the rotation about the z axis, but rather a component of the quaternion. For instance, using the tf.transformations module in Python, we can see that when your marker is perpendicular (an Euler rotation about z of 1.57 radians), we have:

quaternion_from_euler(0,0,1.57) = array([ 0. , 0. , 0.70682518, 0.70738827])

Quaternion components are labeled (x,y,z,w), so the quaternion's "z" component will be 0.707 as you saw.

If you run the following (while your node is publishing tf messages):

rosrun tf tf_echo odom base_link

you can see the transformation in both Euler-angle and quaternion form.

2011-03-13 18:44:06 -0500 received badge  Scholar (source)
2011-03-13 18:38:08 -0500 received badge  Supporter (source)
2011-03-13 18:37:58 -0500 commented answer rviz odometry is doubling angle?
Thanks! That makes sense - I must have my odometry calibrated wrong then (short of rederiving the kinematics, I guess I'll just use a fudge factor of 2.0). I've never really worked with quaternions before, so I was confused.
2011-03-13 14:46:24 -0500 commented answer rviz odometry is doubling angle?
I added the relevant code (C++) to the question - am I making a mistake?
2011-03-13 14:45:10 -0500 received badge  Editor (source)
2011-03-13 09:54:49 -0500 asked a question rviz odometry is doubling angle?

Hello all, I'm having some problems visualizing odometry with rviz. The problem I am encountering is that rviz appears to be doubling the angle given by the odometry message published:

When the orientation.z of the odometry message is 0.00 (as viewed by rostopic echo odom), the arrow in rviz is aligned vertically with a gridline, as expected

When the orientation.z of the odometry message is 1.58, the arrow in rviz is anti-parallel to the 0.00 arrow from before (~180 degree rotation = ~3.14 radians is approx. 2 * 1.58 radians)

When the arrow on rviz is perpendicular to the initial 0.00 arrow, the orientation.z reads about .707 radians = ~40 degrees.

I'm not too sure if this is a configuration error or not - I originally thought I had written my odometry publisher incorrectly until I started echoing the topic.

Any help would be greatly appreciated.

Relevant Code:

                //since all odometry is 6DOF we'll need a quaternion created from yaw
                geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(
                                bot->m_odometry_yaw);

                //first, we'll publish the transform over tf
                geometry_msgs::TransformStamped odom_trans;
                odom_trans.header.stamp = current_time;
                odom_trans.header.frame_id = "odom";
                odom_trans.child_frame_id = "base_link";

                odom_trans.transform.translation.x = bot->m_odometry_x;
                odom_trans.transform.translation.y = bot->m_odometry_y;
                odom_trans.transform.translation.z = 0.0;
                odom_trans.transform.rotation = odom_quat;

                //send the transform
                odom_broadcaster.sendTransform(odom_trans);

                //next, we'll publish the odometry message over ROS
                nav_msgs::Odometry odom;
                odom.header.stamp = current_time;
                odom.header.frame_id = "odom";

                //set the position
                odom.pose.pose.position.x = bot->m_odometry_x;
                odom.pose.pose.position.y = bot->m_odometry_y;
                odom.pose.pose.position.z = 0.0;
                odom.pose.pose.orientation = odom_quat;

                //set the velocity
                odom.child_frame_id = "base_link";
                odom.twist.twist.linear.x = bot->m_velocity_x;
                odom.twist.twist.linear.y = bot->m_velocity_y;
                odom.twist.twist.angular.z = bot->m_velocity_yaw;

                //publish the message
                odom_pub.publish(odom);