
astaranowicz's profile - activity

2019-02-28 09:54:35 -0500 received badge  Great Answer (source)
2019-02-28 09:54:35 -0500 received badge  Guru (source)
2016-10-25 00:26:44 -0500 received badge  Nice Question (source)
2015-12-01 09:21:55 -0500 received badge  Critic (source)
2015-10-30 18:03:44 -0500 marked best answer Weird PR2 Image Generation on all cameras

Hi all,

I attached an image from rviz of the PR2 looking at a bookshelf, taken from the /head_mount_kinect/rgb/image_raw topic. As you can see, the Kinect's RGB image has pink blobs over it. Does anyone know what these pink blobs are? They don't seem to be depth data or anything else.

(attached: rviz screenshot showing the pink blobs)

By the way, I'm using Ubuntu 12.04 with the latest ROS.

Also, I can't use any feature detector such as SIFT, SURF, GFTT, etc. on these images, or on images from any other camera I bring up in Gazebo. I don't know if it is a problem with Gazebo or with ROS.
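For reference, here is a minimal sketch of the kind of test I mean: it subscribes to the /head_mount_kinect/rgb/image_raw topic mentioned above and runs goodFeaturesToTrack as a stand-in for SIFT/SURF (detector choice and parameters are arbitrary, just for illustration):

#!/usr/bin/env python
# Minimal sketch: subscribe to the simulated RGB image and run a feature detector on it.
# Detector choice and parameters are arbitrary; the topic is the one from the question above.
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()

def callback(msg):
    # Convert the ROS image to an OpenCV BGR image
    img = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Good Features to Track as a simple stand-in for SIFT/SURF
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=200, qualityLevel=0.01, minDistance=5)
    rospy.loginfo("detected %d corners", 0 if corners is None else len(corners))

rospy.init_node("feature_check")
rospy.Subscriber("/head_mount_kinect/rgb/image_raw", Image, callback, queue_size=1)
rospy.spin()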

Thanks

2015-06-13 20:25:59 -0500 received badge  Popular Question (source)
2015-06-13 20:25:59 -0500 received badge  Famous Question (source)
2015-06-13 20:25:59 -0500 received badge  Notable Question (source)
2015-03-06 10:33:40 -0500 received badge  Famous Question (source)
2014-11-06 20:12:19 -0500 received badge  Enlightened (source)
2014-11-06 20:12:19 -0500 received badge  Good Answer (source)
2014-10-28 03:28:48 -0500 received badge  Taxonomist
2014-01-28 17:30:20 -0500 marked best answer viso2_ros demo.launch

Hi all,

I'm trying out viso2_ros (http://www.ros.org/wiki/viso2_ros?distro=fuerte) by running the demo.launch file that was included. However, it requires a disparity_params.yaml file for the stereo_image_proc node, and I don't see an example of what goes inside it. I tried just putting in camera calibration parameters, but that does not seem right.

Output from the terminal:

aaron@AntaresL:~/Desktop/DRC/ROS_Examples/viso2-fuerte/viso2_ros/launch$ roslaunch viso2_ros demo.launch
... logging to /home/aaron/.ros/log/6948e56a-ae8e-11e2-b858-180373ea7946/roslaunch-AntaresL-6453.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

Invalid roslaunch XML syntax: not well-formed (invalid token): line 12, column 57

What is needed in the disparity_params.yaml to make this program work? Could someone show an example?
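In case it helps: my guess is that the file holds block-matcher settings for stereo_image_proc rather than calibration data. A sketch of what I was expecting, where the parameter names are (as far as I know) the ones stereo_image_proc exposes through dynamic_reconfigure and the values are only placeholders:

# disparity_params.yaml (sketch) -- block-matcher settings for stereo_image_proc,
# not camera calibration. Parameter names as exposed via dynamic_reconfigure
# (double-check them against your version); values are placeholders.
prefilter_size: 9
prefilter_cap: 31
correlation_window_size: 15
min_disparity: 0
disparity_range: 64
uniqueness_ratio: 15
texture_threshold: 10
speckle_size: 100
speckle_range: 4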

2014-01-28 17:29:23 -0500 marked best answer ccny_rgbd_tools launch files for PR2 in Gazebo

Hi,

I would like to use the ccny_rgbd_tools package on the PR2 in Gazebo using the head_mount_kinect.

What launch files would I need in order for this to work?
What would I need to modify in the current launch files?

Sorry, I'm still inexperienced with the PR2 and its topics, so I don't quite understand which topics are required. However, I am still checking.

Thanks.

EDIT:

I updated the vo+mapping.launch file to remap the expected topics:

<node pkg="ccny_rgbd" type="visual_odometry_node" name="visual_odometry_node" output="screen">

<remap from="/rgbd/rgb" to="/head_mount_kinect/rgb/image_raw"/>
<remap from="/rgbd/depth" to="/head_mount_kinect/depth/image_raw"/>
<remap from="/rgbd/info" to="/head_mount_kinect/rgb/camera_info"/>

... rest of the node ... </node>

However, I get this error: (partial)

Frame /r_gripper_motor_slider_link exists with parent /r_gripper_palm_link.
Frame /r_gripper_r_finger_link exists with parent /r_gripper_palm_link.
Frame /r_shoulder_lift_link exists with parent /r_shoulder_pan_link.
Frame /r_shoulder_pan_link exists with parent /torso_lift_link.
Frame /r_wrist_flex_link exists with parent /r_forearm_link.
Frame /torso_lift_motor_screw_link exists with parent /base_link.
Frame /odom_combined exists with parent NO_PARENT.

Most of the frames now report NO_PARENT.

Now I know the ccny_rgbd_tools expects:

RGB image (8UC3) topic
Depth image (16UC1, in millimeters) topic
Camera info topic

However, I don't know if I am using the right topics. Any suggestions?
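A small sketch along these lines should at least show the encodings, so they can be compared against the expected 8UC3 / 16UC1 (the topic names are the ones from my remaps above):

#!/usr/bin/env python
# Sketch: print the encoding and size of the RGB and depth topics so they can be
# compared against what ccny_rgbd expects (8UC3-style RGB, 16UC1 depth in millimeters).
import rospy
from sensor_msgs.msg import Image

def report(name):
    def cb(msg):
        rospy.loginfo("%s: encoding=%s  %dx%d", name, msg.encoding, msg.width, msg.height)
    return cb

rospy.init_node("encoding_check")
rospy.Subscriber("/head_mount_kinect/rgb/image_raw", Image, report("rgb"), queue_size=1)
rospy.Subscriber("/head_mount_kinect/depth/image_raw", Image, report("depth"), queue_size=1)
rospy.spin()

As far as I can tell, the simulated Kinect in Gazebo tends to publish its depth image as 32FC1 (meters) rather than 16UC1 (millimeters), so a mismatch there would not surprise me.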

EDIT_2:

Topics and frames I am using:

<remap from="/rgbd/rgb" to="/head_mount_kinect/rgb/image_raw/compressed"/>
<remap from="/rgbd/depth" to="/head_mount_kinect/depth/image_raw"/>
<remap from="/rgbd/info" to="/head_mount_kinect/rgb/camera_info"/>
<param name="publish_tf" value="false"/>
<param name="fixed_frame" value="/odom"/>
<param name="base_frame" value="/base_link"/>

2014-01-22 20:59:55 -0500 received badge  Famous Question (source)
2014-01-09 09:01:40 -0500 marked best answer Kinect extrinsic calibration (between depth and built-in cameras)

Hi all,

I am having some problems with this tutorial: http://www.ros.org/wiki/openni_launch/Tutorials/ExtrinsicCalibration

I calibrated both the RGB and the depth camera on the Kinect; however, when I try this tutorial, I always get an error that says "Timed out waiting for checkerboard". I searched for this error, but I don't see anyone else having this problem.

I'm not sure if it is because the tutorial is broken for ROS Fuerte. Is anyone else having problems getting this to work?

Or is there a better calibration method available in ROS to calibrate the Kinect? Note: I'm trying to get a better calibration than the manufacturer's, because I need more accuracy (millimeter range) than what is given (centimeter range).

EDIT: I tried to use the Contrast Augmentor using this command:

rosrun contrast contrast_augmenter image:=/camera/ir/image_raw

Then the input to the calibration was:

roslaunch camera_pose_calibration calibrate_2_camera.launch camera1_ns:=/camera/rgb_bag camera2_ns:=/camera/ir_augmented

It's the standard checkerboard, so I should not need to give rows and columns; I did not have to when I did the calibration for the RGB and IR cameras.

I still get the error on the screen "Timed out waiting for checkerboard". Am I missing something here?

EDIT_2:

I tried adding the checkerboard size and the number of columns and rows. The checkerboard still isn't detected.

EDIT_3: attached image (hosted on TinyPic).
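To rule out the detector itself, a sketch along these lines should show whether OpenCV can find the checkerboard in the augmented IR stream at all. The topic name (/camera/ir_augmented/image_raw) and the interior-corner count are assumptions to adjust for the actual setup:

#!/usr/bin/env python
# Sketch: check whether OpenCV can find the checkerboard in the augmented IR stream.
# The topic name and the pattern size (interior corners) are assumptions.
import rospy
import cv2
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

PATTERN = (7, 6)  # interior corners of the board (width, height) -- adjust to your board
bridge = CvBridge()

def callback(msg):
    img = bridge.imgmsg_to_cv2(msg, desired_encoding="passthrough")
    if img.dtype != "uint8":
        # Scale 16-bit or float IR images down to 8-bit for the detector
        img = cv2.convertScaleAbs(img, alpha=255.0 / max(1, img.max()))
    found, corners = cv2.findChessboardCorners(img, PATTERN)
    rospy.loginfo("checkerboard %s", "FOUND" if found else "not found")

rospy.init_node("checkerboard_check")
rospy.Subscriber("/camera/ir_augmented/image_raw", Image, callback, queue_size=1)
rospy.spin()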

2014-01-09 09:01:40 -0500 received badge  Self-Learner (source)
2013-11-14 18:02:18 -0500 received badge  Popular Question (source)
2013-11-14 18:02:18 -0500 received badge  Notable Question (source)
2013-09-10 02:35:40 -0500 received badge  Popular Question (source)
2013-09-10 02:35:40 -0500 received badge  Notable Question (source)
2013-08-14 13:47:07 -0500 received badge  Famous Question (source)
2013-08-01 19:23:17 -0500 received badge  Famous Question (source)
2013-07-27 13:46:09 -0500 received badge  Famous Question (source)
2013-06-24 17:25:17 -0500 asked a question Kinect IR Image Flickers Openni_Launch

I updated openni_launch a couple of days ago as part of the suite of updates to ROS. However, now when I pull the IR image from the Kinect, the image flickers between very low contrast and normal. Is there a fix for this, or is it possible to revert?

2013-06-13 07:12:17 -0500 commented answer Asus Xtion, Stream RGB and IR together

That is for an external RGB camera with the depth camera on the Kinect; it's not the same as using the on-board RGB camera with the depth camera. The extrinsic calibration from the tutorial I posted is quite bad, which is why I want to be able to move the checkerboard around the image instead of keeping it static.

2013-06-12 21:56:48 -0500 received badge  Notable Question (source)
2013-06-12 11:00:49 -0500 received badge  Popular Question (source)
2013-06-12 10:57:11 -0500 commented answer Asus Xtion, Stream RGB and IR together

How can http://www.ros.org/wiki/openni_launch/Tutorials/ExtrinsicCalibration be used effectively? Should I write code that takes in the RGB and IR images, synchronizes them using the Time Synchronizer, and then publishes the synchronized images?
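Something like the following sketch is what I have in mind: subscribe to both streams, pair them with message_filters, and republish the matched pair. The topic names are only examples, and if the stamps never match exactly an ApproximateTimeSynchronizer (where available) would be needed instead of the exact one:

#!/usr/bin/env python
# Sketch: pair RGB and IR images by timestamp and republish them together.
# Topic names are examples only.
import rospy
import message_filters
from sensor_msgs.msg import Image

def callback(rgb_msg, ir_msg):
    # The pair arrives with matching stamps; republish it as-is
    rgb_pub.publish(rgb_msg)
    ir_pub.publish(ir_msg)

rospy.init_node("rgb_ir_sync")
rgb_pub = rospy.Publisher("/camera/rgb_sync/image_raw", Image)
ir_pub = rospy.Publisher("/camera/ir_sync/image_raw", Image)

rgb_sub = message_filters.Subscriber("/camera/rgb/image_raw", Image)
ir_sub = message_filters.Subscriber("/camera/ir/image_raw", Image)
sync = message_filters.TimeSynchronizer([rgb_sub, ir_sub], 10)
sync.registerCallback(callback)
rospy.spin()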

2013-06-12 06:49:19 -0500 asked a question Asus Xtion, Stream RGB and IR together

Hi,

I'm currently trying out the Asus Xtion Pro Live using ROS Fuerte on Ubuntu 12.04.

When I start the Asus Xtion using openni_launch, I get this error:

Cannot stream RGB and IR at the same time. Streaming RGB only.

I thought the Asus Xtion was different from the Kinect in that the Xtion allows RGB and IR streaming at the same time. Why can I not stream them together?

Should I use a different package or something? I did read through some of the ROS Answers posts, but it doesn't seem that anyone has tried this, or it wasn't considered important enough. Is it possible to stream RGB and IR together on the Asus Xtion?

Thanks

2013-06-06 05:51:31 -0500 commented question Does IMU data improve odometry in Gazebo?

The question is a bit unclear. Gazebo can give you a ground-truth pose with respect to the Gazebo-defined origin. If you are defining your own origin and using a package to calculate the odometry, then, depending on the noise, the IMU data can improve or degrade the odometry calculation.
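For example, a sketch along these lines reads the ground-truth model pose that Gazebo publishes (the model name is an example, and the message package has moved between ROS distros, so treat the import as an assumption):

#!/usr/bin/env python
# Sketch: read the ground-truth pose Gazebo publishes for a model.
# Model name is an example; the message package differs between distros.
import rospy
from gazebo_msgs.msg import ModelStates

MODEL = "pr2"  # whatever the robot is called in the Gazebo world

def callback(msg):
    if MODEL in msg.name:
        pose = msg.pose[msg.name.index(MODEL)]
        rospy.loginfo("ground truth: x=%.3f y=%.3f", pose.position.x, pose.position.y)

rospy.init_node("ground_truth_listener")
rospy.Subscriber("/gazebo/model_states", ModelStates, callback, queue_size=1)
rospy.spin()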

2013-06-05 07:00:48 -0500 asked a question RVIZ Image Pixel Selection

Hi all,

I'm working with OpenCV and ROS Fuerte. At the moment, I am displaying all of my images in an OpenCV window for pixel selection, like the camshift_demo selection box. The images (cameras) are also published and viewed in RViz. Is there a plugin or code available to select pixels in a user-defined region of interest in RViz?

Since most of the code publishes image and point cloud topics, I would like to get rid of the OpenCV window and just use the RViz window to handle all of the selections.
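For context, the OpenCV-window selection I would like to replace is roughly the sketch below, similar to the camshift_demo: a mouse callback records a user-drawn rectangle on the displayed image (the window name and the image source are placeholders):

#!/usr/bin/env python
# Rough sketch of the OpenCV-window ROI selection to be replaced by RViz:
# a mouse callback records a user-drawn rectangle on the displayed image.
import cv2

selection = None   # (x0, y0, x1, y1) of the dragged rectangle
drag_start = None

def on_mouse(event, x, y, flags, param):
    global selection, drag_start
    if event == cv2.EVENT_LBUTTONDOWN:
        drag_start = (x, y)
    elif event == cv2.EVENT_MOUSEMOVE and drag_start is not None:
        selection = (drag_start[0], drag_start[1], x, y)
    elif event == cv2.EVENT_LBUTTONUP:
        drag_start = None
        print("selected ROI: %s" % (selection,))

img = cv2.imread("frame.png")  # stand-in for the image coming from the camera topic
if img is None:
    raise SystemExit("replace frame.png with a real image")

cv2.namedWindow("image")
cv2.setMouseCallback("image", on_mouse)

while True:
    view = img.copy()
    if selection is not None:
        x0, y0, x1, y1 = selection
        cv2.rectangle(view, (x0, y0), (x1, y1), (0, 255, 0), 2)
    cv2.imshow("image", view)
    if cv2.waitKey(30) & 0xFF == 27:  # Esc to quit
        break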

2013-06-03 02:16:20 -0500 received badge  Notable Question (source)