
P.Naughton's profile - activity

2017-04-11 03:44:35 -0500 received badge  Nice Question (source)
2013-07-25 18:53:17 -0500 commented answer How to interpate the IMU raw data?

There is code to do this conversion posted on this ROS Answers page: http://answers.ros.org/question/11545/plotprint-rpy-from-quaternion/#17106
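
Roughly, the approach from that answer looks like the sketch below (my own minimal version, not the exact code from the link; the "imu/data" topic name is an assumption, adjust it to your IMU driver):

    #include <ros/ros.h>
    #include <sensor_msgs/Imu.h>
    #include <tf/transform_datatypes.h>

    void imuCallback(const sensor_msgs::Imu::ConstPtr& msg)
    {
      tf::Quaternion q;
      tf::quaternionMsgToTF(msg->orientation, q);   // geometry_msgs quaternion -> tf quaternion
      double roll, pitch, yaw;
      tf::Matrix3x3(q).getRPY(roll, pitch, yaw);    // angles in radians
      ROS_INFO("roll: %.3f  pitch: %.3f  yaw: %.3f", roll, pitch, yaw);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "imu_rpy");
      ros::NodeHandle nh;
      ros::Subscriber sub = nh.subscribe("imu/data", 10, imuCallback);
      ros::spin();
      return 0;
    }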

2013-05-07 06:59:17 -0500 commented question RGBD Slam Stereo Setup

I have been away from RGBDSLAM for a while, so sorry if I lead you in the wrong direction. I think your problem might be in the point cloud topic. Are you using the stereo_image_proc node? I think that using the /disparity topic worked for me.

2013-05-06 16:24:14 -0500 commented question Prosilica Stereo Calibration Epipolar error hardware sync

I am interested in how you sent the time stamps; could you post that code? As for the rest of your question, I am a little unsure of what you are trying to do. Are you using the stereo calibration toolbox with a checkerboard?

2013-05-06 16:20:38 -0500 commented question No Images from Procilica GC655C

I think we are going to need more information than that. Can you post the launch file? Are you able to open an image stream using the prosilica_gige_sdk sample viewer?

2013-04-11 14:48:22 -0500 commented question rosserial for Mbed on ROS Groovy - [solved]

You must replace "hg" with "git" to clone the git repository; "hg" was the command for the old (Mercurial) repository location.

2013-01-10 20:55:38 -0500 received badge  Famous Question (source)
2012-11-24 14:16:03 -0500 received badge  Famous Question (source)
2012-11-02 06:55:25 -0500 commented question Missing resource tf

I believe it is in the full desktop install of ROS. Is that how you installed ROS? For me, tf is located at /opt/ros/fuerte/stacks/geometry/tf; I would check whether that is the case for you. If it is, add /opt/ros/fuerte/stacks to your package path (e.g. export ROS_PACKAGE_PATH=/opt/ros/fuerte/stacks:$ROS_PACKAGE_PATH) and rosdep should be able to find it.

2012-10-29 19:19:58 -0500 received badge  Notable Question (source)
2012-10-15 09:26:50 -0500 received badge  Notable Question (source)
2012-10-15 03:47:48 -0500 received badge  Popular Question (source)
2012-10-14 13:33:21 -0500 received badge  Student (source)
2012-10-14 12:52:04 -0500 asked a question Colored Octomaps

There used to be an experimental octomap stack that allowed the user to generate colored octomaps. Does anyone still know how to access this and install it?

The only information I can find is in this post: http://answers.ros.org/question/12883/problems-building-and-using-octomap-with-rgbdslam/ Apparently the tarball that the experimental stack's makefile used to download is no longer available. However, from Felix's post it looks like it should be possible to get this working again, since the octomap library itself has support for colored octomaps.
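
To be clear about what I mean by library support: something along these lines is what I would expect to work with the octomap API (a rough sketch going by the ColorOcTree class in the headers, not something I have running, and writing the .ot file may depend on the octomap version):

    #include <octomap/ColorOcTree.h>

    int main()
    {
      octomap::ColorOcTree tree(0.05);                          // 5 cm voxels
      octomap::point3d p(1.0f, 0.5f, 0.2f);
      tree.updateNode(p, true);                                 // mark the voxel occupied
      tree.integrateNodeColor(p.x(), p.y(), p.z(), 255, 0, 0);  // attach an RGB color to it
      tree.updateInnerOccupancy();
      tree.write("colored.ot");  // the .ot format keeps the colors (needs a recent enough octomap)
      return 0;
    }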

Has anyone tried and/or succeeded at this?

PS: I am currently running Fuerte and can generate octomaps without the color information. It would be awesome if I could get them in color!

Thanks!

2012-10-03 07:10:07 -0500 received badge  Famous Question (source)
2012-09-24 15:22:49 -0500 received badge  Notable Question (source)
2012-09-24 11:00:20 -0500 received badge  Popular Question (source)
2012-09-20 12:52:02 -0500 asked a question RGBDSLAM with stereo & IMU data

I would like to use RGBDSLAM with stereo cameras and an IMU for odometry. I have been following these questions:

http://answers.ros.org/question/42947/how-to-combine-robot-odometry-with-rgbd-slam-algorithm-for-mapping/
http://answers.ros.org/question/41266/rgbd-slam-and-stereo/
http://answers.ros.org/question/33175/replacing-gmapping-with-rgbdslam/

I have set up my launch file so that the left camera image (from stereo_image_proc) is passed in as "wide_topic" and points2 as "wide_cloud_topic".

Also, since I am not using the openni_driver, I have adjusted the topic configuration accordingly.
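
Roughly, the relevant part of my launch file looks like this (the parameter names are my reading of the rgbdslam wiki, so they may not be exact; the topic names are from my stereo setup):

    <launch>
      <node pkg="rgbdslam" type="rgbdslam" name="rgbdslam" output="screen">
        <!-- stereo input from stereo_image_proc -->
        <param name="config/wide_topic"        value="/stereo/left/image_rect"/>
        <param name="config/wide_cloud_topic"  value="/stereo/points2"/>
        <!-- not running the openni driver, so clear the Kinect defaults -->
        <param name="config/topic_image_mono"  value=""/>
        <param name="config/topic_image_depth" value=""/>
        <param name="config/topic_points"      value=""/>
      </node>
    </launch>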

I have played around with these parameters. It seems that no matter what I do, I receive warnings from rgbdslam about not being able to look up transforms, e.g.:

Lookup would require extrapolation into the past. Requested time 1347501971.980839144 but the earliest data is at time 1348181126.395687059, when looking up transform from frame [/camera_optical_frame] to frame [/base_link]

This is always followed by the warning:

Using Standard kinect /openni_camera -> /openni_rgb_optical_frame as transformation.

The map that is generated is obviously poor and I believe it is because of this transformation.

I am new to the concept of tf. I have gone through the tutorials but still feel a little bit shaky on how it works. I have built a robot model that has the following tf tree:

Link: base_link has 1 child(ren)
    child(1): chassis
        child(1): left_camera
            child(1): left_camera_lens
                child(1): camera_optical_frame
        child(2): right_camera
            child(1): right_camera_lens

I have not begun to integrate robot_pose_ekf because I do not feel that I have this part of the process down.

Can someone explain to me how the tf tree should look? How is the visual odometry broadcast by rgbdslam? Can this information be integrated into robot_pose_ekf? And can IMU odometry be taken into account by rgbdslam?
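
To make the question concrete, here is my (possibly wrong) understanding of how the base_link -> camera_optical_frame transform could be broadcast by hand; the frame names are from my model and the numeric offsets are placeholders:

    #include <cmath>
    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>
    #include <tf/transform_datatypes.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "camera_tf_broadcaster");
      ros::NodeHandle nh;
      tf::TransformBroadcaster br;
      ros::Rate rate(50);  // publish well above the camera frame rate
      while (nh.ok())
      {
        tf::Transform t;
        t.setOrigin(tf::Vector3(0.1, 0.0, 0.2));  // placeholder offset of the camera on the robot
        t.setRotation(tf::createQuaternionFromRPY(-M_PI/2, 0.0, -M_PI/2));  // usual optical-frame convention
        br.sendTransform(tf::StampedTransform(t, ros::Time::now(),
                                              "base_link", "camera_optical_frame"));
        rate.sleep();
      }
      return 0;
    }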

Any pointers, links to other pages or code would be helpful.

Thanks for your time

2012-09-16 21:46:40 -0500 received badge  Popular Question (source)
2012-09-10 15:24:08 -0500 received badge  Supporter (source)
2012-08-15 08:39:05 -0500 asked a question Prosilica Hardware Synchronization

Goal: I am trying to create a stereo setup with two Prosilica GC750C cameras. I would like to collect the stereo data and store it in a bag file to post-process later. I am limited by cabling quality and processing power: I am running an Intel Atom processor and have underwater cabling.

Problem: My problem is that I cannot get synchronized image capture. I have my cameras hardwired so that syncin2 of one camera is connected to syncout2 of the other camera. I have tried various configurations in ROS, and the most promising seems to be to run one camera in "streaming" mode and the other in "syncin2" mode, receiving triggers from the streaming camera.

Using the Sample Viewer, I currently have "syncout" set to strobe1, which is configured to reflect when the camera is exposing (I have read that the FrameTrigger pulse is too short to trigger the camera, and I have had no luck getting FrameTrigger to work with the strobe feature). The strobe is used to limit the pulse length of the exposing trigger. While I have played around with the SyncOut settings, I am not sure that "exposing" is what I am looking for.

Under this setup I frequently get a scenario where the "streaming" camera logs frames twice as fast as the synced camera. I believe this is because the "synced" camera is still capturing when the "streaming" camera sends its next trigger. Significantly reducing the max exposure time of the synced camera eliminates the problem, but that feels like a poor hack. Also, using rxbag I can see that frames coming from the "synced" camera are logged a little bit later. I am not sure whether this is a result of triggering the camera with "exposing" rather than "FrameTrigger", or of how the frames are time stamped, as described in ticket 4636.

I feel that the best thing for me to do would be to run the streaming camera at a fixed rate. That way I could ensure an adequate amount of time between triggers for the "syncin2"-triggered camera, so that it can finish exposing before the next trigger is received. I understand that the ROS drivers do not have this functionality yet, but it is something the Prosilica API supports. Has anyone played with this?
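
For reference, this is what I believe the raw PvAPI calls would look like to run the master camera at a fixed rate (attribute names taken from the AVT GigE attributes reference; I have not actually tried this, and error handling is omitted):

    #include <PvApi.h>

    int main()
    {
      tPvHandle handle;
      PvInitialize();
      PvCameraOpen(/* camera UniqueId goes here */ 0, ePvAccessMaster, &handle);

      // drive the master camera from its internal timer instead of free-running
      PvAttrEnumSet(handle, "FrameStartTriggerMode", "FixedRate");
      PvAttrFloat32Set(handle, "FrameRate", 10.0f);  // slow enough for the slave to finish exposing

      PvCameraClose(handle);
      PvUnInitialize();
      return 0;
    }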

Has anyone achieved hardware synchronization with Prosilica cameras? Is your setup (hardware/software) similar to what I described, or am I on the wrong track? Any suggestions on what I should try?

Thanks!