Ask Your Question

pulver's profile - activity

2019-06-03 09:26:18 -0500 answered a question How to make depth stream video

Hi, I know it was a long time ago, but did anyone find a solution? I am struggling to save the video stream from t

2019-02-08 10:35:11 -0500 received badge  Necromancer
2019-02-08 10:33:31 -0500 answered a question ROS-Gazebo Configuration for Deterministic Simulation

Hi, I know I am two years late, but I am still interested in the topic since I am using ROS/Gazebo for some reinforceme

2017-10-02 03:54:00 -0500 received badge  Famous Question (source)
2016-10-30 17:30:06 -0500 received badge  Necromancer (source)
2016-10-30 17:30:06 -0500 received badge  Teacher (source)
2016-07-10 13:09:07 -0500 received badge  Notable Question (source)
2016-05-26 12:56:03 -0500 commented answer Is there any way to get QT Creator to show all of a project’s subdirectories?

It works! Thank you!

2016-04-24 12:00:27 -0500 received badge  Popular Question (source)
2016-04-14 09:56:54 -0500 answered a question how do i change topic name for ar_pose?

Hi, this is my launch file (working). Remember to calibrate the drone's camera:

<launch>
    <!-- IPv4 address of your drone -->
    <arg name="ip" default="192.168.1.1"/>

    <!-- Ultrasound frequency (7 or 8). -->
    <arg name="freq" default="8"/>
    <node name="ardrone_driver" pkg="ardrone_autonomy" type="ardrone_driver" output="screen" clear_params="true" args="-ip $(arg ip)">
        <param name="outdoor" value="0"/>
        <param name="max_bitrate" value="1000"/>
        <param name="bitrate" value="1000"/>
        <param name="navdata_demo" value="0"/>
        <param name="flight_without_shell" value="0"/>
        <param name="altitude_max" value="4000"/>
        <param name="altitude_min" value="50"/>
        <param name="euler_angle_max" value="0.21"/>
        <param name="control_vz_max" value="700"/>
        <param name="control_yaw" value="1.75"/>
        <param name="detect_type" value="10"/>
        <param name="enemy_colors" value="3"/>
        <param name="detections_select_h" value="32"/>
        <param name="detections_select_v_hsync" value="128"/>
        <param name="enemy_without_shell" value="0"/>
        <param name="ultrasound_freq" value="$(arg freq)"/>
        <param name="realtime_navdata" value="true"/>
        <param name="realtime_video" value="true"/>
        <!-- Covariance Values (3x3 matrices reshaped to 1x9)-->
        <rosparam param="cov/imu_la">[0.1, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.1]</rosparam>
        <rosparam param="cov/imu_av">[1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]</rosparam>
        <rosparam param="cov/imu_or">[1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 100000.0]</rosparam>

        <!-- Remapping for perceiving markers -->
        <remap from="ardrone/image_raw" to="/camera/image_raw" />
        <remap from="ardrone/camera_info" to="/camera/camera_info" />

    </node>

    <!-- Launch the static transform publisher for world to camera -->
    <!--<node pkg="tf" type="static_transform_publisher" name="cam_to_drone" args="1 0 0 0 0 0 world ardrone_base_frontcam 10"/>-->
    <!--<node pkg="tf" type="static_transform_publisher" name="world_to_cam" args="1 0 0.5 -1.57 0 -1.57 camera ardrone_base_frontcam 10"/>-->



    <!-- Launch the image rectification node -->
    <!--<node ns="camera" pkg="image_proc" type="image_proc" name="image_proc"/>-->

    <!-- Start the GSCAM node -->
    <!-- device=/dev/video0 for webcam, /dev/video1 for USB camera -->
    <!--<env name="GSCAM_CONFIG" value="v4l2src device=/dev/video0 ! video/x-raw-yuv,framerate=30/1,width=640,height=480 ! ffmpegcolorspace "/>
    <node pkg="gscam" type="gscam" name="gscam" output="screen">
        <param name="width" type="int" value="640"/>
        <param name="height" type="int" value="480"/>
        <param name="fps" type="int" value="30"/>
        <param name="frame_id" type="string" value="ardrone_base_frontcam"/>
        <param name="camera_info_url" type="string" value="file://$(find gscam)/camera_calibration.yaml"/>
    </node>-->

    <node name="ar_pose" pkg="ar_pose" type="ar_single" respawn="false" output="screen">
        <param name="marker_pattern" type="string" value="$(find ar_pose)/data/4x4/4x4_1.patt"/>
        <param name="marker_width" type="double" value="152.4"/>
        <param name="marker_center_x" type="double" value="0.0"/>
        <param name="marker_center_y" type="double" value="0.0"/>
        <param name="threshold" type="int" value="100"/>
        <param name="use_history" type="bool" value="true"/>
        <!--<param name="camera_image_topic" type="string" value="/ardrone/front/image_raw"/>
        <param name="camera_info_topic" type="string" value="/ardrone/front/camera_info"/>-->
    </node>

    <!-- Launch RVIZ as visualizer -->
    <!--<node pkg="rviz" type="rviz" name="rviz" args="-d $(find ar_pose)/launch/live_single.rviz"/>-->

</launch>
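
To directly answer the question about changing topic names: the part of the file above that does it is the pair of `<remap>` tags inside the driver node, which redirect the driver's output to the topic names ar_pose listens on. Some ar_pose versions also expose `camera_image_topic`/`camera_info_topic` parameters (commented out above) as an alternative:

```xml
<!-- Redirect the AR.Drone driver's topics to the names ar_pose subscribes to.
     Adjust the "to" side if your ar_pose build expects different topic names. -->
<remap from="ardrone/image_raw" to="/camera/image_raw" />
<remap from="ardrone/camera_info" to="/camera/camera_info" />
```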
2016-04-14 09:40:37 -0500 answered a question ar_pose : interprete output of 'ar_pose_marker' && transform

Maybe the axes are not aligned; I'm just guessing. I'm working on the same problem now and I noticed the same as you. If you found an answer to this problem (since you worked on it two years ago), please let me know.

2016-04-14 09:34:51 -0500 answered a question How to use gscam with ar_pose?

Looking at your error, it seems that ar_single cannot be located, so it's not a problem related to gscam. Did you source the setup.bash file after compiling ar_pose?

Anyway, here is my launch file:

<launch>

  <!-- Launch RVIZ as visualizer -->
  <node pkg="rviz" type="rviz" name="rviz"
    args="-d $(find ar_pose)/launch/live_single.rviz"/>

  <!-- Launch the static transform publisher for world to camera -->
  <node pkg="tf" type="static_transform_publisher" name="world_to_cam"  args="0 0 0.5 -1.57 0 -1.57 world camera 10" />

  <!-- Launch the image rectification node -->
  <!--<node ns="camera" pkg="image_proc" type="image_proc" name="image_proc"/>-->

  <!-- Start the GSCAM node -->
  <!-- device=/dev/video0 for webcam, /dev/video1 for USB camera -->
  <env name="GSCAM_CONFIG" value="v4l2src device=/dev/video0 ! video/x-raw-yuv,framerate=30/1,width=640,height=480 ! ffmpegcolorspace " />
  <node pkg="gscam" type="gscam" name="gscam" output="screen">
    <param name="width" type="int" value="640" />
    <param name="height" type="int" value="480" />
    <param name="fps" type="int" value="30" />
    <param name="frame_id" type="string" value="camera" />
    <param name="camera_info_url" type="string"
      value="file://$(find gscam)/camera_calibration.yaml" />
  </node>

  <node name="ar_pose" pkg="ar_pose" type="ar_single" respawn="false"
    output="screen">
    <param name="marker_pattern" type="string"
      value="$(find ar_pose)/data/4x4/4x4_1.patt"/>
    <param name="marker_width" type="double" value="152.4"/>
    <param name="marker_center_x" type="double" value="0.0"/>
    <param name="marker_center_y" type="double" value="0.0"/>
    <param name="threshold" type="int" value="100"/>
    <param name="use_history" type="bool" value="true"/>


  </node>
</launch>
2016-04-14 09:23:01 -0500 received badge  Famous Question (source)
2016-04-06 08:05:42 -0500 answered a question Ar_pose doesn't recognize tag in tum_simulator. Please help.

Hi green96, I'm trying to do the same as you. Can you tell me how you included a marker in Gazebo? Thanks!

2016-03-15 12:09:46 -0500 asked a question ar_pose + ardrone_autonomy

Hi all,

I'm trying to perform marker detection using the ar_pose package. Everything works fine using the package alone, either with the webcam or a USB camera, getting the video stream with gscam.

I tried to write a launch file that launches ar_pose and ardrone_autonomy simultaneously, in order to use the quadcopter's front camera for the acquisition. Here is the launch file:

<launch>
<arg name="ip" default="192.168.1.1"/>

<!-- Ultrasound frequency (7 or 8). -->
<arg name="freq" default="8"/>
<node name="ardrone_driver" pkg="ardrone_autonomy" type="ardrone_driver" output="screen" clear_params="true" args="-ip $(arg ip)">
    <param name="outdoor" value="0"/>
    <param name="max_bitrate" value="4000"/>
    <param name="bitrate" value="4000"/>
    <param name="navdata_demo" value="0"/>
    <param name="flight_without_shell" value="0"/>
    <param name="altitude_max" value="4000"/>
    <param name="altitude_min" value="50"/>
    <param name="euler_angle_max" value="0.21"/>
    <param name="control_vz_max" value="700"/>
    <param name="control_yaw" value="1.75"/>
    <param name="detect_type" value="10"/>
    <param name="enemy_colors" value="3"/>
    <param name="detections_select_h" value="32"/>
    <param name="detections_select_v_hsync" value="128"/>
    <param name="enemy_without_shell" value="0"/>
    <param name="ultrasound_freq" value="$(arg freq)"/>
    <param name="realtime_navdata" value="true"/>
    <param name="realtime_video" value="true"/>
    <!-- Covariance Values (3x3 matrices reshaped to 1x9)-->
    <rosparam param="cov/imu_la">[0.1, 0.0, 0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.1]</rosparam>
    <rosparam param="cov/imu_av">[1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]</rosparam>
    <rosparam param="cov/imu_or">[1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 100000.0]</rosparam>
</node>

<!-- Launch the static transform publisher for world to camera -->
<!--<node pkg="tf" type="static_transform_publisher" name="world_to_cam" args="0 0 0.5 -1.57 0 -1.57 world camera 10"/>
<node pkg="tf" type="static_transform_publisher" name="base_to_frontcam" args="0.21 0 0 -1.57 0 -1.57 ardrone_base_link ardrone_base_frontcam 10"/>
<node pkg="tf" type="static_transform_publisher" name="base_to_bottomcam" args="0 -0.02 0.0 3.14 0 1.57 ardrone_base_link ardrone_base_bottomcam 10"/>-->
<!--<node pkg="tf" type="static_transform_publisher" name="cam_to_frontcam" args="0.5 0 0 0 0 0 camera ardrone_base_frontcam 10"/>-->
<!--<node pkg="tf" type="static_transform_publisher" name="cam_to_dronecam" args="0.2 0 0 0 0 0 camera ardrone_base_frontcam 10"/>-->

<node pkg="tf" type="static_transform_publisher" name="cam_to_drone" args="1 0 0 0 0 0 world ardrone_base_frontcam 10"/>

<!-- Launch the image rectification node -->
<node ns="camera" pkg="image_proc" type="image_proc" name="image_proc"/>

<!-- Start the GSCAM node -->
<!--<env name="GSCAM_CONFIG" value="v4l2src device=/dev/video0 ! video/x-raw-yuv,framerate=30/1,width=640,height=480 ! ffmpegcolorspace "/>
<node pkg="gscam" type="gscam" name="gscam" output="screen">
    <param name="width" type="int" value="640"/>
    <param name="height" type="int" value="480"/>
    <param name="fps" type="int" value="30"/>
    <param name="frame_id" type="string" value="ardrone_base_frontcam"/>
    <param name="camera_info_url" type="string" value="file://$(find gscam)/camera_calibration.yaml"/>
</node>-->

<node name="ar_pose" pkg="ar_pose" type="ar_single" respawn="false" output="screen">
    <param name="marker_pattern" type="string" value="$(find ar_pose)/data/4x4/4x4_1 ...
2016-03-03 09:52:26 -0500 received badge  Editor (source)
2016-03-03 06:04:20 -0500 received badge  Notable Question (source)
2016-03-02 10:52:59 -0500 asked a question Error using gscam with ar_pose

Hi everyone,

I am trying to use ar_pose for marker recognition, using gscam for acquiring the video stream. This is my launch file:

<launch>

  <!-- Launch RVIZ as visualizer -->
  <node pkg="rviz" type="rviz" name="rviz"
    args="-d $(find ar_pose)/launch/live_single.rviz"/>

  <!-- Launch the static transform publisher for world to camera -->
  <node pkg="tf" type="static_transform_publisher" name="world_to_cam"
    args="0 0 0.5 -1.57 0 -1.57 world camera 10" />

  <!-- Launch the image rectification node -->
  <node ns="camera" pkg="image_proc" type="image_proc" name="image_proc"/>

  <!-- Start the GSCAM node -->
  <env name="GSCAM_CONFIG" value="v4l2src device=/dev/video0 ! video/x-raw-yuv,framerate=30/1,width=640,height=480 ! ffmpegcolorspace " />
  <node pkg="gscam" type="gscam" name="gscam" output="screen">
    <param name="width" type="int" value="640" />
    <param name="height" type="int" value="480" />
    <param name="fps" type="int" value="30" />
    <param name="frame" type="string" value="camera" />
    <param name="device" type="string" value="/dev/video0" />
    <param name="camera_info_url" type="string"
      value="file://$(find gscam)/camera_calibration.yaml" />
    <!--<remap from="camera/image_raw" to="image_raw" />-->
  </node>
  <node name="ar_pose" pkg="ar_pose" type="ar_single" respawn="false"
    output="screen">
    <param name="marker_pattern" type="string"
      value="$(find ar_pose)/data/4x4/4x4_1.patt"/>
    <param name="marker_width" type="double" value="152.4"/>
    <param name="marker_center_x" type="double" value="0.0"/>
    <param name="marker_center_y" type="double" value="0.0"/>
    <param name="threshold" type="int" value="100"/>
    <param name="use_history" type="bool" value="true"/>
  </node>
</launch>

My problem is that I don't get any video stream from the camera, even though it is turned on (the light is on). Here is a screenshot of rviz, the tf tree and the node graph: Screenshot rviz/tftree/nodegraph

The only warning I get in the console is the following (raised after some modification that I forgot):

[ WARN] [1456937185.174151008]: No camera frame_id set, using frame "/camera_frame".

Instead of making a remap from gscam to uvc_camera (the package used in the original launch file), I modified the launch file. If I run gscam alone it works and I get the video, but when I try to use this launch file it doesn't.

EDIT: By changing the param "frame" to "frame_id" I solved all the problems; now I can see the video stream correctly and the markers are identified.
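
For reference, this is what the fixed gscam node looks like (gscam reads the frame id from the `frame_id` parameter, so the `frame` parameter above was ignored and the default "/camera_frame" was used — hence the warning). The other parameters are unchanged from the launch file above:

```xml
<!-- gscam node with the parameter renamed from "frame" to "frame_id" -->
<node pkg="gscam" type="gscam" name="gscam" output="screen">
  <param name="width" type="int" value="640" />
  <param name="height" type="int" value="480" />
  <param name="fps" type="int" value="30" />
  <param name="frame_id" type="string" value="camera" />
  <param name="device" type="string" value="/dev/video0" />
  <param name="camera_info_url" type="string"
    value="file://$(find gscam)/camera_calibration.yaml" />
</node>
```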

2016-03-02 10:20:43 -0500 answered a question Installing artoolkit with Ros

Hi, I tried to compile from home instead of from the university and it worked. Apparently svn was blocked! Cheers!

2016-02-25 08:56:12 -0500 received badge  Popular Question (source)
2016-02-23 04:38:35 -0500 asked a question Installing artoolkit with Ros

Hi everyone,

for my project I need to use Artoolkit with Ros. I downloaded the package from https://github.com/ar-tools/ar_tools but when I tried to compile I got the following error:

Built target opencv_apps
svn: E000110: Unable to connect to a repository at URL 'svn://svn.code.sf.net/p/artoolkit/code/trunk/artoolkit'
svn: E000110: Can't connect to host 'svn.code.sf.net': Connection timed out
make[2]: *** [ar_tools/artoolkit/ARToolkit-prefix/src/ARToolkit-stamp/ARToolkit-download] Interrupt
make[1]: *** [ar_tools/artoolkit/CMakeFiles/ARToolkit.dir/all] Interrupt
make: *** [all] Interrupt

Thinking that my university network was blocking svn, I tried to use a VPN, but without success:

svn: E670002: Unable to connect to a repository at URL 'svn://svn.code.sf.net/p/artoolkit/code/trunk/artoolkit'
svn: E670002: Unknown hostname 'svn.code.sf.net'
make[2]: *** [ar_tools/artoolkit/ARToolkit-prefix/src/ARToolkit-stamp/ARToolkit-download] Error 1
make[1]: *** [ar_tools/artoolkit/CMakeFiles/ARToolkit.dir/all] Error 2
make: *** [all] Error 2
Invoking "make -j4 -l4" failed

Without compiling the package I can't run some important nodes (e.g. ar_single). Did any of you have the same problem and manage to solve it?

Edit: This is the result of the curl command

curl: (6) Could not resolve host: svn.code.sf.net

2015-06-08 12:14:26 -0500 commented answer How to solve "Couldn't find an AF_INET address for " problem

Thank you! In my case, having two IP addresses on the robot and two on the workstation, this was the only working solution!

2015-05-29 03:36:31 -0500 received badge  Enthusiast
2015-05-18 02:37:13 -0500 received badge  Supporter (source)