
agonzamart's profile - activity

2014-10-07 14:12:28 -0500 received badge  Taxonomist
2014-01-28 17:29:56 -0500 marked best answer Problem with ros networking

My ROS Fuerte setup consists of a desktop PC and a remote laptop mounted on a Pioneer base. The laptop runs the roscore and some nodes that interact with the Pioneer; the rest of the nodes I run on the desktop PC. I am having a problem with the network when, from the desktop computer, I launch a launch file that uses the machine tag to start nodes on the Pioneer laptop:

<machine name="higgs" address="higgs" env-loader="/opt/ros/fuerte/env.sh"/>

as explained here: ros.org/wiki/roslaunch/XML/machine

I get an error:

remote[higgs-0]: failed to launch on higgs: Unable to establish ssh connection to [higgs:22]: No authentication methods available

However, I am able to connect between the machines with ssh in both directions. I set the ROS variables ROS_HOSTNAME (the name of each machine) and ROS_MASTER_URI (higgs's address). I can rostopic echo from the different machines, and in rviz on the desktop computer I can see the data from the sensors on the Pioneer base. But I have to launch the nodes on the laptop directly.
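For reference, my environment setup looks like this (a minimal sketch; it assumes higgs resolves through /etc/hosts and the roscore uses the default port 11311):

```shell
# On the desktop PC (in every terminal, or in ~/.bashrc)
export ROS_HOSTNAME=arturo-desktop        # this machine's resolvable name
export ROS_MASTER_URI=http://higgs:11311  # the roscore runs on higgs

# On the Pioneer laptop (higgs)
export ROS_HOSTNAME=higgs
export ROS_MASTER_URI=http://higgs:11311
```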

I would appreciate any help.

2013-07-22 10:10:44 -0500 received badge  Famous Question (source)
2013-07-13 13:36:12 -0500 commented question tf transformation for simulated kinect Gazebo (How to publish Gazebo Time?)

Gazebo is not publishing on the /clock topic. I have /use_sim_time set to "true". Is anything else needed for Gazebo's clock to be published? That is the simplest setup I could think of. I wanted to upload frames.pdf, but I do not have enough karma to do so; that's why I had to post it that way.
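For the record, this is how I set the parameter (a sketch of my launch file; the parameter has to be set before any node that reads /clock starts):

```xml
<launch>
  <!-- Make all nodes take time from the /clock topic -->
  <param name="/use_sim_time" value="true" />
</launch>
```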

2013-07-13 10:26:49 -0500 received badge  Student (source)
2013-07-13 10:26:12 -0500 received badge  Notable Question (source)
2013-07-13 07:07:32 -0500 commented answer openni_launch - ROS Fuerte

http://www.ros.org/wiki/rviz/DisplayTypes You can see the available displays there, to find which topic to read; the one you need is the Image display. Then click the new display in the panel on the left, expand it, and choose the topic to read. Also check the tutorials at http://www.ros.org/wiki/openni_launch.

2013-07-13 06:57:01 -0500 answered a question openni_launch - ROS Fuerte

Right now, if you are working on Fuerte, you only have to roslaunch openni_launch openni.launch; this launches all the nodes related to the Kinect. If everything is correct you should see the topics with rostopic list:

/camera/rgb/camera_info /camera/rgb/image_raw ... and a lot more.

You can be sure you are receiving data by typing rostopic echo /camera/rgb/image_raw in a terminal; a stream of numbers will fill your screen within a few seconds.

Open rviz and add a display for the topic you want to view. If everything is correct you should see the data.
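The steps above as terminal commands (a sketch, assuming a Fuerte install with openni_launch present and a Kinect plugged in):

```shell
# Launch all the Kinect-related nodes
roslaunch openni_launch openni.launch

# In a second terminal: list the camera topics
rostopic list | grep ^/camera

# Confirm that data is arriving
rostopic echo /camera/rgb/image_raw
```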

2013-07-11 11:18:45 -0500 commented answer Openni+Kinect+Other Machines

As far as I know, you have to run openni on the machine the Kinect is plugged into. The driver must run on the machine directly connected to the Kinect, so you have to install openni there.

2013-07-10 13:56:38 -0500 received badge  Necromancer (source)
2013-07-10 13:56:38 -0500 received badge  Teacher (source)
2013-07-10 05:23:17 -0500 answered a question How to use libgazebo_ros_camera.so in gazebo (urdf)

I am using libDepthCameraPlugin.so and libgazebo_ros_openni_kinect.so and I am able to publish images from this camera. My .sdf looks like this:

<sensor type="depth" name="camera1">
  <always_on>1</always_on>
  <visualize>true</visualize>
  <update_rate>30.0</update_rate>
  <topic>camera</topic>
  <camera name="depth_cam">
    <horizontal_fov>1.3962634</horizontal_fov>
    <image>
      <width>800</width>
      <height>800</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.02</near>
      <far>300</far>
    </clip>
    <noise>
      <type>gaussian</type>
      <!-- Noise is sampled independently per pixel on each frame.
           That pixel's noise value is added to each of its color
           channels, which at that point lie in the range [0,1]. -->
      <mean>0.0</mean>
      <stddev>0.007</stddev>
    </noise>
    <save enabled="true">  <!-- I don't know what this is -->
      <path>Gazebo_camera</path>
    </save>
  </camera>
  <plugin name="camera_controller" filename="libDepthCameraPlugin.so">
    <CxPrime>0</CxPrime>
    <updateRate>10.0</updateRate>
    <cameraName>depth_cam1</cameraName>
    <frameName>/base_link</frameName>
    <hackBaseline>0.07</hackBaseline>
    <distortionK1>1.0</distortionK1>
    <distortionK2>1.0</distortionK2>
    <distortionK3>1.0</distortionK3>
    <distortionT1>1.0</distortionT1>
    <distortionT2>1.0</distortionT2>
  </plugin>
  <plugin name="kinect" filename="libgazebo_ros_openni_kinect.so">
    <CxPrime>0</CxPrime>
    <updateRate>10.0</updateRate>
    <imageTopicName>image_raw</imageTopicName>
    <pointCloudTopicName>points</pointCloudTopicName>
    <depthImageTopicName>image_depth</depthImageTopicName>
    <depthImageCameraInfoTopicName>depth_camera_info</depthImageCameraInfoTopicName>
    <pointCloudCutoff>0.001</pointCloudCutoff>
    <cameraName>kinect</cameraName>
    <frameName>/base_link</frameName>
    <distortionK1>0.00000001</distortionK1>
    <distortionK2>0.00000001</distortionK2>
    <distortionK3>0.00000001</distortionK3>
    <distortionT1>0.00000001</distortionT1>
    <distortionT2>0.00000001</distortionT2>
  </plugin>
</sensor>
</link>

Hope this helps! But I have a problem with the frames' time stamps and I can't do much with this camera image right now; I am looking for a solution: http://answers.ros.org/question/66800/tf-transformation-for-simulated-kinect-gazebo/

2013-07-10 02:10:59 -0500 commented question error running hector_quadrotor tutorial

@Johannes Meyer Does what you commented mean that getting those errors about the missing element description is normal? Does that mean it is not possible to tell Gazebo which topic to publish the information on? You can't tell Gazebo to change the frame name for a camera, for example? Thanks!

2013-07-10 01:39:41 -0500 received badge  Popular Question (source)
2013-07-10 01:19:47 -0500 commented answer tf transformation for simulated kinect Gazebo (How to publish Gazebo Time?)

I figured out how view_frames works in Groovy; it is not updated in the wiki and I was used to calling it with rosrun as in the older versions. But thank you for your answer. I will update the question with the output of view_frames, but it seems that everything is connected, so I am still searching for what is wrong.

2013-07-08 08:37:58 -0500 commented answer tf transformation for simulated kinect Gazebo (How to publish Gazebo Time?)

I updated the question with the tf_monitor output.

2013-07-08 08:33:40 -0500 commented answer tf transformation for simulated kinect Gazebo (How to publish Gazebo Time?)

I tried to use tf view_frames, because I am using Groovy and view_frames seems not to be implemented there. I have had this same problem before when working with ROS; the error seems to be that my messages for the tf transform depth_camera are removed because they are too old.

2013-07-08 08:33:40 -0500 received badge  Commentator
2013-07-08 04:20:56 -0500 asked a question tf transformation for simulated kinect Gazebo (How to publish Gazebo Time?)

I have a Gazebo-ROS simulation where I use a range laser to get scans; it works properly and I am able to run autonomous navigation in a simulated environment. Now I want to use a Kinect as the sensor, obtaining scans with the depthimage_to_laserscan node as a substitute for the range laser. After simulating this Kinect, including all the different config files, nodes, and so on, I am able to read the scans after the cloud throttle, so I think most of it is configured correctly. However, I am having problems with the tf transformations for this simulated Kinect: when I try to read the topic in rviz, I cannot get the transform between /map and the Kinect frame I publish. I think it is only a matter of not publishing the right tf transform for the Kinect, or maybe I need another one I am not aware of. Right now, for this Kinect I only publish a static transform from the output frame id (depth_cam) to base_link.

After remapping the output frame topic to /camera_depth

<node pkg="tf" type="static_transform_publisher" name="depth_cam_tf" args="0 -0.02 0 0 0 0 /base_link /camera_depth 100" />

Output from rosrun tf tf_monitor:

RESULTS: for all Frames

Frames:

Frame: /Pioneer3AT/Sonar_1 published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499152 Max Delay: 0

Frame: /Pioneer3AT/base_link published by /Pioneer3AT_Gazebo Average Delay: -0.0493165 Max Delay: 0

Frame: /Pioneer3AT/camera_depth published by /depth_cam_tf Average Delay: -0.0996457 Max Delay: 0

Frame: /Pioneer3AT/front_left_axle published by /Pioneer3AT_tf_broadcaster Average Delay: -0.0492777 Max Delay: 0

Frame: /Pioneer3AT/front_left_hub published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499125 Max Delay: 0

Frame: /Pioneer3AT/front_left_wheel published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499122 Max Delay: 0

Frame: /Pioneer3AT/front_right_axle published by /Pioneer3AT_tf_broadcaster Average Delay: -0.0492735 Max Delay: 0

Frame: /Pioneer3AT/front_right_hub published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499119 Max Delay: 0

Frame: /Pioneer3AT/front_right_wheel published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499117 Max Delay: 0

Frame: /Pioneer3AT/front_sonar published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499114 Max Delay: 0

Frame: /Pioneer3AT/laserscan published by /Pioneer3AT_laserscan_tf Average Delay: -0.0326276 Max Delay: 0

Frame: /Pioneer3AT/odom published by /Pioneer3AT/amcl Average Delay: -0.494709 Max Delay: 0

Frame: /Pioneer3AT/rear_left_axle published by /Pioneer3AT_tf_broadcaster Average Delay: -0.0492701 Max Delay: 0

Frame: /Pioneer3AT/rear_left_hub published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499111 Max Delay: 0

Frame: /Pioneer3AT/rear_left_wheel published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499108 Max Delay: 0

Frame: /Pioneer3AT/rear_right_axle published by /Pioneer3AT_tf_broadcaster Average Delay: -0.0492668 Max Delay: 0

Frame: /Pioneer3AT/rear_right_hub published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499105 Max Delay: 0

Frame: /Pioneer3AT/rear_right_wheel published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499102 Max Delay: 0

Frame: /Pioneer3AT/rear_sonar published by /Pioneer3AT_tf_broadcaster Average Delay: -0.4991 Max Delay: 0

Frame: /Pioneer3AT/top_plate published by /Pioneer3AT_tf_broadcaster Average Delay: -0.499128 Max Delay: 0

All Broadcasters:

Node: /Pioneer3AT/amcl 18.6052 Hz, Average Delay: -0.494709 Max Delay: 0

Node: /Pioneer3AT_Gazebo 48.0592 Hz, Average Delay: -0.0493165 Max Delay: 0

Node: /Pioneer3AT_laserscan_tf 30.4796 Hz, Average Delay: -0.0326276 Max ... (more)

2013-07-05 02:32:19 -0500 commented answer What do you think about this guide in spanish? (In progress)

Ok @gustavo.velascoh, I am just now beginning to write that part of my thesis, so let's keep in touch; I surely can help you with that. If we can help develop a Spanish wiki, it would be great! :)

2013-07-04 07:43:11 -0500 commented answer What do you think about this guide in spanish? (In progress)

I am doing my final master's project using ROS. I am Spanish and I am writing the document in Spanish; much of it explains the use of ROS, the navigation and localization libraries, simulation using Gazebo, and so on, so I think I can help you with this.

2013-06-22 07:48:12 -0500 commented question Gazebo sdf Kinect

Were you able to solve this problem? I am having roughly the same issues when trying to simulate the Kinect. It would be great if you could help me, because right now I have roughly the same .sdf you had when you asked the question.

Thank you!

2013-04-25 05:46:21 -0500 commented answer AMCL and Kinect problem

I think I have been able to solve the problem. I launch several different configurations for my robot to navigate autonomously; perhaps some parameters in the parameter server got mixed up from other tests I perform, which would explain why the error appeared only sometimes. Now I basically make sure I launch a fresh roscore in a window on my robot's PC every time I want to run a new check. Secondly, I had a problem with the declaration of my tf's: in the Kinect test I was missing a transformation between /base_footprint and /base_link: <node pkg="tf" type="static_transform_publisher" name="higgs_kinect3" args="0.0 0.0 0.0 0 0 0 1 /base_footprint /base_link 100" /> Now I have it working perfectly.

2013-04-23 13:48:20 -0500 commented answer AMCL and Kinect problem

Thank you for the answer, but that can't be the problem, because I have the variables sourced in the .bashrc of both computers.

2013-04-23 07:31:04 -0500 answered a question AMCL and Kinect problem

I am experiencing roughly the same problem. Sometimes the /odom transform published by amcl does not exist. And I say sometimes because that is what frustrates me most: sometimes it works properly and I get the transformation, and other times I don't. This dynamic transformation should be published automatically by amcl, so I do not know what could be happening or how to fix it. I also remap the topic from scan to kinect_narrow_scan, and I do not receive any message in rviz even though I see the data when I read it with rostopic echo /kinect_narrow_scan, exactly the same way as you.

When it is working properly and I run rosrun tf tf_monitor, I see the following:

(other frames) Frame: /odom published by /amcl Average Delay: 0.0081728 Max Delay: 0.0597752

But sometimes this transformation is not published, and rviz shows the message that no tf transform exists between /map and /odom. I have been at this for about a month and I am not able to find any solution. How does amcl set the transformation between /map and /odom? Any ideas?

Thank you all!

2013-04-18 12:41:06 -0500 commented answer simulate pioneer3at

Ok, no problem. I think I saw it somewhere too, but I am not able to find it again; I think it was among the old Gazebo models. If you are able to find it, please send me the link. Thanks! :)

2013-04-18 11:01:53 -0500 commented answer simulate pioneer3at

Derek, do you know where I can find the model of an LMS200 laser instead of the Hokuyo one? I cannot find the model in the Gazebo library in order to change it. I really appreciate your help.

2013-04-11 03:07:39 -0500 commented question Problem with ros networking

Thank you for your help, tbh.

I have set up the ssh keys correctly; I no longer have to enter the password. But I am still getting the same error:

remote[higgs-0]: failed to launch on higgs: Unable to establish ssh connection to [higgs:22]: No authentication methods available

Maybe it is because I am using different users on each machine?

I have checked .ssh/authorized_keys on higgs (the robot laptop, where I want my remote launch to execute):

ssh-rsa blablabla agonzamart@arturo-desktop (my user and computer)
ssh-rsa blablabla higgs@higgs (the user I use on the robot laptop)
ssh-rsa blablabla root@arturo-desktop (root user from my computer)

I would appreciate any help.
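One thing I still want to try, in case the different user names are the problem: the roslaunch machine tag accepts a user attribute (see http://ros.org/wiki/roslaunch/XML/machine), so the remote account can be stated explicitly. A sketch, assuming the account on the robot laptop is higgs:

```xml
<machine name="higgs" address="higgs" user="higgs"
         env-loader="/opt/ros/fuerte/env.sh" />
```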