
skywalker's profile - activity

2017-03-20 17:07:04 -0500 received badge  Notable Question (source)
2017-03-20 17:07:04 -0500 received badge  Famous Question (source)
2017-02-12 15:00:18 -0500 received badge  Supporter (source)
2017-02-08 10:41:16 -0500 commented answer Datatype to access pointcloud2

@peci1 Thanks.

2017-02-08 09:51:24 -0500 commented answer Datatype to access pointcloud2

@peci1 Can you be more specific about the time it takes? My point cloud is definitely under 100k points.

2017-02-07 14:17:48 -0500 commented answer Datatype to access pointcloud2

Thanks for your reply @peci1, it only worked when I added #include <tf2_sensor_msgs/tf2_sensor_msgs.h> to my code. I have another question though. Is it normal that doTransform() takes so long? It takes around 0.44 seconds and I have a pretty decent computer. Is pcl_ros faster?
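
For reference, a minimal sketch of the kind of node I ended up with (the topic names and the "base_link" target frame below are placeholders, not my actual configuration). Including tf2_sensor_msgs/tf2_sensor_msgs.h is what provides the doTransform() overload for sensor_msgs::PointCloud2, and the package also has to depend on tf2_sensor_msgs (not just tf2_msgs) in package.xml and CMakeLists.txt.

// Rough sketch of a PointCloud2 transformer node.
// "cloud_in", "cloud_out" and "base_link" are placeholder names.
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <geometry_msgs/TransformStamped.h>
#include <tf2_ros/transform_listener.h>
#include <tf2_sensor_msgs/tf2_sensor_msgs.h>  // doTransform() for PointCloud2 lives here

class CloudTransformer
{
public:
  CloudTransformer() : tf_listener_(tf_buffer_)
  {
    pub_ = nh_.advertise<sensor_msgs::PointCloud2>("cloud_out", 1);
    sub_ = nh_.subscribe("cloud_in", 1, &CloudTransformer::cloudCallback, this);
  }

private:
  void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg)
  {
    geometry_msgs::TransformStamped transform;
    try
    {
      // Wait up to 100 ms for the transform at the cloud's timestamp.
      transform = tf_buffer_.lookupTransform("base_link", msg->header.frame_id,
                                             msg->header.stamp, ros::Duration(0.1));
    }
    catch (const tf2::TransformException& ex)
    {
      ROS_WARN("%s", ex.what());
      return;
    }

    sensor_msgs::PointCloud2 cloud_out;
    tf2::doTransform(*msg, cloud_out, transform);  // transforms every point into base_link
    pub_.publish(cloud_out);
  }

  ros::NodeHandle nh_;
  tf2_ros::Buffer tf_buffer_;
  tf2_ros::TransformListener tf_listener_;
  ros::Publisher pub_;
  ros::Subscriber sub_;
};

int main(int argc, char** argv)
{
  ros::init(argc, argv, "cloud_transformer");
  CloudTransformer node;
  ros::spin();
  return 0;
}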

2017-02-06 21:09:13 -0500 commented answer Datatype to access pointcloud2

I'm getting this error:

/home/tarik/ros_ws/src/object_seg/src/cloud_transformer.cpp:6:29: fatal error: tf2_sensor_msgs.h: No such file or directory

#include <tf2_sensor_msgs.h>

I've tried adding tf2_msgs to my package.xml and CMakeLists.txt.

2016-09-19 22:29:44 -0500 received badge  Popular Question (source)
2016-06-13 09:04:09 -0500 asked a question slow updates of links of a robot in RViz [RobotModel]

Hello,

I have written a description package for the iCub humanoid robot. It basically consists of a URDF model of the robot for visualization in RViz and a ROS-YARP bridge which publishes the positions of all joints on the robot. (For now the bridge takes the data from the robot simulator.)

The /joint_states topic is published at 50 Hz by the bridge, and I am running a launch file with robot_state_publisher as follows:

<?xml version="1.0"?>
<launch>

  <!-- Load the URDF into the ROS Parameter Server -->
  <param name="robot_description" command="cat $(find icub_description)/urdf/icub.xml" />
  <!-- <param name="publish_frequency" value="200" /> -->

  <node name="icub_tf_publisher" pkg="icub_description" type="icub_tf_publisher" />

  <node name="robot_state_publisher" pkg="robot_state_publisher" type="state_publisher" />

  <!-- Launch rviz node for visualization -->
  <node name="rviz" pkg="rviz" type="rviz" args="-d $(find icub_description)/rviz/display_urdf.rviz" />
</launch>

But in this situation the RobotModel in RViz follows the simulator with almost a 1-second delay, even when the robot moves only one joint. I have looked at every topic (joint_states, tf, etc.) and they are all publishing the data at the desired rate.

Do you have any idea why the RobotModel lags? The package can be found here.

Edit: According to the tf frame tree, most of the tf data is published at 1.25 Hz, which is very slow. Only base_link is published at 50 Hz, so the problem is somehow related to robot_state_publisher.
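
For completeness, the publishing side of the bridge is roughly the following sketch (the joint names and values are placeholders; the real bridge reads them from YARP):

// Simplified sketch of the bridge's publishing loop.
// Joint names and positions are placeholders; the real bridge reads them from the simulator via YARP.
#include <ros/ros.h>
#include <sensor_msgs/JointState.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "icub_tf_publisher");
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<sensor_msgs::JointState>("joint_states", 10);

  ros::Rate rate(50);  // 50 Hz, the rate mentioned above
  while (ros::ok())
  {
    sensor_msgs::JointState msg;
    msg.header.stamp = ros::Time::now();  // robot_state_publisher uses this stamp for its tf output
    msg.name.push_back("torso_pitch");    // placeholder joint name
    msg.position.push_back(0.0);          // placeholder value
    pub.publish(msg);
    rate.sleep();
  }
  return 0;
}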

2015-08-10 13:34:10 -0500 answered a question ERROR: Laser has to be mounted planar

This answer on the V-REP forum might help you: http://www.forum.coppeliarobotics.com...

2015-07-16 12:31:06 -0500 commented answer Skeletal tracker for Asus Xtion Pro Live

Hi Chaos, thank you for your edits. Everything works fine except the video stream: I do get a video stream, but the frequency on the /camera/rgb/image topic is around 4 Hz, when it is supposed to be around 30 Hz. Do you get the proper frequency?

2015-06-24 14:52:10 -0500 received badge  Enthusiast
2015-04-24 14:21:45 -0500 received badge  Editor (source)
2015-04-24 12:04:49 -0500 answered a question Why is the step to download the KinectSensor binaries needed?

Because the TurtleBot Indigo software is configured to work with the Asus Xtion Pro. You need to download the Microsoft Kinect drivers and set the environment variable for the Kinect. You can find the drivers here and a tutorial for installing them here.

After installing the driver, go to this link and configure the environment variable for the Kinect.

I don't know why they didn't mention the KinectSensor driver in the official TurtleBot documentation.

2015-04-03 13:53:46 -0500 commented answer Kinect Installation and Setup on ROS [Updated]

I get the same log message, but it didn't affect anything; I can run RViz without problems. No idea why it appears.

2015-03-29 07:04:26 -0500 commented answer No matching device found error for gmapping_demo.launch (turtlebot)

Actually, I solved the problem by re-arranging the environment paths for the Kinect as described here.

2015-03-29 07:03:07 -0500 received badge  Famous Question (source)
2015-03-27 10:27:08 -0500 received badge  Notable Question (source)
2015-03-27 09:38:32 -0500 received badge  Popular Question (source)
2015-03-18 15:13:27 -0500 asked a question No matching device found error for gmapping_demo.launch (turtlebot)

Note: I solved the problem. It seems I had set the environment paths for the Kinect wrong; the fix is described here.

I am encountering the following error when I try to run:

roslaunch turtlebot_navigation gmapping_demo.launch

Error:

[ INFO] [1426701074.584654164]: No matching device found.... waiting for devices. Reason: std::string openni2_wrapper::OpenNI2Driver::resolveDeviceURI(const string&) @ /tmp/buildd/ros-indigo-openni2-camera-0.2.3-0trusty-20150221-1448/src/openni2_driver.cpp @ 623 : Invalid device number 1, there are 0 devices connected.

The strange thing is that when I run:

roslaunch openni_launch openni.launch

I don't get any errors and I am able to see the RGB and depth output of the Kinect sensor.

I think this shows that I installed the Kinect driver properly, but there is something wrong with gmapping_demo that prevents it from finding the Kinect on the USB ports.

As suggested here, I installed freenect_launch and ran:

roslaunch freenect_launch freenect.launch

Now I am able to get RGB and depth data in RViz from the TurtleBot's Kinect (on the workstation). Because gmapping_demo uses openni_launch by default, I need to change the launch file, but I am not quite sure that will make gmapping_demo use freenect_launch.

My setup: Asus X201E netbook for the TurtleBot, Lenovo E440 for the workstation. Both have Ubuntu 14.04.02 with Indigo.

2015-03-18 15:13:26 -0500 commented answer Kinect Installation and Setup on ROS [Updated]

Actually, it says that openni2 does not support any of the Kinects; openni should support them. By the way, I was encountering the same problem, but freenect_launch solved it and I am able to get RGB and depth data. On the downside, I have to change the launch files that call openni_launch by default to use freenect instead.