
Grega Pusnik's profile - activity

2021-08-10 06:50:56 -0500 received badge  Nice Question (source)
2016-03-02 18:22:12 -0500 marked best answer AutonomyLab Ardrone wind_comp_angle

Hey,

Can anybody explain what exactly this wind_comp_angle value represents and how to use it?

I'm having a problem when issuing a command to change height only: the drone often drifts away because of the wind and does not stabilize itself. I guess I should use wind_comp_angle to compensate for the wind and stabilize the drone.

Thanks for the help.

2016-02-25 05:41:54 -0500 received badge  Great Answer (source)
2016-02-25 05:41:54 -0500 received badge  Guru (source)
2015-11-26 03:25:02 -0500 marked best answer Synchronizing multiple robots from 1 machine and 1 ros master

Hey,

Are there any tips or common practices for synchronizing 2 or more robots so they perform the same "move"? Example: when all robots are started up and waiting, they all go forward for 10 seconds, starting at the same time.

I was thinking of 1 master node and X robot nodes. The robot nodes report their state to the master node by calling the master node's service. The master node waits until all robot nodes are ready; then it publishes a msg on a topic. The robot nodes constantly check this topic, and when the msg is published they start the move. Should this be synchronous enough?

Are there any other ways to synchronize moves, or maybe a simpler one than the approach described above?

Thanks

2015-06-30 19:02:03 -0500 marked best answer setting min depth in openni_kinect

How do you set the minimum acceptable depth in openni_kinect? The problem shows up when trying to simulate a Turtlebot and build maps with SLAM: the min depth is set too low, so the simulated laser picks up the Turtlebot itself, resulting in poorly built maps.

Or can I adjust minimum depth somewhere else?

Thank you.

2015-01-21 01:34:53 -0500 received badge  Famous Question (source)
2014-05-07 03:44:20 -0500 received badge  Favorite Question (source)
2014-01-28 17:30:15 -0500 marked best answer Control multiple Ar Drones

Has anyone been able to control multiple AR Drones with any of the ROS drivers?

I'm trying to fly them using https://github.com/AutonomyLab/ardrone_autonomy

But I cannot get multiple drone nodes started. The drones are connected to one router and I can ping them all. I'm having the issue described here:

https://github.com/AutonomyLab/ardrone_autonomy/issues/56

Thanks for the help.

2014-01-28 17:27:51 -0500 marked best answer installing ROS from source, actionlib error

I'm trying to install ROS from source. The first problem I ran into is that the git repo for actionlib does not exist any more.

Exception caught during install: Error processing 'actionlib' : [actionlib] Checkout of git://github.com/wg-debs/actionlib-release.git version debian/ros-fuerte-actionlib_1.8.6_lucid into /home/gloin/ros-underlay/actionlib failed.

Does anybody know of another repo for actionlib, or how to continue with the installation process?

Thank you.

2014-01-28 17:27:43 -0500 marked best answer turtlebot.eu power board and kinect

Hey,

I have a question regarding the turtlebot.eu power board. I don't have any external battery, so everything is powered through the Roomba's battery. The Kinect is connected to the 12 V DC power connector (spec'd at max 350 mA) on the power board.

Everything works normally: I can control the Roomba and get the video feed from the Kinect cameras. But I do have a problem with the point cloud data. It starts out completely normally; I receive the point cloud for approximately a minute, and after that it just stops. There are no error messages, the image just freezes. The Roomba works the whole time without problems (teleop, dashboard). It seems like openni freezes. Is it possible that the Kinect doesn't get enough power and that I should use an external battery? Or is there maybe a problem with the openni kinect driver?

The same problem occurs when I try to build a SLAM map: it works normally for about a minute, then the laser topic stops publishing. Since the laser scan is built from the point cloud, I think the problems are connected.

Also, I notice that when I watch only the IR stream from the Kinect, the IR light on the Kinect flashes slowly; it is not constantly lit, as if the IR camera were switching between ON and OFF. Is that normal?

Does anybody have the same problem? I start turtlebot with

roslaunch turtlebot_bringup minimal.launch

and start openni with

roslaunch openni_launch openni.launch

Then I run

rosrun rviz rviz

to get the image.

Thank you for your help.

2014-01-28 17:27:29 -0500 marked best answer ppl_detection package and training data

Do you know if you have to train the ppl_detection package, or should it work at its best out of the box? I get very poor results: almost every object is detected as a human.

Thank you.

2014-01-28 17:26:39 -0500 marked best answer pcl_tracking in perception_pcl_electric_unstable gives nodelet error

Hey,

I'm trying to implement tracking using the pcl::tracking library. Everything builds normally, but when I try to run the nodelet I get an error:

    Failed to load nodelet [/Detection_Tracking] of type [Detector/Detection_Tracking]:
    Failed to load library /home/grega/ros_workspace/diploma/Detector/lib/libDetector.so.
    Make sure that you are calling the PLUGINLIB_REGISTER_CLASS macro in the library code,
    and that names are consistent between this macro and your XML.
    Error string: Cannot load library: /home/grega/ros_workspace/diploma/Detector/lib/libDetector.so:
    undefined symbol: _ZN3pcl8tracking21ParticleFilterTrackerINS_12PointXYZRGBAENS0_14ParticleXYZRPYEE11initComputeEv
    [FATAL] [1342859395.359533594]: Service call failed!

Do I have to somehow link libpcl_tracking.so in CMakeLists.txt? Shouldn't that be done automatically when I add the pcl dependency in manifest.xml?
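For reference, the undefined-symbol error above is the kind that explicit linking typically fixes, since not every PCL sub-library is pulled in by the default dependency flags. A hedged sketch of what that might look like in a rosbuild CMakeLists.txt (the target name Detector and the source path are guesses based on the library name in the error):

```cmake
# Build the nodelet library, then link the PCL tracking sub-library explicitly.
rosbuild_add_library(Detector src/detector.cpp)
target_link_libraries(Detector pcl_tracking)
```

Whether this is needed depends on which PCL components the manifest dependency actually exports to the link line.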

I'm using Ubuntu and ROS Electric.

Thanks for your help.

2014-01-28 17:26:32 -0500 marked best answer pcl down sampling and filtering - retaining width and height (organized cloud)

Is it possible to retain the width and height (an organized cloud) after doing some operations with the PCL library? I am doing downsampling with VoxelGrid and then PassThrough filtering. In every step I lose the width and height data and get an unorganized point cloud, even though I start with an organized cloud from the Kinect.

Thank you for your help.

2014-01-28 17:25:57 -0500 marked best answer Unscented Kalman Filter - Bayes++

Hey,

Does anybody have any experience including the UKF from Bayes++ in ROS? I can't get it to build.

I'm constantly getting errors like:

...Bayes++/BayesFilter/bayesFlt.hpp:588: undefined reference to `VTT for Bayesian_filter::Unscented_scheme'

Any tips are welcome.

2014-01-28 17:25:31 -0500 marked best answer Euclidean Cluster Extraction

I'm trying to extract clusters from a point cloud, following this tutorial:

PCL Tutorial

Everything works fine; the point clouds even show up if I save them to a .pcd file, but when I try to view them in rviz nothing is shown. If I subscribe to the appropriate topic, no points are received. It says status: error, Transform: For frame []: Frame [] does not exist.

The funny thing is that if I do rostopic echo clusters, I see that points are generated on that topic. Do I have to add any post-processing to the extracted clusters to see them in rviz?

I'm using Ubuntu, ROS Electric and perception_pcl_electric_unstable.

Here is some of my code:

sensor_msgs::PointCloud2::Ptr clusters (new sensor_msgs::PointCloud2);
pcl::PointCloud<pcl::PointXYZ>::Ptr input_cloud (new pcl::PointCloud<pcl::PointXYZ>);
pcl::PointCloud<pcl::PointXYZ>::Ptr clustered_cloud (new pcl::PointCloud<pcl::PointXYZ>);

/* Creating the KdTree from the input point cloud */
pcl::search::KdTree<pcl::PointXYZ>::Ptr tree (new pcl::search::KdTree<pcl::PointXYZ>);
tree->setInputCloud (input_cloud);

/* Here we create a vector of PointIndices, which contains the actual index
 * information in a vector<int>. The indices of each detected cluster are saved here.
 * cluster_indices is a vector containing one instance of PointIndices for each detected
 * cluster; cluster_indices[0] contains all indices of the first cluster in the input cloud.
 */
std::vector<pcl::PointIndices> cluster_indices;
pcl::EuclideanClusterExtraction<pcl::PointXYZ> ec;
ec.setClusterTolerance (0.06);
ec.setMinClusterSize (30);
ec.setMaxClusterSize (600);
ec.setSearchMethod (tree);
ec.setInputCloud (input_cloud);
/* Extract the clusters and save their indices in cluster_indices. */
ec.extract (cluster_indices);

/* To separate each cluster out of the vector<PointIndices> we iterate
 * through cluster_indices, create a new PointCloud for each entry and
 * write all points of the current cluster into that PointCloud.
 */
std::vector<pcl::PointIndices>::const_iterator it;
std::vector<int>::const_iterator pit;
for (it = cluster_indices.begin(); it != cluster_indices.end(); ++it) {
    pcl::PointCloud<pcl::PointXYZ>::Ptr cloud_cluster (new pcl::PointCloud<pcl::PointXYZ>);
    for (pit = it->indices.begin(); pit != it->indices.end(); ++pit) {
        // push_back: add a point to the end of the existing vector
        cloud_cluster->points.push_back (input_cloud->points[*pit]);
    }

    // Merge the current cluster into the whole point cloud
    *clustered_cloud += *cloud_cluster;
}

pcl::toROSMsg (*clustered_cloud, *clusters);
publish_clusters.publish (*clusters);
2014-01-28 17:25:16 -0500 marked best answer leg_detector and kinect

Hey,

I am trying to run the leg_detector2 package from here:

http://code.google.com/p/wu-robotics/source/browse/#svn%2Ftrunk%2Fpeople2

Has anyone had success running it with a Kinect? I think that should not be the problem; I successfully remapped the Kinect's laser scan to base_scan. But now I don't know how to read or visualize the detected legs or any other data produced by leg_detector.

I would really appreciate it if anybody has any tips.

Thank you.

2014-01-28 17:24:48 -0500 marked best answer turtlebot + gazebo + simulated kinect

Hello,

I'm having a problem reading the simulated Kinect raw or RGB image. I'm running Electric with turtlebot_gazebo on Ubuntu 10.04. I don't get any of the data needed to build a point cloud in rviz. It is as if the Kinect is not publishing to any of the topics, even though I see the /cloud_throttled [sensor_msgs/PointCloud2] topic.

The only thing I can read (I assume from the Kinect?) and visualize in rviz is the topic /scan [sensor_msgs/LaserScan].

Oh, and one quick question: should the turtlebot dashboard work in a simulated environment? Because if I call rosrun turtlebot_dashboard turtlebot_dashboard, the dashboard does come up, but only the middle icon under Diagnostic is green.

Thank you for your help!

2014-01-28 17:24:46 -0500 marked best answer simulated turtlebot kinect data

Hello,

Is it possible to get a simulated raw image or create a point cloud in rviz with a simulated Kinect and a simulated Turtlebot "out of the box"? I can only get laser data posted on the /scan topic. I tried to build a point cloud, but there is no published topic to do that. Is that even possible in a simulated environment, or do I have to reroute some of the data to another topic?

I'm using Ubuntu 11.10 and Electric.

I would really appreciate any tips because I'm new to ROS :)

Thank you.

2014-01-28 17:24:44 -0500 marked best answer Simulated turtlebot dashboard problem

I'm new to ROS and Turtlebot, so I have a very basic question: should the turtlebot dashboard work with a simulated Turtlebot? Because when I run:

rosrun turtlebot_dashboard turtlebot_dashboard

The dashboard opens, but only the middle button under Diagnostic is green; the others are gray. Also, when I click on the "wrench" under Diagnostic, I get info that everything is Stale. Is this normal in a simulated environment?

Thank you for your help!