
toniOliver's profile - activity

2016-09-28 22:00:53 -0500 received badge  Favorite Question (source)
2016-09-27 00:57:05 -0500 received badge  Famous Question (source)
2016-03-03 05:02:51 -0500 received badge  Popular Question (source)
2016-03-03 05:02:51 -0500 received badge  Famous Question (source)
2016-03-03 05:02:51 -0500 received badge  Notable Question (source)
2015-08-11 15:23:16 -0500 received badge  Famous Question (source)
2014-07-15 21:35:58 -0500 received badge  Popular Question (source)
2014-07-15 21:35:58 -0500 received badge  Famous Question (source)
2014-07-15 21:35:58 -0500 received badge  Notable Question (source)
2014-03-25 13:16:02 -0500 received badge  Nice Question (source)
2014-01-28 17:28:39 -0500 marked best answer How can I install openrave package for Fuerte?

I'm trying to install the openrave ROS package for Fuerte on Ubuntu 12.04, but I can't find it (nothing looking like ros-fuerte-openrave... in the available packages in the repositories).

The wiki page http://ros.org/wiki/openrave says "Cannot load information on name: openrave, distro: fuerte", so I don't know where I should get the source code from to install the package from source.

Does anyone know how could I install this package?

Edit: I've found this question http://answers.ros.org/question/32866/installing-openrave-package/ where the code of the openrave_planning stack (which happens to contain the package I need) is checked out from:

svn co https://jsk-ros-pkg.svn.sourceforge.net/svnroot/jsk-ros-pkg/trunk/openrave_planning

I assume this is the right repo, but the wiki (http://www.ros.org/wiki/openrave_planning) still doesn't point to the right data, and no package for this stack exists in the ROS Fuerte Ubuntu repositories.

2014-01-28 17:27:51 -0500 marked best answer Access to ros_comm source code in fuerte

I've noticed that in Fuerte, when you install the package ros-fuerte-ros-comm (and other low-level ROS packages), the installation follows the FHS layout and the source code is no longer installed.

What is the easiest way to have access to this source code? (Somehow making sure that the source and my installed code are the same version).

2014-01-28 17:27:50 -0500 marked best answer Is it possible to call a ros service from another service's callback function in the same node?

I have a node advertising several services (A, B, C), and I want the service callback function for service A to call services B and C. Is this allowed?

From my first tests it seems that this blocks something in the underlying service mechanism. Maybe the different services in a node are served sequentially using a single thread?

I know doing this may sound strange, and not the right way to do it, but this node is quite a complex EtherCAT realtime node, and the C++ object that serves A doesn't have easy access to the objects that serve B and C.
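
For reference, a minimal sketch of the situation, with std_srvs/Empty standing in for the real service types (this is not the actual node): with the default single-threaded spinner the nested call blocks, while a spinner with more than one thread (e.g. ros::AsyncSpinner) lets B be served while A's callback is still waiting.

#include <ros/ros.h>
#include <std_srvs/Empty.h>

bool serveB(std_srvs::Empty::Request&, std_srvs::Empty::Response&)
{
  return true;
}

bool serveA(std_srvs::Empty::Request&, std_srvs::Empty::Response&)
{
  // Nested call: with a single-threaded spinner this never returns, because
  // the only spinner thread is busy right here and cannot serve B.
  std_srvs::Empty srv;
  return ros::service::call("B", srv);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "nested_services_example");
  ros::NodeHandle nh;
  ros::ServiceServer a = nh.advertiseService("A", serveA);
  ros::ServiceServer b = nh.advertiseService("B", serveB);

  // Two spinner threads: B can be served while A's callback is still
  // blocked on the nested call.
  ros::AsyncSpinner spinner(2);
  spinner.start();
  ros::waitForShutdown();
  return 0;
}
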

2014-01-28 17:24:48 -0500 marked best answer Problem with the reference frame of grasps in the household_objects_database

I'm using the manipulation stack in ROS electric, in a simulated environment in Gazebo. I have a simulated robot arm and hand, a simulated table and a coke can, and a simulated kinect.

Object recognition is working well with the coke can, from the model in the household_objects_database (using the image from the simulated kinect).

But the problem is that when I want to use the object_manipulator, the grasps for that object are read from the database and represented in a weird and unexpected way.

The grasps (and pregrasps) for the coke can are in the grasp table of the DB with different positions, but all of them have the identity quaternion (0,0,0,1) for orientation, so no rotation should be applied. But when trying to pick up the object with object_manipulator, the grasps are read from the database and the markers representing the grasps are more or less (but not exactly) in the expected position, yet they have a completely unexpected orientation, as can be seen in the following two images.

I thought that the parent frame of the grasps would be the frame of the recognised object (/object_frame in this case), but the orientation of the represented grasps is different from that.

Can anyone tell me what's the reference frame of the grasps, or what can I be doing wrong?

Screenshot of the grasps with no rotation

Screenshot of the grasps with no rotation. Top view
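
As a side note, this is a minimal sketch (not from the original setup) of how one might check which frame the grasps are really expressed in, by transforming a grasp pose into /object_frame with tf; the grasp_pose argument is hypothetical.

#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <geometry_msgs/PoseStamped.h>

// Transform a grasp pose (in whatever frame it was published in) into the
// recognised object's frame, so its orientation can be compared directly.
geometry_msgs::PoseStamped graspInObjectFrame(const geometry_msgs::PoseStamped& grasp_pose,
                                              tf::TransformListener& listener)
{
  geometry_msgs::PoseStamped pose_in_object_frame;
  listener.waitForTransform("/object_frame", grasp_pose.header.frame_id,
                            grasp_pose.header.stamp, ros::Duration(1.0));
  listener.transformPose("/object_frame", grasp_pose, pose_in_object_frame);
  return pose_in_object_frame;
}
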

2014-01-28 17:23:09 -0500 marked best answer How to use NODELET_DEBUG()

Probably this is a stupid question, but I couldn't find the answer in Google or in the ROS wiki information about nodelets.

I'm using a launch file to launch a nodelet (in fact I've been using different nodelets) which is using NODELET_DEBUG(...) and NODELET_INFO(...) to output information. But in the console I only see the INFO output.

In http://www.ros.org/wiki/nodelet they say that "They also have the advantage that you can turn one specific nodelet into debug, instead of all nodelets of a specific type".

How can I see the DEBUG output? Is there a way to configure it in the launch file?

Thanks in advance
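
For context, a minimal sketch of the kind of nodelet described above, using both macros. The package and class names are made up, and the export macro is the newer pluginlib form; the DEBUG line only appears once that nodelet's named logger is switched to the DEBUG level (for example through the logger-level GUI or a rosconsole configuration).

#include <nodelet/nodelet.h>
#include <pluginlib/class_list_macros.h>

namespace example_pkg  // hypothetical package name
{

class ExampleNodelet : public nodelet::Nodelet
{
private:
  virtual void onInit()
  {
    // Only printed when this nodelet's logger is set to DEBUG.
    NODELET_DEBUG("onInit: debug-level message");
    // Printed at the default (INFO) level.
    NODELET_INFO("onInit: info-level message");
  }
};

}  // namespace example_pkg

// Newer pluginlib export macro; older distros used PLUGINLIB_DECLARE_CLASS.
PLUGINLIB_EXPORT_CLASS(example_pkg::ExampleNodelet, nodelet::Nodelet)
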

2014-01-28 17:23:04 -0500 marked best answer Openni problem in Debian. Kinect not detected

Hi,

I installed OpenNI to work with a Kinect on a Debian Squeeze. I followed the instructions in http://www.ros.org/wiki/openni_kinect to build and install a package for Debian, and after that I used openni.launch to start working with the Kinect, but I get a "No devices detected" message.

The Kinect seems to be detected by the system when I run lsusb (Xbox NUI Motor?).

Can anybody help me? Thanks.

lsusb
Bus 009 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 008 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 005 Device 002: ID 045e:0040 Microsoft Corp. Wheel Mouse Optical
Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 004 Device 002: ID 045e:0752 Microsoft Corp. 
Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
Bus 003 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 005: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
Bus 001 Device 003: ID 0409:005a NEC Corp. HighSpeed Hub
Bus 001 Device 002: ID 041e:4083 Creative Technology, Ltd Live! Cam Socialize [VF0640]
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

The result of the launch is:

    roslaunch openni_launch openni.launch
... logging to /home/toni/.ros/log/d5cdbf70-0472-11e1-9ff1-bcaec5bdea73/roslaunch-hugoros-3696.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
WARNING: disk usage in log directory [/home/toni/.ros/log] is over 1GB.
It's recommended that you use the 'rosclean' command.

started roslaunch server http://10.1.1.236:45159/

SUMMARY
========

PARAMETERS
 * /rosdistro
 * /camera/driver/rgb_frame_id
 * /camera/driver/rgb_camera_info_url
 * /camera/depth_registered/rectify_depth/interpolation
 * /camera/driver/depth_frame_id
 * /camera/depth/rectify_depth/interpolation
 * /rosversion
 * /camera/driver/device_id
 * /camera/driver/depth_camera_info_url

NODES
  /camera/depth/
    rectify_depth (nodelet/nodelet)
    metric_rect (nodelet/nodelet)
    metric (nodelet/nodelet)
    disparity (nodelet/nodelet)
    points (nodelet/nodelet)
  /camera/rgb/
    rectify_mono (nodelet/nodelet)
    rectify_color (nodelet/nodelet)
  /
    camera_nodelet_manager (nodelet/nodelet)
    camera_base_link (tf/static_transform_publisher)
    camera_base_link1 (tf/static_transform_publisher)
    camera_base_link2 (tf/static_transform_publisher)
    camera_base_link3 (tf/static_transform_publisher)
  /camera/
    driver (nodelet/nodelet)
    register_depth_rgb (nodelet/nodelet)
    points_xyzrgb_depth_rgb (nodelet/nodelet)
  /camera/ir/
    rectify_ir (nodelet/nodelet)
  /camera/depth_registered/
    rectify_depth (nodelet/nodelet)
    metric_rect (nodelet/nodelet)
    metric (nodelet/nodelet)
    disparity (nodelet/nodelet)

ROS_MASTER_URI=http://10.1.1.236:11311

core service [/rosout] found
process[camera_nodelet_manager-1]: started with pid [3714]
process[camera/driver-2]: started with pid [3715]
process[camera/rgb/rectify_mono-3]: started with pid [3716]
process[camera/rgb/rectify_color-4]: started with pid [3717]
process[camera/ir/rectify_ir-5]: started with pid [3718]
process[camera/depth/rectify_depth-6]: started with pid [3719]
process[camera/depth/metric_rect-7]: started with pid [3720]
process[camera/depth/metric-8]: started with pid [3721]
process[camera/depth/disparity-9]: started with pid [3722]
process[camera/depth/points-10]: started with pid [3723 ...
2014-01-28 17:23:03 -0500 marked best answer How to get a point cloud from kinect

Is there an easy way to get a segmented point cloud from the kinect based on one color?

2014-01-28 17:22:59 -0500 marked best answer Subscribe to a topic using an overloaded callback function

I get a compilation error when trying to subscribe to a ROS topic using an overloaded callback function. It looks like the compiler doesn't know which function to pick.

Is there any way to solve it?

At compile time I don't know which of several topics my node will subscribe to. The types of the possible topics are different, so I need different functions to receive the data. Is there any way of knowing the data type of a topic other than calling 'rostopic type [topic]' inside the node?

In the following snippet I want to subscribe to the "inputs" topic, but I don't know which topic will be mapped to it from the command line. I want to consider two possible topics with different data types, "sr_robot_msgs::JointControllerState" and "pr2_controllers_msgs::JointControllerState", so how can I retrieve this data type information from the topic at runtime?

MovementPublisher::MovementPublisher(double min_value, double max_value,
                                     double rate, unsigned int repetition,
                                     unsigned int nb_mvt_step, std::string controller_type)
{
  ...
  sub_ = nh_tilde.subscribe("inputs", nb_mvt_step, &MovementPublisher::calculateErrorCallback, this);
}

void MovementPublisher::calculateErrorCallback(const sr_robot_msgs::JointControllerState::ConstPtr& msg)
{
  double error = msg->set_point - msg->process_value;
  ROS_DEBUG_STREAM("Error: " << error);
}

void MovementPublisher::pr2_calculateErrorCallback(const pr2_controllers_msgs::JointControllerState::ConstPtr& msg)
{
  double error = msg->set_point - msg->process_value;
  ROS_DEBUG_STREAM("Error: " << error);
}
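
A minimal sketch, not from the original code, of two things that would often apply here (assuming the class and message types from the snippet above): if the two callbacks did share one overloaded name, the member-function pointer can be cast to pin down which overload subscribe() should take; and the datatype of a topic can be queried from the master at runtime with ros::master::getTopics instead of shelling out to rostopic type.

// Inside the constructor shown above, assuming both callbacks were named
// calculateErrorCallback (i.e. genuinely overloaded):
typedef void (MovementPublisher::*SrCallback)(
    const sr_robot_msgs::JointControllerState::ConstPtr&);

sub_ = nh_tilde.subscribe("inputs", nb_mvt_step,
                          static_cast<SrCallback>(&MovementPublisher::calculateErrorCallback),
                          this);

// Runtime lookup of topic datatypes via the master (needs #include <ros/master.h>):
ros::master::V_TopicInfo topics;
if (ros::master::getTopics(topics))
{
  for (size_t i = 0; i < topics.size(); ++i)
    ROS_INFO_STREAM(topics[i].name << " has type " << topics[i].datatype);
}
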
2013-12-06 03:21:52 -0500 asked a question Are pr2_ethercat and pr2_ethercat_drivers going to be migrated to use the new ros_control?

I'd like to know if pr2_ethercat and pr2_ethercat_drivers are going to be migrated to use the new ros_control (http://wiki.ros.org/ros_control). And if so, when is it planned to happen?

2013-07-25 17:52:30 -0500 received badge  Nice Answer (source)
2013-07-25 17:36:34 -0500 received badge  Teacher (source)
2013-07-25 08:42:11 -0500 answered a question How to write a callback function for Int8Multiarray messages subscription

Bear in mind that array->data is a std::vector, not a C++ array, so you can't assign it to an int*.

And to access the elements in the vector inside the loop you need to do:

array->data[i]
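
For illustration, a minimal self-contained subscriber along those lines; the node and topic names are made up.

#include <ros/ros.h>
#include <std_msgs/Int8MultiArray.h>

void arrayCallback(const std_msgs::Int8MultiArray::ConstPtr& array)
{
  // array->data is a std::vector<int8_t>; index it directly instead of
  // treating it as a raw int* buffer.
  for (size_t i = 0; i < array->data.size(); ++i)
    ROS_INFO("element %zu: %d", i, static_cast<int>(array->data[i]));
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "int8_array_listener");  // hypothetical node name
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("array_topic", 10, arrayCallback);  // hypothetical topic
  ros::spin();
  return 0;
}
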

2013-06-24 12:47:48 -0500 received badge  Notable Question (source)
2013-05-21 11:22:15 -0500 received badge  Famous Question (source)
2013-05-19 04:47:53 -0500 received badge  Famous Question (source)
2013-05-18 10:41:30 -0500 marked best answer Is there some filter in PCL to segment kinect point cloud based on color?

Hi. Is there some filter that allows segmenting a Kinect point cloud based on the RGB color of the points?

I've tried with a passthrough filter, but it doesn't work with RGB, only with position. I've also found this answer explaining why.

http://www.pcl-users.org/Does-the-passthrough-filter-can-be-applied-in-PointXYZRGB-s-color-field-td3202813.html

Thanks in advance
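
For what it's worth, here is a minimal sketch of color-based filtering with pcl::ConditionalRemoval and pcl::PackedRGBComparison, assuming a PCL version where the condition can be passed to the filter's constructor; the thresholds are arbitrary example values.

#include <pcl/point_types.h>
#include <pcl/point_cloud.h>
#include <pcl/filters/conditional_removal.h>

// cloud is assumed to be an XYZRGB cloud filled elsewhere (e.g. from the Kinect driver).
pcl::PointCloud<pcl::PointXYZRGB>::Ptr
filterByColor(const pcl::PointCloud<pcl::PointXYZRGB>::Ptr& cloud)
{
  // Keep only points whose red channel is above 200 and green channel below 80
  // (arbitrary thresholds for a "mostly red" segment).
  pcl::ConditionAnd<pcl::PointXYZRGB>::Ptr cond(new pcl::ConditionAnd<pcl::PointXYZRGB>());
  cond->addComparison(pcl::PackedRGBComparison<pcl::PointXYZRGB>::ConstPtr(
      new pcl::PackedRGBComparison<pcl::PointXYZRGB>("r", pcl::ComparisonOps::GT, 200)));
  cond->addComparison(pcl::PackedRGBComparison<pcl::PointXYZRGB>::ConstPtr(
      new pcl::PackedRGBComparison<pcl::PointXYZRGB>("g", pcl::ComparisonOps::LT, 80)));

  pcl::PointCloud<pcl::PointXYZRGB>::Ptr filtered(new pcl::PointCloud<pcl::PointXYZRGB>());
  pcl::ConditionalRemoval<pcl::PointXYZRGB> removal(cond);
  removal.setInputCloud(cloud);
  removal.filter(*filtered);
  return filtered;
}
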

2013-03-27 08:32:07 -0500 answered a question How do I set the goalID in actionlib?

When you invoke sendGoal in the simple action client, the GoalID and timestamp are autogenerated, and a goal handle object for that goal is stored internally.

This goal handle allows the action client to obtain information about the state of this goal, cancel it, etc. (internally it uses the GoalID and timestamp, but you don't need to care about them).

The GoalID is defined in the autogenerated ActionGoal message that wraps the goal message.
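
As a concrete sketch (using move_base as an example action, which is not part of the original answer), this is all the caller usually touches; the GoalID never appears.

#include <ros/ros.h>
#include <actionlib/client/simple_action_client.h>
#include <move_base_msgs/MoveBaseAction.h>  // example action type

int main(int argc, char** argv)
{
  ros::init(argc, argv, "goal_id_example");

  actionlib::SimpleActionClient<move_base_msgs::MoveBaseAction> client("move_base", true);
  client.waitForServer();

  move_base_msgs::MoveBaseGoal goal;
  goal.target_pose.header.frame_id = "map";       // example values only
  goal.target_pose.pose.orientation.w = 1.0;

  // sendGoal() wraps this goal in the autogenerated MoveBaseActionGoal and
  // fills in the GoalID and timestamp; the client tracks them internally.
  client.sendGoal(goal);

  // State queries and cancellation reuse that stored ID.
  client.waitForResult(ros::Duration(10.0));
  actionlib::SimpleClientGoalState state = client.getState();
  ROS_INFO_STREAM("Goal state: " << state.toString());
  client.cancelGoal();
  return 0;
}
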