
TomTUM's profile - activity

2018-08-10 15:43:23 -0500 received badge  Good Answer (source)
2017-09-29 13:11:51 -0500 received badge  Nice Answer (source)
2017-03-22 10:28:59 -0500 received badge  Taxonomist
2015-05-01 16:08:23 -0500 received badge  Great Answer (source)
2015-05-01 16:08:23 -0500 received badge  Guru (source)
2014-01-28 17:24:51 -0500 marked best answer tf_conversions / posemath / pykdl problem with fromMsg

There seems to be a bug in the Python KDL wrapper, leading to unexpected behavior when working with KDL frames. I felt like posting it here since other people might run into it before it's fixed, and they would most probably look for a solution here.

The simple remedy is of course to assign whatever is returned from fromMsg to a variable before usage.

import roslib
roslib.load_manifest('tf')
roslib.load_manifest('tf_conversions')
roslib.load_manifest('geometry_msgs')
import tf_conversions.posemath as pm
import geometry_msgs.msg

pose_msg = geometry_msgs.msg.Pose()
pose_msg.position.x = 100

# Reading .p directly from the temporary frame returned by fromMsg gives garbage:
print pm.fromMsg(pose_msg).p
# [           0,           0,           0]

# Assigning the frame to a variable first works as expected:
pose = pm.fromMsg(pose_msg)
print pose.p
# [         100,           0,           0]
2013-11-19 01:47:45 -0500 received badge  Nice Answer (source)
2013-03-14 04:44:26 -0500 received badge  Necromancer (source)
2013-03-04 19:40:04 -0500 answered a question Manipulating Position of Conveyor Belt relative to platform in Gazebo and applying body wrench

I can't tell you why this doesn't work in this case, but as a general remark: depending on what you're trying to achieve, going the way proposed in the other question could be a much better idea. What you would usually want to do is write some code for your conveyor that monitors contacts and exerts forces on the objects in touch. That way your conveyor stays where it is, which makes a lot more sense in larger scenarios with multiple conveyors. I did similar work years ago on the B21 ICIM Factory. There we had palletized conveyors, which meant I set up a rigid link between the pallet and the conveyor until the pallet ran into the end of the conveyor, where the link was destroyed and the next conveyor took over. So this is not a physical simulation, but it worked to model the factory at the level we needed.
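The palletized-conveyor handover described above can be sketched in plain Python. All names here are hypothetical illustrations, not the original B21 ICIM code:

```python
class Conveyor:
    """A conveyor that rigidly carries one pallet until its end is reached."""

    def __init__(self, name, length, next_conveyor=None):
        self.name = name
        self.length = length
        self.next_conveyor = next_conveyor
        self.pallet_pos = None  # position of the attached pallet, None if empty

    def attach(self, pos=0.0):
        # establish the "rigid link": the pallet now moves with this conveyor
        self.pallet_pos = pos

    def step(self, dx):
        """Advance the attached pallet; hand it over when it runs off the end."""
        if self.pallet_pos is None:
            return
        self.pallet_pos += dx
        if self.pallet_pos >= self.length:
            overshoot = self.pallet_pos - self.length
            self.pallet_pos = None                    # destroy the rigid link
            if self.next_conveyor is not None:
                self.next_conveyor.attach(overshoot)  # next conveyor takes over

# Two conveyors in a row; the pallet travels across the boundary.
c2 = Conveyor("c2", length=2.0)
c1 = Conveyor("c1", length=1.0, next_conveyor=c2)
c1.attach(0.0)
for _ in range(3):
    c1.step(0.4)
    c2.step(0.4)
```

After three steps the pallet has left c1 and been picked up by c2, so the handover never depends on the conveyor itself moving.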

2013-02-25 04:06:32 -0500 received badge  Good Answer (source)
2012-09-21 06:41:49 -0500 marked best answer How to locally allow object/gripper collision in electric ?

A) I ran into some trouble when trying to check robot states for validity in a collision-aware grasping scenario.

It seems like one can publish a collision_object on the /collision_object topic, and the planning scene will then show this object and additionally cut its shape out of the collision_map. (This does not always seem to work, though it could be that rviz is the problem when it comes to displaying the changed state.)

But now, when I run a state validity check similar to what is in the pr2_arm_navigation_tutorials, this seems to reset the collision_map, and the cubes cut out by the object are back in the collision map.

Adding the object as a collision_object in the planning_scene_diff request doesn't seem to help either: I still get collisions in the area of the object with the collision_map rather than with the object, so the object does not seem to be cut out. (If I got collisions with the object instead of the map, I could then additionally disable collisions with it via a CollisionOperation.)

What am I overlooking?

B) On a related topic: how can I get the collision-free IK to work with the desired state?

It seems like the query via planning_scene_diff alters the environment state used by the collision-free IK service (*). Is it also possible to use the service with different environment states concurrently?

It seems like the paradigm of checking state validity locally does not affect the collision-free IK service yet. Is this functionality planned, or already available in the same way as for checking state validity?

(*) I can, for example, allow the gripper to collide with the collision_map globally.

2012-09-11 17:58:17 -0500 received badge  Notable Question (source)
2012-09-11 17:58:17 -0500 received badge  Popular Question (source)
2012-09-11 17:58:17 -0500 received badge  Famous Question (source)
2012-08-17 08:56:00 -0500 received badge  Notable Question (source)
2012-08-17 08:56:00 -0500 received badge  Famous Question (source)
2012-08-17 08:56:00 -0500 received badge  Popular Question (source)
2012-05-01 07:21:59 -0500 received badge  Nice Answer (source)
2012-04-30 20:51:28 -0500 asked a question How can the IMU calibration be manually triggered on the PR2

We often see errors in the robot's orientation w.r.t. the ground plane on the order of 4-5 degrees. Planes such as tables then appear sloped in the sensor data. Driving around violently usually leads to this error going away; however, I would prefer a solution that does not require moving the robot.

Any suggestions?

2012-04-30 19:40:16 -0500 answered a question AMCL odom_alpha parameters

I think you got the interpretation of the values right.

First of all, you should tune the parameters using a bag file, since you cannot do anything repeatable on a live system. Record the sensors and the odometry, and watch what happens in rviz while playing the bag with amcl running.

Look closely at what happens to your PoseArray and see where and how the robot gets lost. If there is a displacement when the robot turns, it's odom_alpha4, etc.

"Too high" values are usually not a big problem, since they just spread out the samples more and you will still have a good one in the sample set. If the values are too low, however, the actual pose of the robot will often no longer be reflected by any of the samples. (Never underestimate the error!)

So when in doubt, raise the values. On the PR2, for example, the robot will still localize quite well when going from values between 0 and 1 to values around 10. It may use more computation time though, since more samples will be used.
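To see what the alphas do, one can sample the standard odometry motion model from Thrun et al.'s Probabilistic Robotics. This is a simplified sketch (amcl's actual implementation differs in details, and the delta and alpha values below are made up):

```python
import math
import random

def sample_odometry(pose, delta, alphas, rng):
    """Sample a new pose given an odometry reading (rot1, trans, rot2).

    alphas = (a1, a2, a3, a4) scale the noise variances, analogous to
    amcl's odom_alpha1..odom_alpha4."""
    x, y, theta = pose
    rot1, trans, rot2 = delta
    a1, a2, a3, a4 = alphas
    # noise on each motion component grows with the motion it is attributed to
    r1 = rot1 - rng.gauss(0, math.sqrt(a1 * rot1**2 + a2 * trans**2))
    t = trans - rng.gauss(0, math.sqrt(a3 * trans**2 + a4 * (rot1**2 + rot2**2)))
    r2 = rot2 - rng.gauss(0, math.sqrt(a1 * rot2**2 + a2 * trans**2))
    return (x + t * math.cos(theta + r1),
            y + t * math.sin(theta + r1),
            theta + r1 + r2)

def x_spread(alphas, n=2000, seed=1):
    """Empirical std. deviation of the sampled x coordinate."""
    rng = random.Random(seed)
    xs = [sample_odometry((0, 0, 0), (0.1, 1.0, 0.1), alphas, rng)[0]
          for _ in range(n)]
    mean = sum(xs) / n
    return math.sqrt(sum((v - mean) ** 2 for v in xs) / n)

low = x_spread((0.05, 0.05, 0.05, 0.05))
high = x_spread((0.5, 0.5, 0.5, 0.5))
# Higher alphas spread the particle cloud more around the commanded motion.
```

This makes the tuning intuition concrete: the larger the alphas, the wider the cloud of hypotheses, so an underestimated error loses the true pose while an overestimated one merely wastes some particles.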

2012-04-28 09:18:54 -0500 received badge  Good Answer (source)
2012-04-28 09:18:54 -0500 received badge  Enlightened (source)
2012-04-23 11:06:06 -0500 received badge  Nice Answer (source)
2012-04-23 09:23:11 -0500 answered a question TF handling frame uncertainty?

My first prototype of a tf extension that allows defining uncertainty in tf frames can be found at http://www.ros.org/wiki/uncertain_tf. So far it supports sampling through the tf chain, where each frame may or may not have a covariance attached. Additionally, one can sample over time (e.g., when the precise time at which sensor data was captured is not known and one wants to know how this might affect the data). One can also calculate a covariance for a set of transforms, either to get a different look at generated sample sets, or simply to know what best to put into the covariance matrices to represent the uncertainty observed in empirical data.

I plan to add the possibility to start from a sample set (e.g., when you have data and want to transform it to another frame, including the uncertainty on the chain), and also to add another set of interfaces that would use the unscented transform instead of sampling (possibly less accurate, but faster).

I stepped away from the idea of directly using child frames as sigma points, since covariances should be fine for defining the uncertainty locally in a frame. The usage of Euler angles may have some mathematical downsides, but it seems to be widely used in practice, such as in tolerance calculations in mechanical engineering. I am currently looking into some alternatives, but most people tell me it's fine as it is for practical purposes.

It would be great to get some feedback, both on the usability of the package and on the documentation.

Regarding fusion of data, I guess I will not get to that too soon, so if somebody has something to bring in and share, I would love to help with the integration. As a first step, we might want to address filtering over time.
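The "covariance for a set of transforms" part can be illustrated in plain Python for the planar case. This is a conceptual sketch, not the actual uncertain_tf interface:

```python
import math
import random

def transform_set_covariance(transforms):
    """Mean and covariance of a set of planar transforms (x, y, theta).

    The angle is averaged via its unit vector so that angles near +/-pi
    do not cancel out, and angular residuals are wrapped into [-pi, pi]."""
    n = len(transforms)
    mx = sum(t[0] for t in transforms) / n
    my = sum(t[1] for t in transforms) / n
    mt = math.atan2(sum(math.sin(t[2]) for t in transforms),
                    sum(math.cos(t[2]) for t in transforms))

    def residual(t):
        dt = (t[2] - mt + math.pi) % (2 * math.pi) - math.pi
        return (t[0] - mx, t[1] - my, dt)

    res = [residual(t) for t in transforms]
    # unbiased sample covariance over the residuals
    cov = [[sum(r[i] * r[j] for r in res) / (n - 1) for j in range(3)]
           for i in range(3)]
    return (mx, my, mt), cov

# Noisy measurements of one transform, e.g. collected empirically:
rng = random.Random(0)
samples = [(rng.gauss(1.0, 0.05), rng.gauss(2.0, 0.05), rng.gauss(0.5, 0.01))
           for _ in range(500)]
mean, cov = transform_set_covariance(samples)
```

The recovered covariance is exactly the kind of matrix one would then attach to a frame to represent the uncertainty seen in the data.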

2012-04-17 07:15:32 -0500 answered a question Combine image and pose in custom msg

There are some downsides to creating your own derived message type, mostly that tools such as rviz or image_view will not work for showing the image. One possibility would be to abuse the seq field in the image header for user data such as the robot id. The frame id may also contain the name of the robot, such as /robot1/kinect_optical_frame.

Whether you send the Pose on another topic or via tf, you can always make sure you publish it with exactly the same time as in the image message's stamp; synchronization should then not be hard later.

If you don't care about viewing the images on the fly, from bag files or such, I would probably go with a new derived message type as proposed in Bence's answer.

It's actually a pity that composed messages are not transparent when subscribing in ROS. It would be great if you could have a topic /detections publishing re_kinect_object_detector/DetectionResult and then a subscriber on /detections/Image that would look for a sensor_msgs/Image. That way, an image topic in rviz could still view the nested image. rostopic echo does a bit of this magic: you can say rostopic echo /a_topic_publishing_posestampeds/pose/position/x and get the x of every message displayed.
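The stamp-based pairing can be sketched without ROS; in a real node, message_filters.TimeSynchronizer does this matching for you. The Msg class below is a stand-in for a stamped ROS message:

```python
class Msg:
    """Minimal stand-in for a stamped ROS message."""

    def __init__(self, stamp, data):
        self.stamp = stamp
        self.data = data

def pair_by_stamp(images, poses):
    """Match images to the poses that were published with identical stamps."""
    poses_by_stamp = {p.stamp: p for p in poses}
    return [(img, poses_by_stamp[img.stamp])
            for img in images if img.stamp in poses_by_stamp]

images = [Msg(1.0, "img_a"), Msg(2.0, "img_b"), Msg(3.0, "img_c")]
poses = [Msg(2.0, "pose_b"), Msg(3.0, "pose_c")]
pairs = pair_by_stamp(images, poses)
# pairs -> [(img_b, pose_b), (img_c, pose_c)]
```

Exact-stamp matching only works if the publisher copies the stamp verbatim, which is precisely why publishing the pose with the image's stamp makes later synchronization trivial.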

2012-03-30 02:03:52 -0500 received badge  Good Answer (source)
2012-03-30 00:51:48 -0500 received badge  Nice Answer (source)
2012-03-29 23:14:47 -0500 answered a question TF handling frame uncertainty?

I have an approach in mind but didn't get to implement it yet. The idea is to store the pose uncertainty, similar to a sigma-point approximation, in tf child frames of the frame that is affected by uncertainty. One could then still use the normal tf listeners etc., which would give the most probable coordinates, as the main frame would represent the mean of the pose distribution. I would then implement a new class inheriting from the tf listener that, when queried, checks for the respective 'uncertainty' child frames along the chain connecting the queried frames and handles the uncertainty propagation. The listener would then return the transform as usual (the mean) and additionally the uncertainty as a covariance matrix or similar. I guess we could do this based on the pose_cov_ops package? Leaving out loop closures, it shouldn't be too hard. TF was not meant to handle circular frame connections, afaik.
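For the planar case, the sampling-based propagation can be sketched as follows. This is a conceptual illustration only; the frame chain and noise values are made up, and real tf transforms are of course 3D:

```python
import math
import random

def compose(a, b):
    """Compose two planar transforms (x, y, theta): first a, then b."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

def sample_chain(chain, rng):
    """chain: list of (mean_transform, stddevs) per frame.

    Draws one noisy instance of every transform along the chain and
    composes them into a single end-to-end sample."""
    result = (0.0, 0.0, 0.0)
    for (mx, my, mt), (sx, sy, st) in chain:
        noisy = (rng.gauss(mx, sx), rng.gauss(my, sy), rng.gauss(mt, st))
        result = compose(result, noisy)
    return result

# Two uncertain frames in a chain, e.g. base -> sensor -> object:
chain = [((1.0, 0.0, 0.1), (0.01, 0.01, 0.02)),
         ((0.5, 0.2, 0.0), (0.02, 0.02, 0.0))]
rng = random.Random(42)
samples = [sample_chain(chain, rng) for _ in range(1000)]
```

The resulting sample cloud approximates the distribution of the end-to-end transform; fitting a covariance to it gives exactly the mean-plus-covariance answer the extended listener would return.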

2012-03-15 06:16:54 -0500 answered a question Selecting joints by mouse click in gazebo

You can do this via rviz: a good place to start could be to look at how it's done for the PR2 teleop via interactive markers: pr2_marker_control

2012-03-15 05:46:52 -0500 commented question Transform PoseWithCovariance

You might want to read into: S. Su and C. Lee, "Manipulation and propagation of uncertainty and verification of applicability of actions in assembly tasks," IEEE Transactions on Systems, Man and Cybernetics, vol. 22, no. 6, pp. 1376-1389, 1992.

2012-03-06 06:32:45 -0500 received badge  Nice Question (source)
2012-02-29 00:20:13 -0500 commented question tf_conversions / posemath / pykdl problem with fromMsg

Turns out it's a general problem with SIP. Whenever we have a struct C { Member m; } and access m from an instantiation as C().m, the instantiated object gets garbage collected while .m still points to the memory held by C++ directly. It seems like there is no way to do this right in SIP.

2012-02-22 01:09:06 -0500 received badge  Good Answer (source)
2012-02-21 23:31:38 -0500 received badge  Nice Answer (source)
2012-02-21 22:16:19 -0500 answered a question How to change ImageConstPtr data?

You should probably copy the data to another sensor_msgs::Image and then modify that, since you are only supposed to read from const data.

2012-02-17 01:54:33 -0500 commented question tf_conversions / posemath / pykdl problem with fromMsg

We can reproduce this error using KDL directly:

import PyKDL as kdl
kdl.Frame(kdl.Rotation.Quaternion(0,0,0,1), kdl.Vector(100,200,300)).p
# Out: [2.56735e-316, 6.93179e-310, 300]
kdl.Frame(kdl.Rotation.Quaternion(0,0,0,1), kdl.Vector(100,200,300)).p[0]
# Out: 0.0  -- which is not 100!