2021-04-06 19:08:20 -0500 | received badge | ● Taxonomist |
2019-03-22 13:18:43 -0500 | received badge | ● Famous Question (source) |
2018-11-30 04:44:00 -0500 | received badge | ● Notable Question (source) |
2018-11-30 04:44:00 -0500 | received badge | ● Popular Question (source) |
2018-11-01 11:41:38 -0500 | commented question | using pybind11 along with catkin: I am not sure that would help; I tried many things for hours, and there would be pages of cmake/python code and logs of …
2018-11-01 09:34:33 -0500 | asked a question | using pybind11 along with catkin: Hi, I am trying to create a python wrapper over a library generated in a catkin package …
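For this kind of build, one commonly used approach is to let pybind11's CMake helpers create the module inside the package's own CMakeLists.txt, then point the output at the devel-space python destination so the module is importable. A minimal sketch, assuming pybind11 is installed with its CMake config available; the target and source names are illustrative:

```cmake
# Hedged sketch of a CMakeLists.txt fragment for a pybind11 module in a
# catkin package. "my_wrapper" and src/my_wrapper.cpp are placeholder names.
find_package(catkin REQUIRED COMPONENTS roscpp)
find_package(pybind11 REQUIRED)

catkin_package()
include_directories(${catkin_INCLUDE_DIRS})

# pybind11_add_module sets up the python extension target (suffix, visibility, ...)
pybind11_add_module(my_wrapper src/my_wrapper.cpp)
target_link_libraries(my_wrapper PRIVATE ${catkin_LIBRARIES})

# catkin does not know this is a python module, so place it in the devel-space
# python destination by hand; otherwise "import my_wrapper" cannot find it.
set_target_properties(my_wrapper PROPERTIES
  LIBRARY_OUTPUT_DIRECTORY ${CATKIN_DEVEL_PREFIX}/${CATKIN_PACKAGE_PYTHON_DESTINATION})
```

After a `catkin_make` and sourcing the devel space, the module should be importable from python like any other package module.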
2018-07-09 18:26:13 -0500 | received badge | ● Famous Question (source) |
2018-03-14 11:36:17 -0500 | received badge | ● Notable Question (source) |
2018-02-26 05:47:46 -0500 | received badge | ● Popular Question (source) |
2018-02-04 02:53:46 -0500 | asked a question | depth_image_proc nodelet not publishing: I would like to register the point cloud of a Kinect with an rgb image. In my se…
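For reference, depth registration of this kind is usually assembled by loading the `depth_image_proc/register` nodelet into a nodelet manager. A hedged launch-file sketch; the node names and remappings are illustrative and must match the actual camera driver's topics:

```xml
<!-- Sketch: loading depth_image_proc/register into a nodelet manager.
     Two common reasons for the output staying silent: nodelets are lazy, so
     nothing is computed until someone subscribes to the output topic; and a
     missing tf between the rgb and depth optical frames stalls registration. -->
<launch>
  <node pkg="nodelet" type="nodelet" name="manager" args="manager"/>
  <node pkg="nodelet" type="nodelet" name="register_depth"
        args="load depth_image_proc/register manager">
    <remap from="rgb/camera_info"   to="/camera/rgb/camera_info"/>
    <remap from="depth/camera_info" to="/camera/depth/camera_info"/>
    <remap from="depth/image_rect"  to="/camera/depth/image_rect"/>
  </node>
</launch>
```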
2017-08-09 06:45:33 -0500 | received badge | ● Famous Question (source) |
2017-08-09 06:45:33 -0500 | received badge | ● Notable Question (source) |
2017-06-26 19:37:36 -0500 | received badge | ● Famous Question (source) |
2017-06-17 22:42:26 -0500 | received badge | ● Famous Question (source) |
2017-05-17 04:22:39 -0500 | received badge | ● Notable Question (source) |
2017-05-17 04:22:39 -0500 | received badge | ● Popular Question (source) |
2017-05-08 20:41:52 -0500 | received badge | ● Famous Question (source) |
2016-12-09 21:06:40 -0500 | received badge | ● Notable Question (source) |
2016-12-09 14:14:02 -0500 | received badge | ● Popular Question (source) |
2016-11-28 02:19:19 -0500 | received badge | ● Notable Question (source) |
2016-11-27 13:19:30 -0500 | received badge | ● Scholar (source) |
2016-11-27 13:19:18 -0500 | received badge | ● Popular Question (source) |
2016-11-27 05:10:15 -0500 | received badge | ● Editor (source) |
2016-11-27 05:09:52 -0500 | asked a question | pcl_ros / tf: how to use latest transform instead of waiting? When calling "pcl_ros::transformPointCloud" I get an error. I know this could be solved by calling "waitForTransform", but in this specific case speed is more important than precision, so I would like to use the latest available transform, even if it is a little off, rather than wait for the more suitable one. I.e. in the example above it would use the tf at 1225.021. Any way to do this?
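What the question asks for corresponds to tf's convention that a timestamp of `Time(0)` means "give me the latest transform available, whatever its stamp". The toy model below is not the tf API (all names are illustrative); it only demonstrates that lookup semantics, reusing the 1225.021 stamp from the question:

```python
import bisect

# Toy stand-in for a tf buffer, to illustrate "time 0 = latest available".
class TransformBuffer:
    def __init__(self):
        self._stamps = []      # sorted timestamps
        self._transforms = {}  # stamp -> transform payload

    def set_transform(self, stamp, transform):
        bisect.insort(self._stamps, stamp)
        self._transforms[stamp] = transform

    def lookup(self, stamp):
        if not self._stamps:
            raise LookupError("no transforms in buffer")
        if stamp == 0:  # tf convention: Time(0) means "latest available"
            return self._transforms[self._stamps[-1]]
        # otherwise use the newest transform at or before the requested stamp
        i = bisect.bisect_right(self._stamps, stamp)
        if i == 0:
            raise LookupError("requested time predates buffer")
        return self._transforms[self._stamps[i - 1]]

buf = TransformBuffer()
buf.set_transform(1224.921, "T_at_1224.921")
buf.set_transform(1225.021, "T_at_1225.021")
print(buf.lookup(0))  # T_at_1225.021, no waiting involved
```

In real tf code the same idea is a `lookupTransform(target, source, Time(0))` call, which returns immediately with the most recent transform instead of blocking until one matching the cloud's stamp arrives.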
2016-11-06 14:34:00 -0500 | asked a question | rospy: how to set a transform to TransformerROS? After initializing a node, I declare a TransformerROS, but if I want to perform a transform between two frames (in my case, I use a turtlebot and I want to transform between odom and base_link), this does not work, with an error. Note: I have the bring-up nodes of turtlebot running, and rostopic informs me odom is broadcast at 50Hz. Also, the odom frame shows up in rviz. Also, I notice that … prints []. What am I missing? The example from here: http://docs.ros.org/jade/api/tf/html/... seems to imply I should set the transform on the transformer first, but according to the error I get, the transformer seems to be searching for the transform itself ("... self.lookupTransform(target_frame, hdr.frame_id, hdr.stamp) ..."). Edit: I tried with an instance of TransformListener rather than TransformerROS, with the same result
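Once a lookup does succeed, tf hands back a translation vector plus an (x, y, z, w) quaternion, and applying them to a point is plain arithmetic. A numpy sketch with made-up values, not the rospy API (`quat_rotate` and `apply_transform` are illustrative helpers):

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (x, y, z, w), tf's ordering."""
    x, y, z, w = q
    u = np.array([x, y, z])
    # Rodrigues-style quaternion rotation: v' = v + 2 u x (u x v + w v)
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def apply_transform(trans, rot, point):
    """Apply a (translation, quaternion) pair, rotation first, as tf does."""
    return quat_rotate(rot, np.asarray(point, float)) + np.asarray(trans, float)

# Hypothetical odom -> base_link transform: 90-degree yaw plus 1 m shift in x.
trans = [1.0, 0.0, 0.0]
rot = [0.0, 0.0, np.sqrt(0.5), np.sqrt(0.5)]  # (x, y, z, w)
print(apply_transform(trans, rot, [1.0, 0.0, 0.0]))  # approx. [1, 1, 0]
```

In rospy this pair would come from a `TransformListener`, which (unlike a bare `TransformerROS`) actually subscribes to /tf and fills its buffer; an empty frame list suggests the transformer never received any transforms.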
2016-10-03 07:13:27 -0500 | received badge | ● Enthusiast |
2016-09-25 14:44:43 -0500 | asked a question | ros indigo / xtion: getting depth from pixel coordinates. Hi, given x,y pixel coordinates in camera/rgb/image_raw, I am trying to extract the related depth from camera/depth_registered/image_raw. I start the xtion via "roslaunch openni2_launch openni2.launch". Running "rosrun rqt_reconfigure rqt_reconfigure" and checking camera/driver confirms depth_registration is active. Also, displaying /camera/depth_registered/points in rviz shows something that looks nice. It seems to me that when using an xtion pro live there is no need for calibration, so I did not do any. It also seems to me that if registration is done, coordinates in the depth image and in the rgb image are the same, so I do the following in python: in the callback for the rgb image (bridge is an instance of CvBridge), I run some code that displays the image and shows the pixel x,y; this confirms x,y are correct, in this case at the center of a colored ball in front of the camera. In the callback for the depth image, I just get "0.0" for the depth, no matter where the ball is in the field of vision. Anything I am doing incorrectly?
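On the "0.0 depth" symptom: with the openni2 driver the registered depth image arrives either as 16UC1 in millimetres (where 0 marks "no reading") or as 32FC1 in metres (where NaN marks "no reading"), so reading a raw pixel needs both a unit conversion and an invalid-value check, and the indexing order is row (y) first. A sketch against a synthetic frame; `depth_at` is an illustrative helper, not a ROS call:

```python
import numpy as np

def depth_at(depth_image, x, y):
    """Return depth in metres at rgb pixel (x, y), or None if no reading."""
    d = depth_image[y, x]  # note: row = y, column = x
    if depth_image.dtype == np.uint16:
        return None if d == 0 else d / 1000.0  # 16UC1: millimetres -> metres
    return None if np.isnan(d) else float(d)   # 32FC1: already metres

depth = np.zeros((480, 640), dtype=np.uint16)  # synthetic 16UC1 frame
depth[240, 320] = 850                          # 0.85 m at the image centre
print(depth_at(depth, 320, 240))  # 0.85
print(depth_at(depth, 0, 0))      # None (no reading at that pixel)
```

Getting 0.0 everywhere is consistent with either swapped x/y indexing or with sampling an invalid region; a sensor of this kind cannot measure depth at very close range, so an always-zero reading is worth checking against the zero/NaN convention rather than treating it as a distance.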
2016-07-27 15:25:16 -0500 | commented question | turtlebot visualization tutorial: missing step? Ok, my very bad. I naively thought an xtion was an xtion; I discovered a live and a live pro are not the same thing, that I had a pro, and therefore no rgb image... Sorry for the bother
2016-07-26 03:09:52 -0500 | received badge | ● Popular Question (source) |
2016-07-22 15:25:33 -0500 | asked a question | turtlebot visualization tutorial: missing step? I am a beginner with the turtlebot, and I am getting through the tutorial. Everything went smoothly until visualization: http://wiki.ros.org/turtlebot/Tutoria... "roslaunching" 3dsensor.launch results in: RViz starts correctly and shows the robot model and the tf tree, but nothing else. Status of LaserScan, DepthCloud, Registered Depth, PointCloud, Registered PointCloud are all "Ok". I checked the topic /camera/depth/image_raw and it was being broadcast at 30Hz. Yet none of this is displayed. The status of "Image" is "Warn" (no image received). Looking into this, I found that this may be due to me not calibrating the xtion (http://wiki.ros.org/camera_calibration). I am a little dubious because the tutorial does not mention this (or did I miss it?). Also, since the depth images are broadcast, it seems strange they are not being displayed. Anything I may be missing? (ubuntu 14.04, ros indigo, kobuki base, PrimeSense)
2015-07-22 16:08:13 -0500 | commented answer | Creating ikfast solution cpp for nao: Thanks a lot for your answer! After installing the latest version of sympy, the error message changed: [...]RuntimeWarning: invalid value encountered in divide axisangle /= angle [...] TypeError: not all arguments converted during string formatting
2015-07-19 08:38:10 -0500 | answered a question | Creating ikfast solution cpp for nao: Hi, I followed a very similar process to also get the left arm solver. When running it, I get (some lines skipped) … I also checked that I have the correct version of sympy installed. Any idea what could be wrong?
2015-07-19 08:28:55 -0500 | received badge | ● Supporter (source) |