
vincent78's profile - activity

2021-04-06 19:08:20 -0500 received badge  Taxonomist
2019-03-22 13:18:43 -0500 received badge  Famous Question (source)
2018-11-30 04:44:00 -0500 received badge  Notable Question (source)
2018-11-30 04:44:00 -0500 received badge  Popular Question (source)
2018-11-01 11:41:38 -0500 commented question using pybind11 along with catkin

I am not sure that would help, I tried many things for hours, and there would be pages of cmake/python code and logs of …

2018-11-01 09:34:33 -0500 asked a question using pybind11 along with catkin

using pybind11 along with catkin Hi, I am trying to create a python wrapper over a library generated in a catkin packag…

2018-07-09 18:26:13 -0500 received badge  Famous Question (source)
2018-03-14 11:36:17 -0500 received badge  Notable Question (source)
2018-02-26 05:47:46 -0500 received badge  Popular Question (source)
2018-02-04 02:53:46 -0500 asked a question depth_image_proc nodelet not publishing

depth_image_proc nodelet not publishing I would like to register the point cloud of a kinect with an rgb image. In my se…

2017-08-09 06:45:33 -0500 received badge  Famous Question (source)
2017-08-09 06:45:33 -0500 received badge  Notable Question (source)
2017-06-26 19:37:36 -0500 received badge  Famous Question (source)
2017-06-17 22:42:26 -0500 received badge  Famous Question (source)
2017-05-17 04:22:39 -0500 received badge  Notable Question (source)
2017-05-17 04:22:39 -0500 received badge  Popular Question (source)
2017-05-08 20:41:52 -0500 received badge  Famous Question (source)
2016-12-09 21:06:40 -0500 received badge  Notable Question (source)
2016-12-09 14:14:02 -0500 received badge  Popular Question (source)
2016-11-28 02:19:19 -0500 received badge  Notable Question (source)
2016-11-27 13:19:30 -0500 received badge  Scholar (source)
2016-11-27 13:19:18 -0500 received badge  Popular Question (source)
2016-11-27 05:10:15 -0500 received badge  Editor (source)
2016-11-27 05:09:52 -0500 asked a question pcl_ros / tf: how to use latest transform instead of waiting?

When calling "pcl_ros::transformPointCloud" I get the error:

[ERROR] [1480244411.676958769, 1224.030000000]: Lookup would require extrapolation into the future.  Requested time 1224.022000000 but the latest data is at time 1224.021000000, when looking up transform from frame [base_laser_link] to frame [odom]

I know this could be solved by calling "waitForTransform", but in this specific case speed matters more than precision, so I would like to use the latest available transform, even if it is slightly off, rather than wait for a more suitable one. I.e. in the example above it would use the tf at 1224.021.

Any way to do this?
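
For reference, a minimal sketch in Python of the behaviour I am after (frame names taken from the error above; the pcl_ros call itself is C++, but the tf pattern is the same). Passing time 0 to tf means "the latest available transform", so the lookup never waits on a future stamp:

import rospy
import tf

rospy.init_node('latest_tf_lookup')
listener = tf.TransformListener()

# wait once at startup until the frames exist, then always take the latest
listener.waitForTransform('odom', 'base_laser_link', rospy.Time(0), rospy.Duration(4.0))

# rospy.Time(0) asks tf for the most recent transform instead of a
# specific stamp, so this returns immediately with the latest data
(trans, rot) = listener.lookupTransform('odom', 'base_laser_link', rospy.Time(0))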

2016-11-06 14:34:00 -0500 asked a question rospy: how to set a transform to TransformerROS?

After initializing a node, I declare a TransformerROS:

transformer = tf.TransformerROS(True, rospy.Duration(10.0))

But when I want to transform a point between two frames (in my case, on a turtlebot, between odom and base_link), this does not work:

transformed_point = transformer.transformPoint("odom", point)

with error:

[...]
File "/opt/ros/indigo/lib/python2.7/dist-packages/tf/listener.py", line 71, in asMatrix
translation,rotation = self.lookupTransform(target_frame, hdr.frame_id, hdr.stamp)
[...]
LookupException: "odom" passed to lookupTransform argument target_frame does not exist

Note: I have the turtlebot bringup nodes running, and rostopic informs me odom is published at 50 Hz. The odom frame also shows up in rviz.

Also, I notice:

 transformer.getFrameStrings()

prints []

What am I missing?

The example here: http://docs.ros.org/jade/api/tf/html/... seems to imply I should first set the transform on the transformer, but according to the error I get, the transformer searches for the transform itself ("... self.lookupTransform(target_frame, hdr.frame_id, hdr.stamp) ...").

Edit: I tried with an instance of TransformListener rather than TransformerROS, with the same result.
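
For reference, a minimal sketch of the listener-based approach (assuming an odom -> base_link transform is being published, as in my setup). A bare TransformerROS only knows about transforms fed to it explicitly via setTransform(), which is why getFrameStrings() prints []; TransformListener additionally subscribes to /tf, but it needs a moment to fill its buffer before the first lookup:

import rospy
import tf
from geometry_msgs.msg import PointStamped

rospy.init_node('transform_point_example')
listener = tf.TransformListener()  # subscribes to /tf, unlike TransformerROS

point = PointStamped()
point.header.frame_id = 'base_link'
point.header.stamp = rospy.Time(0)  # time 0 = use the latest available transform
point.point.x = 1.0

# give the listener time to receive its first transforms
listener.waitForTransform('odom', 'base_link', rospy.Time(0), rospy.Duration(4.0))
transformed_point = listener.transformPoint('odom', point)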

2016-10-03 07:13:27 -0500 received badge  Enthusiast
2016-09-25 14:44:43 -0500 asked a question ros indigo / xtion: getting depth from pixel coordinates

Hi,

Given x, y pixel coordinates in camera/rgb/image_raw, I am trying to extract the corresponding depth from camera/depth_registered/image_raw. I start the xtion via "roslaunch openni2_launch openni2.launch".

Running "rosrun rqt_reconfigure rqt_reconfigure" and checking camera/driver confirms depth_registration is active. Also, displaying the /camera/depth_registered/points in rviz shows something that looks nice.

It seems to me that when using an xtion pro live there is no need for calibration, so I did not do any.

It also seems to me that if registration is done, coordinates in the depth image and in the rgb image are the same, so I do the following in python:

In the callback for the rgb image (bridge is an instance of CvBridge):

cv_image = bridge.imgmsg_to_cv2(image, image.encoding)
# code for getting x,y coordinate of a pixel of interest in the image

I run some code that displays the image and shows the pixel x,y:

cv2.circle(cv_image, (x, y), 2, (255, 0, 0), -1)

This confirms x, y are correct; in this case they are at the center of a colored ball in front of the camera.

In the callback for the depth image:

image_cv = bridge.imgmsg_to_cv2(depth_image, depth_image.encoding)
depth_image = np.squeeze(np.array(image_cv, dtype=np.float32))
depth = float(depth_image[x][y])

But I just get "0.0" for the depth, no matter where the ball is in the field of view.

Anything I am doing incorrectly?
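
For reference, the snippets above consolidated into one self-contained script (a sketch: the pixel coordinates are fixed here for illustration, and the topic name matches the openni2 defaults). One detail worth flagging: numpy arrays are row-major, so pixel (x, y) is addressed as [y][x], whereas the snippet above indexes [x][y]:

import numpy as np
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

bridge = CvBridge()
x, y = 320, 240  # pixel of interest; in practice set from the rgb callback

def depth_callback(msg):
    # convert the ROS depth image to a numpy array in its native encoding
    image_cv = bridge.imgmsg_to_cv2(msg, msg.encoding)
    depth_array = np.squeeze(np.array(image_cv, dtype=np.float32))
    depth = float(depth_array[y][x])  # row index (y) comes first
    rospy.loginfo('depth at (%d, %d): %f', x, y, depth)

rospy.init_node('depth_lookup')
rospy.Subscriber('camera/depth_registered/image_raw', Image, depth_callback)
rospy.spin()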

2016-07-27 15:25:16 -0500 commented question turtlebot visualization tutorial: missing step?

Ok, my very bad. I naively thought an xtion was an xtion, then discovered that a Live and a Pro are not the same thing and that I had a Pro, therefore no rgb image... Sorry for the bother.

2016-07-26 03:09:52 -0500 received badge  Popular Question (source)
2016-07-22 15:25:33 -0500 asked a question turtlebot visualization tutorial: missing step?

I am a beginner with the turtlebot, and I am working through the tutorials. Everything went smoothly until visualization: http://wiki.ros.org/turtlebot/Tutoria...

"roslaunching" 3dsensor.lauch results in :

[ INFO] [1469218497.427352736]: Initializing nodelet with 4 worker threads.
[ INFO] [1469218499.896268578]: Device "1d27/0601@1/32" found.
[ERROR] [1469218500.368594755]: Unsupported color video mode - Resolution: 640x480@30Hz Format: RGB888
[ INFO] [1469218501.848436029]: Starting depth stream.
[ INFO] [1469218502.347021890]: using default calibration URL
[ INFO] [1469218502.347443168]: camera calibration URL: file:///root/.ros/camera_info/rgb_PS1080_PrimeSense.yaml
[ INFO] [1469218502.348959188]: Unable to open camera calibration file [/root/.ros/camera_info/rgb_PS1080_PrimeSense.yaml]
[ WARN] [1469218502.349194447]: Camera calibration file /root/.ros/camera_info/rgb_PS1080_PrimeSense.yaml not found.
[ERROR] [1469218503.085167492]: Rectified topic '/camera/depth/image_rect_raw' requested but camera publishing '/camera/depth/camera_info' is uncalibrated
[ WARN] [1469218512.948623954]: [image_transport] Topics '/camera/depth/image_rect_raw' and '/camera/depth/camera_info' do not appear to be synchronized. In the last 10s:
    Image messages received:      0
    CameraInfo messages received: 300
    Synchronized pairs:           0

RViz starts correctly and shows the robot model and the tf tree, but nothing else. The statuses of Laserscan, DepthCloud, Registered Depth, PointCloud and Registered PointCloud are all "Ok". I checked the topic /camera/depth/image_raw and it was being published at 30 Hz. Yet none of this is displayed.
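
For the rate check mentioned above, a minimal sketch equivalent to running rostopic hz /camera/depth/image_raw:

import rospy
from sensor_msgs.msg import Image

count = [0]

def cb(msg):
    count[0] += 1

rospy.init_node('rate_check')
rospy.Subscriber('/camera/depth/image_raw', Image, cb)
rospy.sleep(10.0)
print('%.1f Hz' % (count[0] / 10.0))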

The status of "Image" is "Warn" (no image received).

Searching around, I found that this may be due to me not calibrating the xtion ( http://wiki.ros.org/camera_calibration ). I am a little dubious, because the tutorial does not mention this (or did I miss it?). Also, since the depth images are being published, it seems strange that they are not displayed.

Anything I may be missing?

(Ubuntu 14.04, ROS Indigo, Kobuki base, PrimeSense)

2015-07-22 16:08:13 -0500 commented answer Creating ikfast solution cpp for nao

Thanks a lot for your answer! After installing the latest version of sympy, the error message changed: [...] RuntimeWarning: invalid value encountered in divide axisangle /= angle [...] TypeError: not all arguments converted during string formatting

2015-07-19 08:38:10 -0500 answered a question Creating ikfast solution cpp for nao

Hi, I followed a very similar process to also get the left arm solver. When running:

python `openrave-config --python-dir`/openravepy/_openravepy_/ikfast.py --robot=/home/vincent/nao/nao.dae --iktype=translationdirection5d --baselink=1 --eelink=35 --savefile=ikfast61_arm_hand_left.cpp

I get (some lines skipped):

[...]
INFO: ikfast translation direction 5d: [j8, j9, j10, j11, j12]
INFO: attempting li/woernle/hiller general ik method
INFO: special structure in equations detected, try to solve through elimination
INFO: [] [j8]
INFO: 'failed to find a variable to solve'
INFO: [] [j9]
INFO: 'failed to find a variable to solve'
INFO: try to solve first two variables pairwise
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/openravepy/_openravepy_/ikfast.py", line 6121, in <module>
[...]
  File "/usr/lib/python2.7/dist-packages/sympy/core/expr.py", line 129, in __float__
    raise ValueError("Symbolic value, can't compute")
ValueError: Symbolic value, can't compute

I also checked that I have the correct version of sympy installed. Any idea what could be wrong?
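
For the version check mentioned above, a quick way to confirm which sympy ikfast will import (which exact version is required depends on the openrave release):

# print the sympy version that ikfast.py will pick up;
# ikfast is known to be sensitive to the installed sympy version
import sympy
print(sympy.__version__)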

2015-07-19 08:28:55 -0500 received badge  Supporter (source)