2022-02-26 12:16:36 -0500 | received badge | ● Guru (source) |
2022-02-26 12:16:36 -0500 | received badge | ● Great Answer (source) |
2017-02-07 19:57:16 -0500 | received badge | ● Good Answer (source) |
2017-02-07 19:57:16 -0500 | received badge | ● Enlightened (source) |
2016-03-14 08:28:10 -0500 | received badge | ● Nice Answer (source) |
2013-10-30 11:55:44 -0500 | commented question | How to create a header declaration for the following function Sorry, I meant the "#include <ros/ros.h>" and those |
2013-10-30 10:24:39 -0500 | commented question | How to create a header declaration for the following function Where do you include the ROS headers? |
2013-10-16 16:07:25 -0500 | commented answer | nao control rate You only need to use spin() or spinOnce() if you have subscribed to a topic. The loop_rate.sleep() performs the delay you require in this situation. |
2013-10-10 02:58:37 -0500 | commented answer | extract SensorMessages/LaserScan from SensorMessages/PointCloud2 created in simulation Great! See updated answer. I forgot that you could set standalone as an arg rather than loading a standalone nodelet manager. |
2013-10-08 13:46:50 -0500 | commented answer | Storing kinect depth data from turtlebot in an array. Yes, if you just need the depth at a single coordinate in the depth image, this will work fine. |
2013-10-08 13:45:00 -0500 | received badge | ● Commentator |
2013-10-08 13:45:00 -0500 | commented question | Unused argument problem when launching /etc/ros/openni_head.launch on PR2 Hydro Is there a reason you are not using "openni_launch openni.launch" similar to here: https://github.com/turtlebot/turtlebot/blob/hydro-devel/turtlebot_bringup/launch/3dsensor.launch |
2013-10-08 13:21:46 -0500 | answered a question | Storing kinect depth data from turtlebot in an array. This function should do the trick and accounts for many cases, such as endianness and the raw and rectified depth image. Of course, if you know the endianness of your system (and the image endianness doesn't change) as well as which depth image you are subscribed to (raw or rect) then this can be simplified a lot. Note: the function returns the depth in millimeters. Hopefully this helps. |
2013-10-08 13:04:11 -0500 | answered a question | extract SensorMessages/LaserScan from SensorMessages/PointCloud2 created in simulation I believe you can run pointcloud_to_laserscan on any sensor_msgs/PointCloud2 data as follows (in this case I've matched what you have put above for the vrep): Adjust the "output_frame_id", "min_height", and "max_height" as necessary for the vrep sensor_msgs/PointCloud2 data. This tutorial has information on nodelets and running standalone nodelets, and this turtlebot file shows an example of how to run the pointcloud_to_laserscan nodelet. The code I have above follows these. Hopefully this helps. |
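The launch snippet from that answer was stripped in the export. A sketch of the standalone-nodelet invocation it describes, following the turtlebot 3dsensor.launch pattern; the cloud topic, frame id, and height limits are placeholders to adjust for the vrep data:

```xml
<launch>
  <!-- "standalone" in args runs the nodelet without a separate
       nodelet manager having to be loaded. -->
  <node pkg="nodelet" type="nodelet" name="pointcloud_to_laserscan"
        args="standalone pointcloud_to_laserscan/CloudToScan">
    <!-- The horizontal slice of the cloud that becomes the "laser scan";
         widen these if the vrep points fall outside the range. -->
    <param name="min_height" value="0.10"/>
    <param name="max_height" value="0.30"/>
    <param name="output_frame_id" value="/camera_depth_frame"/>
    <!-- Point the nodelet at the vrep point cloud topic. -->
    <remap from="cloud" to="/vrep/point_cloud"/>
  </node>
</launch>
```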
2013-10-07 16:04:10 -0500 | commented answer | How to retrieve data from Kinect using rviz Agreed, lsusb probably wouldn't have reported that Kinects were connected if that was the issue. Take a look at the bottom half of the answer by dan in this answer: http://answers.ros.org/question/9252/openni_camera-compiling-problem/ The USB permissions might be wrong, or it could be a libusb issue. |
2013-10-07 13:45:23 -0500 | commented question | extract SensorMessages/LaserScan from SensorMessages/PointCloud2 created in simulation you may just need to adjust some parameters. The pointcloud_to_laserscan essentially just takes a horizontal slice from the point cloud and makes that the "laser scan". The points in the point cloud could be out of that height range. |
2013-10-07 13:14:28 -0500 | answered a question | How can I subscribe joint_controller/state from dynamixel In the callback, the following should do the trick. Also note that this: should be this: The dynamixel controller JointState message documentation is here. If you want to access any other member, just call "data.[member name goes here]". Hope this helps. |
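The callback code from that answer was stripped in the export. A minimal sketch, assuming rospy and the dynamixel_msgs/JointState field names (current_pos, goal_pos, error); the SimpleNamespace message below is only a stand-in so the callback can be exercised without a running ROS core:

```python
from types import SimpleNamespace


def state_callback(data):
    # Any other field works the same way: data.<member name goes here>
    return {"pos": data.current_pos,
            "goal": data.goal_pos,
            "err": data.error}


# In a real node this would be wired up with something like:
#   rospy.Subscriber('/joint_controller/state', JointState, state_callback)
msg = SimpleNamespace(current_pos=1.57, goal_pos=1.60, error=-0.03)
print(state_callback(msg))  # {'pos': 1.57, 'goal': 1.6, 'err': -0.03}
```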
2013-10-07 12:59:41 -0500 | answered a question | nao control rate You could use ros::Rate(). The "loop_rate" input is a frequency in Hertz, so you can adjust that as needed. See this tutorial for more details. |
2013-10-06 17:13:34 -0500 | answered a question | how to edit the ~/.bashrc file To edit it: open the file in a text editor and add that line. You could also use (as SaiHV answered) a command that appends the line directly. Make sure to source the file after editing it. |
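A sketch of the two usual options; the setup.bash path depends on your ROS distro (/opt/ros/hydro is an assumption), and a stand-in file is used here instead of the real ~/.bashrc:

```shell
BASHRC=./bashrc_demo   # stand-in for ~/.bashrc

# Option 1: open the file in an editor and add the line by hand, e.g.
#   nano ~/.bashrc

# Option 2: append the line directly:
echo 'source /opt/ros/hydro/setup.bash' >> "$BASHRC"

# Either way, reload the file afterwards so the current shell picks it up:
#   source ~/.bashrc
grep 'setup.bash' "$BASHRC"
```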
2013-10-06 14:28:47 -0500 | commented answer | How to retrieve data from Kinect using rviz I'm out of ideas. Those are all the issues that I've run into with the Kinect. It seems the issue is ROS detecting the Kinect, though. |
2013-10-06 14:11:06 -0500 | commented answer | How to retrieve data from Kinect using rviz Try: "killall XnSensorServer" then "roslaunch openni_launch openni.launch" and see if you get the same error as before. |
2013-10-06 12:35:35 -0500 | commented answer | How to retrieve data from Kinect using rviz Try running the command "lsusb" on the computer that the Kinect is plugged into. There should be three USB devices listed that are associated with the Kinect. |
2013-10-06 11:38:41 -0500 | received badge | ● Teacher (source) |
2013-10-06 11:23:17 -0500 | received badge | ● Editor (source) |
2013-10-06 11:22:31 -0500 | answered a question | Any one knows how to apply the camera calibration matrix to OpenNI If you are using "openni_launch", you can pass your calibration files via the rgb_camera_info_url and depth_camera_info_url arguments, for example: roslaunch openni_launch openni.launch rgb_camera_info_url:=file:///path/to/rgb_calibration.yaml depth_camera_info_url:=file:///path/to/depth_calibration.yaml (the file paths here are placeholders). See the openni_launch documentation: http://wiki.ros.org/openni_launch#Arguments |
2013-10-06 10:42:04 -0500 | commented question | Using two roscreate-pkg packages and being able to read the src file code This ROS answer (http://answers.ros.org/question/33779/how-do-i-link-an-executable-with-library-in-other-package/) covers all the things you have to do. |
2013-10-06 10:29:28 -0500 | commented answer | How to retrieve data from Kinect using rviz When you launch "turtlebot_bringup 3dsensor.launch", does it report any errors? |
2013-10-06 10:25:03 -0500 | commented answer | How to retrieve data from Kinect using rviz I believe in ROS Fuerte a breaker (breaker 1, possibly?) had to be enabled to have the turtlebot base power the Kinect, but I'm not sure if this is the case in Groovy and later. Also, the Kinect doesn't work with USB 3.0, so make sure it is plugged into a USB 2.0 port. |
2013-10-06 10:22:29 -0500 | commented question | Using two roscreate-pkg packages and being able to read the src file code See this ROS answer. You have to export the cflags and lflags in package "a" in the case you mentioned. |
2013-10-06 10:14:56 -0500 | received badge | ● Supporter (source) |
2013-10-06 09:33:59 -0500 | commented question | I use this syntax "roscreate-pkg pkgname depend1 depend2 depend3" ...but the depends are not getting created Do you mean that the depends are not put into the manifest? |