2021-07-12 18:56:12 -0500 | received badge | ● Guru (source) |
2021-07-12 18:56:12 -0500 | received badge | ● Great Answer (source) |
2021-02-11 08:41:05 -0500 | received badge | ● Good Answer (source) |
2020-09-04 07:18:44 -0500 | received badge | ● Nice Answer (source) |
2018-09-07 05:06:41 -0500 | received badge | ● Nice Answer (source) |
2018-09-06 15:48:15 -0500 | received badge | ● Nice Question (source) |
2017-09-19 16:21:15 -0500 | received badge | ● Nice Answer (source) |
2015-11-23 04:44:33 -0500 | received badge | ● Nice Question (source) |
2015-07-16 15:38:28 -0500 | received badge | ● Good Answer (source) |
2015-06-22 01:14:26 -0500 | marked best answer | Can a sensor timestamp lag affect robot_pose_ekf accuracy? Hello, I'm using visual odometry running on a workstation connected over wifi to the Turtlebot. I synchronized the workstation clock with the Turtlebot by running sudo ntpdate turtlebot_IP in a workstation terminal, but there may still be a time lag. So my question is: if /vo is published with a timestamp lag compared to the odom and imu timestamps, can that affect the accuracy of the robot_pose_ekf algorithm? Related older topic: http://answers.ros.org/question/64441/robot_pose_ekf-wrong-output-with-good-input/ |
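The question above hinges on the fact that robot_pose_ekf matches /vo, /odom and /imu/data by their header timestamps, so a lag in the /vo stamps can indeed degrade the fusion. Below is a minimal, hypothetical C++ sketch of stamping the visual-odometry message with the (ntpdate-synchronised) workstation clock; node, topic and frame names are assumptions, not taken from the original setup.

    // Hypothetical sketch: publish the visual-odometry estimate on /vo with a
    // header stamp taken when the measurement is produced, so robot_pose_ekf
    // can associate it with wheel-odometry and IMU samples stamped on the
    // (ntpdate-synchronised) robot clock.
    #include <ros/ros.h>
    #include <nav_msgs/Odometry.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "vo_publisher");        // node name is illustrative
      ros::NodeHandle nh;
      ros::Publisher vo_pub = nh.advertise<nav_msgs::Odometry>("vo", 10);

      ros::Rate rate(30.0);
      while (ros::ok())
      {
        nav_msgs::Odometry vo;
        vo.header.stamp = ros::Time::now();         // common clock after ntpdate
        vo.header.frame_id = "odom";                // frame names are assumptions
        vo.child_frame_id = "base_footprint";

        // ... fill vo.pose.pose and vo.pose.covariance from the marker tracker ...

        vo_pub.publish(vo);
        rate.sleep();
      }
      return 0;
    }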
2015-05-29 13:16:40 -0500 | marked best answer | Long-distance marker tracking with a Kinect I would like to use the Kinect to detect my Turtlebot and get better odometry. The Kinect is located between two and five meters from the Turtlebot. I have tried many packages such as ar_pose, visp_auto_tracker and ar_kinect. Right now the best is still ar_pose, but only up to one or at most two meters. With ar_pose something strange happens: it works better with a small pattern (80 mm) than with a large one (200 mm), even when I set the parameter marker_width = 200. Another issue: I cannot use ar_pose with the Kinect image_mode set to 1 (SXGA_15Hz: 1280x1024@15Hz), which is a pity because the image quality is much better. Does anyone have tips to improve the tracking? EDIT: Here is the part of my launch file related to ar_pose. I printed a larger Hiro pattern fitted to an A4 page; it is square as well. |
2015-05-29 13:12:35 -0500 | marked best answer | robot_pose_ekf wrong output with "good input" I cannot understand why I get a wrong pose from robot_pose_ekf. I'm using a Kinect on a table, connected to the workstation, to find the Turtlebot's position and orientation and publish the result on /vo. The Kinect tracks a pattern marker and finds a pose. I want to use odometry and the IMU to get a better result, and to still have a pose when the Kinect cannot see the marker. My issue is that /vo, /odom and /imu/data have similar orientations, but the EKF finds something very different and I can't understand why. Here is my tf tree to show how I work, plus sample output from /vo, /odom, /imu/data and robot_pose_ekf/odom. And here is how I construct the odometry msg: (more) |
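Since the entry above truncates before showing how the odometry message is built, here is a generic, hedged C++ sketch of constructing a /vo message from a marker pose detected by a fixed Kinect: the pose is transformed into the wheel-odometry frame via tf and given an explicit covariance before publishing. This is not the author's code; frame names and covariance values are assumptions.

    // Hypothetical sketch: build the /vo message from a marker pose detected in
    // the (fixed) Kinect frame. robot_pose_ekf fuses /vo, /odom and /imu/data,
    // so the pose must be expressed consistently and carry a sensible covariance.
    #include <ros/ros.h>
    #include <tf/transform_listener.h>
    #include <nav_msgs/Odometry.h>
    #include <geometry_msgs/PoseStamped.h>

    void publishVo(const geometry_msgs::PoseStamped& marker_in_camera,
                   tf::TransformListener& listener,
                   ros::Publisher& vo_pub)
    {
      // Express the marker pose in the same fixed frame used by wheel odometry.
      geometry_msgs::PoseStamped marker_in_odom;
      try
      {
        listener.transformPose("odom", marker_in_camera, marker_in_odom);
      }
      catch (tf::TransformException& ex)
      {
        ROS_WARN("%s", ex.what());
        return;
      }

      nav_msgs::Odometry vo;
      vo.header = marker_in_odom.header;
      vo.child_frame_id = "base_footprint";
      vo.pose.pose = marker_in_odom.pose;

      // Diagonal covariance: small uncertainty on x, y and yaw; very large on
      // the unobserved dimensions so the filter effectively ignores them.
      for (int i = 0; i < 6; ++i)
        vo.pose.covariance[i * 6 + i] = (i == 0 || i == 1 || i == 5) ? 0.01 : 99999.0;

      vo_pub.publish(vo);
    }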
2015-05-01 07:00:26 -0500 | received badge | ● Popular Question (source) |
2015-05-01 07:00:26 -0500 | received badge | ● Notable Question (source) |
2014-09-14 12:50:03 -0500 | received badge | ● Famous Question (source) |
2014-08-17 20:46:26 -0500 | received badge | ● Enlightened (source) |
2014-07-21 06:49:09 -0500 | received badge | ● Famous Question (source) |
2014-07-16 01:32:39 -0500 | received badge | ● Taxonomist |
2014-05-07 00:06:23 -0500 | received badge | ● Famous Question (source) |
2014-04-18 10:52:11 -0500 | received badge | ● Famous Question (source) |
2014-01-31 13:47:15 -0500 | received badge | ● Famous Question (source) |
2014-01-28 17:31:13 -0500 | marked best answer | Turtlebot sensors: defective or fine? Hello, I have a Turtlebot (Create base) and it is very difficult to localize because the sensors seem pretty bad. I would like to know whether my Turtlebot and my data are fine, or whether there is a problem somewhere. First, I cannot use JointState directly because the robot always publishes zero values. This works in the Gazebo simulation but not on the real robot. Are there parameters to set? Secondly, odom reports a zero angular velocity when the robot is moving slowly, even though the wheels are turning (no slipping!). The encoders are on the wheels, so it should report a velocity even with poor encoders. Are my robot's encoders defective? Thirdly, the IMU drifts by itself when the robot is stationary. I accept that this is common for an IMU position estimate, since it is an integration, but the velocity is also non-zero and the results are bad. I have calibrated the IMU and odom. I understand these sensors are not high quality, but what I see is too large for things to work properly. I have seen a lot of great Turtlebot videos, so I wonder whether these errors are normal or not. |
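A small diagnostic node can help separate "the Create base reports nothing" from "the values are lost further along the pipeline". The following hedged C++ sketch simply prints whatever arrives on /joint_states and /odom while the robot is driven slowly; topic names are the usual Turtlebot defaults and may differ on a given setup.

    // Hypothetical diagnostic sketch: print what the base actually reports on
    // /joint_states and /odom, to check whether encoder values ever leave the
    // driver.
    #include <ros/ros.h>
    #include <sensor_msgs/JointState.h>
    #include <nav_msgs/Odometry.h>

    void jointCb(const sensor_msgs::JointState::ConstPtr& msg)
    {
      for (size_t i = 0; i < msg->name.size(); ++i)
        ROS_INFO("joint %s: pos=%.4f vel=%.4f",
                 msg->name[i].c_str(),
                 i < msg->position.size() ? msg->position[i] : 0.0,
                 i < msg->velocity.size() ? msg->velocity[i] : 0.0);
    }

    void odomCb(const nav_msgs::Odometry::ConstPtr& msg)
    {
      ROS_INFO("odom: v=%.4f m/s  w=%.4f rad/s",
               msg->twist.twist.linear.x, msg->twist.twist.angular.z);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "base_sensor_check");
      ros::NodeHandle nh;
      ros::Subscriber js = nh.subscribe("joint_states", 10, jointCb);
      ros::Subscriber od = nh.subscribe("odom", 10, odomCb);
      ros::spin();
      return 0;
    }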
2014-01-28 17:30:54 -0500 | marked best answer | libtf.so: cannot open shared object file I was using a package I wrote on Fuerte and moved it to the Turtlebot, which runs Electric. It does not work on Electric; I get an error message when I execute roslaunch mypackage mylaunchfile.launch. rosmake completes without errors and I do have libtf.so in the directory /geometry/tf/lib. If I run: ldd bin/vision_odometrie libtf.so => /opt/ros/electric/stacks/geometry/tf/lib/libtf.so (0x00007f356cd0f000) I added this line to my .bashrc, but it does not solve the problem: export LD_LIBRARY_PATH="/opt/electric/stacks/geometry/tf/lib.libtf.so" (I cannot do a full upgrade because the Linux version is old: Lucid Lynx 10.04 LTS.) |
2014-01-28 17:30:38 -0500 | marked best answer | Delay in tf broadcaster robot_pose_ekf I'm trying to run the Turtlebot using a fixed Kinect in the room, but I have some tf issues in rviz. I cannot visualize the robot's tf frames beyond base_footprint, yet I can see the link with: $ rosrun tf view_frames $ rosrun tf tf_echo /odom /base_footprint I can also see some delay issues with tf_monitor: RESULTS: for all Frames (more) |
2014-01-28 17:27:04 -0500 | marked best answer | Improving octomap quality with parameters Hi, I have built an octomap with octomap_server. It is a map containing moving objects, so it needs to be updated frequently. Unfortunately, some octomap points never get removed, so I am looking for better parameters in the launch file. I have set latch to true, but nothing changes. I want to modify sensor_model/hit, sensor_model/miss, sensor_model/min and sensor_model/max. I have read the octomap_server documentation, but I don't understand the parameter descriptions. What does each of them mean, and how should I modify these parameters to keep fewer points? |
2014-01-28 17:26:56 -0500 | marked best answer | move_arm: how to make it ignore its own arm in collision checking I know my title is not clear, but it's difficult to explain. I use a wizard-generated description package with a robot arm (arm_navigation package). I can send a goal (Cartesian or joint-space) and the robot follows a trajectory. I also use a Kinect to generate a point cloud and then an octomap collision map. My problem is that the Kinect detects the robot itself as an obstacle, so the robot stops immediately. I can see collision_map cubes inside the robot. How can I avoid this? Can the environment server understand that these points belong to the robot? I think something must exist for this, because it's a very common problem in object grasping, for example. I don't know if this is clearer; if it isn't, please ask in a comment. Thanks |
2014-01-28 17:26:38 -0500 | marked best answer | Creating a collision map from the Kinect with nodelets Hi, I use a Kinect camera to create a collision map, but currently it is too slow and uses 100% CPU. So I want to use nodelets to avoid useless copies of the point cloud and speed up the process. I understand how to use nodelets with openni_launch thanks to a good explanation here: Get started with ROS Nodelet. My question is: how do I create a collision map with nodelets? Currently I use octomap_server to build the map, but it is not a nodelet. Is it necessary to modify the octomap_server package to port its nodes to nodelets? Or is it possible to run this node as a nodelet just by changing the launch file? Or is there another package that builds a collision map with nodelets? Thanks EDIT: I have tried on two computers (checked with System Monitor). The first runs with two CPUs at 100%: Ubuntu 10.04 / ROS Electric / 3.9 GiB memory / Intel Core(TM)2 6600 @ 2.40 GHz (two cores). The second works well, with only one of its eight CPUs at 100%: Ubuntu 12.04 / ROS Electric / 15.7 GiB memory / Intel(R) Xeon(R) W3565 @ 3.20 GHz (eight CPUs). EDIT 2: New question on how to create this nodelet: Build a octomap_server nodelet |
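For reference, porting a node to a nodelet is mostly boilerplate: a class deriving from nodelet::Nodelet, an onInit() that creates the subscribers, and a pluginlib export. The hedged C++ sketch below shows that skeleton for a point-cloud consumer running inside the same nodelet manager as openni_launch; it is not octomap_server's actual code, and the package, class and topic names are made up for illustration.

    // Hypothetical sketch of wrapping existing node code as a nodelet
    // (electric-era pluginlib macro). Running it in the same nodelet manager
    // as the OpenNI driver lets the point cloud be passed by pointer instead
    // of being serialised between processes.
    #include <nodelet/nodelet.h>
    #include <pluginlib/class_list_macros.h>
    #include <ros/ros.h>
    #include <sensor_msgs/PointCloud2.h>

    namespace my_mapping
    {
    class CollisionMapNodelet : public nodelet::Nodelet
    {
    public:
      virtual void onInit()
      {
        ros::NodeHandle& nh = getNodeHandle();
        sub_ = nh.subscribe("camera/depth/points", 1,
                            &CollisionMapNodelet::cloudCb, this);
      }

    private:
      void cloudCb(const sensor_msgs::PointCloud2::ConstPtr& cloud)
      {
        // ... insert the cloud into the octree / collision map here ...
        NODELET_INFO("received cloud with %u points", cloud->width * cloud->height);
      }

      ros::Subscriber sub_;
    };
    }  // namespace my_mapping

    // Electric-era export macro (newer ROS versions use PLUGINLIB_EXPORT_CLASS).
    PLUGINLIB_DECLARE_CLASS(my_mapping, CollisionMapNodelet,
                            my_mapping::CollisionMapNodelet, nodelet::Nodelet)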
2014-01-28 17:26:29 -0500 | marked best answer | Adding mobile objects to the collision_map for arm_navigation Hi everybody, Situation: I use the arm_navigation package with ROS Electric to control an arm robot with seven degrees of freedom. I can send a joint-space or Cartesian goal; the robot moves well and avoids the objects defined in the URDF file, such as the table the robot stands on. I use "robot_name"_arm_navigation.launch and a robot simulation that receives joint states and publishes the URDF tf with the correct positions. Problem: I would like to use the Kinect to add the human body to the scene, and in particular to the collision map, so the robot can avoid the body. I want to update it continuously, modelling the body as several cylinders that move in "real time". I tried three solutions. Solution 1: I followed the tutorial "Adding known objects to the collision environment" to add cylinders from the Kinect data. In this code I simply publish 10 cylinders on the collision_object topic to model the body. Result 1: I managed to add all the cylinders correctly, one per body part (an arm, for example, is modelled by one cylinder). The robot avoids the body, but only at the start of its movement: during the trajectory the cylinders do not move, so collisions are not checked. Question 1: How do I update the collision map so that the cylinders move? Solution 2: Similar to the tutorial "Adding Virtual Objects to the Planning Scene"; the only difference is that I use a service instead of a topic to add the objects, and I see no other difference from Solution 1. Result 2: With this solution the cylinders can move at any time, so the problem of Solution 1 is gone, but there is a bigger one: it only works when I choose a coordinate frame already present in the URDF, and fails with a frame generated from the Kinect such as right_arm_1. To explain: when I add a cylinder, I add it in a coordinate frame (tf), and it is easiest to add the cylinder directly in the frame of the body part. With right_arm_1 or openni_depth_frame as the frame, the cylinders are not attached to that frame, whereas it works perfectly with frames created from the URDF file. I added a link between a URDF frame and openni_depth_frame, but it behaves as if this link did not exist. Question 2: Why does the environment_server not understand the new Kinect tf frames, and how can I fix this? Solution 3: We thought of adding the human body directly to the URDF and modifying its tf directly by publishing on /tf. Result 3: It does not work because the simulation publishes all URDF tf all the time via /robot_st_pub; we can only modify joint states, and computing the joint states of the body is more complicated. Question 3: Is it possible to use the URDF to add and move mobile objects? Thank you for your help; if I should add details, ask me. EDIT 1: Hi, Thanks for ... (more) |
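Regarding Solution 1 above: the electric-era "Adding known objects" tutorial uses the collision_object topic, and republishing a CollisionObject with the same id and the ADD operation is intended to replace the previous instance, which is one way to keep a body-part cylinder moving. A hedged C++ sketch is below; the message and field names follow arm_navigation_msgs as used in that tutorial, while the frame, topic and dimension values are assumptions.

    // Hypothetical sketch: republish a body-part cylinder on collision_object
    // with a fixed id so each new Kinect pose replaces the previous cylinder.
    #include <ros/ros.h>
    #include <arm_navigation_msgs/CollisionObject.h>
    #include <arm_navigation_msgs/Shape.h>
    #include <geometry_msgs/Pose.h>

    void publishArmCylinder(ros::Publisher& pub, const geometry_msgs::Pose& pose)
    {
      arm_navigation_msgs::CollisionObject obj;
      obj.id = "right_arm_cylinder";                       // same id each time
      obj.operation.operation =
          arm_navigation_msgs::CollisionObjectOperation::ADD;
      obj.header.frame_id = "openni_depth_frame";          // assumed Kinect frame
      obj.header.stamp = ros::Time::now();

      arm_navigation_msgs::Shape cylinder;
      cylinder.type = arm_navigation_msgs::Shape::CYLINDER;
      cylinder.dimensions.resize(2);
      cylinder.dimensions[0] = 0.06;                       // radius [m], assumed
      cylinder.dimensions[1] = 0.35;                       // length [m], assumed

      obj.shapes.push_back(cylinder);
      obj.poses.push_back(pose);
      pub.publish(obj);
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "body_collision_publisher");
      ros::NodeHandle nh;
      ros::Publisher pub =
          nh.advertise<arm_navigation_msgs::CollisionObject>("collision_object", 10);
      ros::Duration(1.0).sleep();          // let the environment server subscribe

      // In a real node this would run from the Kinect tracking callback with
      // the latest pose; a fixed pose is used here only to keep the sketch
      // self-contained.
      geometry_msgs::Pose pose;
      pose.position.x = 1.0;
      pose.orientation.w = 1.0;
      publishArmCylinder(pub, pose);

      ros::spinOnce();
      return 0;
    }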
2014-01-28 17:26:07 -0500 | marked best answer | How to run the planning description wizard with a COLLADA file? Hi, I have a COLLADA file for my robot and no URDF. I want to use the planning_description_wizard so I can use arm_navigation afterwards. I know I cannot convert my COLLADA file into a URDF, but is it possible to use the COLLADA file for this? I want to follow this tutorial: http://www.ros.org/wiki/arm_navigation/Tutorials/tools/Planning%20Description%20Configuration%20Wizard But this command does not work with a COLLADA file: $ roslaunch planning_environment planning_description_configuration_wizard.launch urdf_package:=<your urdf package name> urdf_path:=<relative path> thanks, |
2013-12-11 23:44:49 -0500 | received badge | ● Notable Question (source) |
2013-11-18 19:27:28 -0500 | received badge | ● Notable Question (source) |
2013-10-17 21:49:54 -0500 | received badge | ● Notable Question (source) |