
jep31's profile - activity

2021-07-12 18:56:12 -0500 received badge  Guru (source)
2021-07-12 18:56:12 -0500 received badge  Great Answer (source)
2021-02-11 08:41:05 -0500 received badge  Good Answer (source)
2020-09-04 07:18:44 -0500 received badge  Nice Answer (source)
2018-09-07 05:06:41 -0500 received badge  Nice Answer (source)
2018-09-06 15:48:15 -0500 received badge  Nice Question (source)
2017-09-19 16:21:15 -0500 received badge  Nice Answer (source)
2015-11-23 04:44:33 -0500 received badge  Nice Question (source)
2015-07-16 15:38:28 -0500 received badge  Good Answer (source)
2015-06-22 01:14:26 -0500 marked best answer Can sensor timestamp offsets affect robot_pose_ekf accuracy?

Hello,

I'm running visual odometry on a workstation connected over WiFi to the Turtlebot. I synchronized the workstation clock to the Turtlebot's with sudo ntpdate turtlebot_IP, run in a workstation terminal. But there may still be a time lag.

So my question is: if /vo is published with a timestamp lag relative to the odom and imu timestamps, can that degrade the accuracy of the robot_pose_ekf algorithm?

related old topic: http://answers.ros.org/question/64441/robot_pose_ekf-wrong-output-with-good-input/
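One quick way to check for residual offset is to compare each message's header stamp against the local clock on the machine running robot_pose_ekf. Below is a minimal sketch of such a checker — my own illustration, not code from this question; the topic names /vo, /odom, and /imu/data are taken from the setup described above:

    // stamp_lag_check.cpp: print how far each sensor stamp lags the local clock.
    // A large or drifting value suggests the NTP sync left a residual offset.
    #include <ros/ros.h>
    #include <nav_msgs/Odometry.h>
    #include <sensor_msgs/Imu.h>

    void voCb(const nav_msgs::Odometry::ConstPtr& m)
    {
      ROS_INFO("/vo lag: %.4f s", (ros::Time::now() - m->header.stamp).toSec());
    }

    void odomCb(const nav_msgs::Odometry::ConstPtr& m)
    {
      ROS_INFO("/odom lag: %.4f s", (ros::Time::now() - m->header.stamp).toSec());
    }

    void imuCb(const sensor_msgs::Imu::ConstPtr& m)
    {
      ROS_INFO("/imu/data lag: %.4f s", (ros::Time::now() - m->header.stamp).toSec());
    }

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "stamp_lag_check");
      ros::NodeHandle nh;
      ros::Subscriber s1 = nh.subscribe("vo", 10, voCb);
      ros::Subscriber s2 = nh.subscribe("odom", 10, odomCb);
      ros::Subscriber s3 = nh.subscribe("imu/data", 10, imuCb);
      ros::spin();
      return 0;
    }

Running it on the workstation and on the Turtlebot should show whether the /vo stamps are systematically offset from the odometry and IMU stamps.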

2015-05-29 13:16:40 -0500 marked best answer Long-distance marker tracking with Kinect

I would like to use a Kinect to detect my Turtlebot and get better odometry. The Kinect is located two to five meters from the Turtlebot. I have tried many packages, such as ar_pose, visp_auto_tracker, and ar_kinect. So far the best is still ar_pose, but only up to one meter, two at most.

With ar_pose, something strange happens: tracking works better with a small pattern (80 mm) than with a large one (200 mm), even when I set the parameter marker_width = 200.

Another issue: I cannot use ar_pose with the Kinect image_mode set to 1 (SXGA_15Hz: 1280x1024 @ 15 Hz). That's too bad, because the image quality is much better.

Does anyone have tips to improve the tracking range?

EDIT: Here is a part of my launch file related of ar_pose.

 <!-- Reading Pattern -->
  <node pkg="ar_pose" type="ar_single" respawn="true" name="pattern_reader">
    <remap from="/usb_cam/image_raw" to="/openni/rgb/image_color" />
    <remap from="/usb_cam/camera_info" to="/openni/rgb/camera_info" />
    <param name="marker_pattern" type="string" value="$(find ar_pose)/data/patt.hiro"/>
    <param name="marker_width" type="double" value="200"/>
    <param name="threshold" type="double" value="100"/>
  </node>

I printed a larger Hiro pattern scaled to fit an A4 sheet; it is a proper square as well.

2015-05-29 13:12:35 -0500 marked best answer robot_pose_ekf wrong output with "good input"

I'm not able to understand why I get a wrong pose from robot_pose_ekf. I'm using a Kinect on a table, connected to the workstation, to find the Turtlebot's position and orientation and publish the result on /vo. The Kinect tracks a pattern marker to find a pose. I want to fuse odometry and IMU to get a better result, and to still have a pose when the Kinect can't see the marker.

My issue is that /vo, /odom, and /imu/data have similar orientations, but the EKF outputs something very different and I can't understand why.

Here is my tf tree to show how everything is connected: [tf tree image]

/vo:

header:
  seq: 479
  stamp:
    secs: 1370526479
    nsecs: 1198623
  frame_id: odom
child_frame_id: base_footprint
pose:
  pose:
    position:
      x: 0.109848357611
      y: 0.0184851177529
      z: 0.00108856573807
    orientation:
      x: -0.0155068317083
      y: 0.00236792210117
      z: -0.726064768505
      w: 0.687447367477
  covariance: [0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 10000.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 10000.1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1e-06]

/odom:

header: 
  seq: 5144
  stamp: 
    secs: 1370526478
    nsecs: 985132932
  frame_id: odom
child_frame_id: base_footprint
pose: 
  pose: 
    position: 
      x: 0.122778090562
      y: 0.0180749352772
      z: 0.0
    orientation: 
      x: 0.0
      y: 0.0
      z: -0.754709580223
      w: 0.656059028991
  covariance: [0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.001, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1000.0]

/imu/data:

header: 
  seq: 5154
  stamp: 
    secs: 1370526479
    nsecs: 318391084
  frame_id: gyro_link
orientation: 
  x: 0.0
  y: 0.0
  z: -0.641532116852
  w: 0.767096175878
orientation_covariance: [1000000.0, 0.0, 0.0, 0.0, 1000000.0, 0.0, 0.0, 0.0, 1e-06]

and robot_pose_ekf/odom:

header: 
  seq: 3340
  stamp: 
    secs: 1370526479
    nsecs: 618319034
  frame_id: odom
pose: 
  pose: 
    position: 
      x: 0.0979321075308
      y: -0.0259189421082
      z: -0.00496891294622
    orientation: 
      x: 3.68249858201e-06
      y: -4.82461122176e-05
      z: 0.970055184809
      w: -0.242884614753
  covariance: [5.493486503712613e-07, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.493486503712613e-07, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0987278074026108e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 10.877878440724317, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 10.877878440724317, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 5.493333772345698e-10]

And here is how I construct the odometry message:

 void readAndSendVo()
  {
    tf::TransformListener lisVo;
    tf::StampedTransform transformVo;
    nav_msgs::Odometry vo;
    std ...
(more)
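The post is truncated there. For illustration only, here is a hedged sketch of how such a function might continue; the publisher, frame names, and covariance values below are assumptions patterned on the message dumps above, not the author's actual code. One pitfall the sketch avoids: a tf::TransformListener constructed fresh inside the function has an empty buffer, so the listener is created once outside and passed in.

    // Hypothetical continuation (my sketch, not the original author's code).
    #include <ros/ros.h>
    #include <tf/transform_listener.h>
    #include <nav_msgs/Odometry.h>

    void readAndSendVo(tf::TransformListener& lisVo, ros::Publisher& voPub)
    {
      tf::StampedTransform transformVo;
      nav_msgs::Odometry vo;
      try
      {
        // Marker-derived pose of the robot base in the odom frame.
        lisVo.lookupTransform("odom", "base_footprint", ros::Time(0), transformVo);
      }
      catch (tf::TransformException& ex)
      {
        ROS_WARN("%s", ex.what());
        return;
      }

      // Use the transform's own stamp so /vo stays consistent with /odom and /imu.
      vo.header.stamp = transformVo.stamp_;
      vo.header.frame_id = "odom";
      vo.child_frame_id = "base_footprint";

      vo.pose.pose.position.x = transformVo.getOrigin().x();
      vo.pose.pose.position.y = transformVo.getOrigin().y();
      vo.pose.pose.position.z = transformVo.getOrigin().z();
      tf::quaternionTFToMsg(transformVo.getRotation(), vo.pose.pose.orientation);

      // Covariance mirroring the /vo dump above: tight on x/y/z and yaw,
      // loose on roll/pitch.
      for (int i = 0; i < 36; ++i) vo.pose.covariance[i] = 0.0;
      vo.pose.covariance[0] = vo.pose.covariance[7] = vo.pose.covariance[14] = 0.001;
      vo.pose.covariance[21] = vo.pose.covariance[28] = 10000.1;
      vo.pose.covariance[35] = 1e-06;

      voPub.publish(vo);
    }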
2015-05-01 07:00:26 -0500 received badge  Popular Question (source)
2015-05-01 07:00:26 -0500 received badge  Notable Question (source)
2014-09-14 12:50:03 -0500 received badge  Famous Question (source)
2014-08-17 20:46:26 -0500 received badge  Enlightened (source)
2014-07-21 06:49:09 -0500 received badge  Famous Question (source)
2014-07-16 01:32:39 -0500 received badge  Taxonomist
2014-05-07 00:06:23 -0500 received badge  Famous Question (source)
2014-04-18 10:52:11 -0500 received badge  Famous Question (source)
2014-01-31 13:47:15 -0500 received badge  Famous Question (source)
2014-01-28 17:31:13 -0500 marked best answer Turtlebot sensors: defective or fine?

Hello,

I have a Turtlebot (Create base) and it's very difficult to localize because the sensors seem pretty bad. I would like to know whether my Turtlebot and its data are fine, or whether there is a problem somewhere.

First, I cannot use JointState directly because the robot always publishes zero values. This works in the Gazebo simulation but not on the real robot. Are there some parameters to set?

Second, /odom reports zero angular velocity when my robot moves slowly, even though the wheels are turning (no slipping!). The encoders are on the wheels, so they should report a velocity even if they are low quality. Are my robot's encoders defective?

Third, the IMU output drifts while the robot is stationary. I understand that drift in position is common for an IMU, since position comes from integration, but I also get a non-zero velocity and bad results because of it.

I have calibrated the IMU and the odometry. I can accept that the sensors are not high quality, but the errors I see are too large for things to work properly. I have seen a lot of great videos with the Turtlebot, so I wonder whether these errors are normal or not.

2014-01-28 17:30:54 -0500 marked best answer libtf.so cannot open shared object file

I was using a package I wrote myself on Fuerte, and I moved this package to the Turtlebot, which is on Electric. It doesn't work on Electric: I get an error message when I run roslaunch mypackage mylaunchfile.launch:

error while loading shared libraries: libtf.so: cannot open shared object file: No such file or directory 
process[vision_odometrie-1]: started with pid [30562]
[vision_odometrie-1] process has died [pid 30562, exit code 127].
log files: /home/turtlebot/.ros/log/2827a526-d4ef-11e2-a445-047d7b6f6b57/vision_odometrie-1*.log

rosmake works without errors, and I do have libtf.so in the directory /geometry/tf/lib.

If I run ldd bin/vision_odometrie, I get:

libtf.so => /opt/ros/electric/stacks/geometry/tf/lib/libtf.so (0x00007f356cd0f000)

I added this line to my .bashrc, but that doesn't solve the problem:

export LD_LIBRARY_PATH="/opt/electric/stacks/geometry/tf/lib.libtf.so"

( I cannot do all the update because the linux version is old: Lucid Lynx 10.04LTS )
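As an aside, the export line above has two typos: it points at a mangled file name rather than a library directory, and the prefix is missing the /ros/ segment. Assuming the standard Electric layout shown in the ldd output above, a corrected form would be:

    export LD_LIBRARY_PATH="/opt/ros/electric/stacks/geometry/tf/lib:$LD_LIBRARY_PATH"

Appending the previous $LD_LIBRARY_PATH also preserves the other library directories the ROS environment setup already added, instead of overwriting them.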

2014-01-28 17:30:38 -0500 marked best answer Delay in tf broadcaster robot_pose_ekf

I'm trying to run the Turtlebot using a fixed Kinect in the room, but I have some tf issues in rviz. I cannot visualize the robot's tf frames below base_footprint, but I can see the link with:

$rosrun tf view_frames

[tf frames diagram]

$rosrun tf tf_echo /odom /base_footprint

At time 1369922387.939
- Translation: [0.000, 0.000, 0.000]
- Rotation: in Quaternion [0.000, 0.000, 0.000, 1.000]
            in RPY [0.000, -0.000, 0.000]
At time 1369922388.974
- Translation: [0.023, 0.001, 0.000]
- Rotation: in Quaternion [0.000, 0.000, 0.070, 0.998]
            in RPY [0.000, -0.000, 0.140]
At time 1369922389.940
- Translation: [0.023, 0.001, 0.000]
- Rotation: in Quaternion [0.000, 0.000, 0.070, 0.998]
            in RPY [0.000, -0.000, 0.140]

I can also see some delay issues with tf_monitor:

RESULTS: for all Frames

Frames:
Frame: /ar_marker published by /pattern_reader Average Delay: 0.00499706 Max Delay: 0.0056474
Frame: /base_footprint published by /robot_pose_ekf Average Delay: 3415.33 Max Delay: 3415.36
Frame: /base_link published by /robot_state_publisher Average Delay: -0.486476 Max Delay: 0
Frame: /base_marker published by /baseMarker Average Delay: 0.000302851 Max Delay: 0.00233504
Frame: /camera_depth_frame published by /robot_state_publisher Average Delay: -0.486468 Max Delay: 0
Frame: /camera_depth_optical_frame published by /robot_state_publisher Average Delay: -0.486467 Max Delay: 0
Frame: /camera_link published by /robot_state_publisher Average Delay: -0.486478 Max Delay: 0
Frame: /camera_rgb_frame published by /robot_state_publisher Average Delay: -0.486466 Max Delay: 0
Frame: /camera_rgb_optical_frame published by /robot_state_publisher Average Delay: -0.486465 Max Delay: 0
Frame: /front_wheel_link published by /robot_state_publisher Average Delay: -0.486464 Max Delay: 0
Frame: /gyro_link published by /robot_state_publisher Average Delay: -0.486463 Max Delay: 0
Frame: /laser published by /robot_state_publisher Average Delay: -0.486462 Max Delay: 0
Frame: /left_cliff_sensor_link published by /robot_state_publisher Average Delay: -0.486475 Max Delay: 0
Frame: /left_wheel_link published by /robot_state_publisher Average Delay: 3414.33 Max Delay: 3414.34
Frame: /leftfront_cliff_sensor_link published by /robot_state_publisher Average Delay: -0.486473 Max Delay: 0
Frame: /odom published by /initialPose Average Delay: 0.000253598 Max Delay: 0.000583442
Frame: /openni_rgb_optical_frame published by /world_kinect_link Average Delay: -0.0998099 Max Delay: 0
Frame: /plate_0_link published by /robot_state_publisher Average Delay: -0.486461 Max Delay: 0
Frame: /plate_1_link published by /robot_state_publisher Average Delay: -0.48646 Max Delay: 0
Frame: /plate_2_link published by /robot_state_publisher Average Delay: -0.486459 Max Delay: 0
Frame: /plate_3_link published by /robot_state_publisher Average Delay: -0.486458 Max Delay: 0
Frame: /rear_wheel_link published by /robot_state_publisher Average Delay: -0.486457 Max Delay: 0
Frame: /right_cliff_sensor_link published by /robot_state_publisher Average Delay: -0.486472 Max Delay: 0
Frame: /right_wheel_link published by /robot_state_publisher Average Delay: 3414.33 Max Delay: 3414.34
Frame: /rightfront_cliff_sensor_link published by /robot_state_publisher Average Delay: -0.486471 Max Delay: 0
Frame: /spacer_0_link published by /robot_state_publisher Average Delay: -0.486456 Max Delay: 0
Frame: /spacer_1_link published by /robot_state_publisher Average Delay: -0.486455 Max Delay: 0
Frame: /spacer_2_link published by /robot_state_publisher Average Delay: -0.486454 Max Delay: 0
Frame: /spacer_3_link published by /robot_state_publisher Average Delay: -0.486453 Max Delay: 0
...

(more)
2014-01-28 17:27:04 -0500 marked best answer Improve octomap quality with parameters

Hi,

I have built an octomap with octomap_server. The map contains mobile objects, so it needs to be updated frequently. Unfortunately, some octomap points are never removed, so I'm looking for better parameters in the launch file.

I have set latch to true, but nothing changes. I want to modify sensor_model/hit, sensor_model/miss, sensor_model/min, and sensor_model/max. I have seen the octomap_server page, but I don't understand the parameter descriptions.

What does each of them mean, and how should I modify these parameters so that fewer stale points are kept?
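For reference, these are occupancy-update weights: sensor_model/hit and sensor_model/miss set how strongly each scan pushes a voxel toward occupied or free, while sensor_model/min and sensor_model/max clamp the probability so a voxel never becomes so certain that it cannot flip back. A hedged launch sketch follows — values are illustrative, not tuned, and the cloud topic name is an assumption:

    <launch>
      <node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
        <param name="resolution" value="0.05"/>
        <param name="frame_id" value="odom"/>
        <!-- latch=false publishes map updates continuously instead of only on request -->
        <param name="latch" value="false"/>
        <!-- each hit pushes a voxel toward occupied; each miss pushes it toward free -->
        <param name="sensor_model/hit" value="0.7"/>
        <param name="sensor_model/miss" value="0.4"/>
        <!-- clamping bounds: a lower max means occupied voxels saturate less, so
             fewer misses are needed to clear them after an object moves away -->
        <param name="sensor_model/min" value="0.12"/>
        <param name="sensor_model/max" value="0.97"/>
        <remap from="cloud_in" to="/camera/depth/points"/>
      </node>
    </launch>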

2014-01-28 17:26:56 -0500 marked best answer How can move_arm ignore the robot's own arm in the collision map?

I know my title is not clear, but it's difficult to explain.

I use a planning description wizard package with a robot arm (the arm_navigation package). I can send a goal (Cartesian or joint-space) and the robot follows a trajectory. I also use a Kinect to generate a point cloud and then a collision octomap.

My problem is that the Kinect detects the robot itself as an obstacle, so the robot stops immediately. I can see collision_map cubes inside the robot. How can I avoid this? Can the environment server understand that these points are the robot? I think something must exist for this, because it is a very common problem, for example in object grasping.

I don't know if that's clearer; if it isn't, please ask in a comment. Thanks.

2014-01-28 17:26:38 -0500 marked best answer Create collision map from Kinect with nodelet

Hi,

I use a Kinect camera to create a collision map, but currently it's too slow and uses 100% CPU. So I want to use nodelets to avoid useless copies of the point cloud and speed up the process. I understand how to use nodelets with openni_launch because there is a good explanation here: Get start with ROS Nodelet.

My question is: how do I create a collision map with a nodelet? Currently I use octomap_server to build the map, but it's not a nodelet.

Is it necessary to modify the octomap_server package to port its nodes to nodelets?

Or is it possible to run this node as a nodelet with just a change in the launch file? (See the sketch below.)

Or is there another package that builds collision maps with nodelets?

Thanks
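On the launch-file-only option: loading any nodelet into a manager generally looks like the sketch below. Whether this works for octomap_server depends on the version — the class name octomap_server/OctomapServerNodelet is an assumption that the installed release ships such a nodelet, and Electric-era releases may not:

    <launch>
      <!-- A standalone nodelet manager; with openni_launch you could instead
           load into its existing manager so point clouds are shared by pointer. -->
      <node pkg="nodelet" type="nodelet" name="octomap_manager" args="manager" output="screen"/>

      <node pkg="nodelet" type="nodelet" name="octomap_server"
            args="load octomap_server/OctomapServerNodelet octomap_manager">
        <param name="resolution" value="0.05"/>
        <remap from="cloud_in" to="/camera/depth_registered/points"/>
      </node>
    </launch>

The zero-copy benefit only applies when publisher and subscriber live in the same manager process, which is why loading into the openni_launch manager matters here.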

EDIT:

I have tried with two computers (checked with System Monitor):

The first runs both CPUs at 100%: Ubuntu 10.04 / ROS Electric / 3.9 GiB memory / dual-core Intel Core(TM)2 6600 @ 2.40GHz.

The second works well, with only one CPU at 100% out of eight: Ubuntu 12.04 / ROS Electric / 15.7 GiB memory / Intel(R) Xeon(R) W3565 @ 3.20GHz (8 logical CPUs).

EDIT 2: New question on how to create this nodelet: Build a octomap_server nodelet

2014-01-28 17:26:29 -0500 marked best answer Add mobile objects to the arm_navigation collision map

Hi everybody,

Situation: I use the arm_navigation package with ROS Electric to control a robot arm with seven degrees of freedom. I can send a joint-space or Cartesian goal; the robot moves well and avoids the objects from the URDF file, such as the table the robot sits on. I use the "robot_name"_arm_navigation.launch file and a robot simulation that receives joint states and publishes the URDF's tf frames at the correct positions.

Problem: I would like to use the Kinect to add a human body to the scene, and more specifically to the collision map, so that the robot can avoid the body. I want to update it continuously, adding the body as several cylinders that move in "real time". I have tried three solutions.

Solution 1: I followed the tutorial Adding known objects to the collision environment to add cylinders from the Kinect data. In this code I simply publish 10 cylinders on the collision_object topic to model the body.

Result 1: I succeeded in adding a cylinder correctly for each body part; an arm, for example, is modeled by one cylinder. The robot avoids the body, but only at the start of its movement. During the trajectory the cylinders don't move, so I can't check collisions.

Question 1: How do I update the collision map so that the cylinders move?
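For context, publishing one such cylinder with the Electric-era arm_navigation API looks roughly like the sketch below — patterned on that tutorial, not the author's code; frame names, dimensions, and the fixed pose are assumptions. Re-publishing the same id with an ADD operation replaces the object at its new pose, which is the usual way to move it:

    #include <ros/ros.h>
    #include <arm_navigation_msgs/CollisionObject.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "body_cylinder_publisher");
      ros::NodeHandle nh;
      ros::Publisher pub =
          nh.advertise<arm_navigation_msgs::CollisionObject>("collision_object", 10);

      ros::Rate rate(10);
      while (ros::ok())
      {
        arm_navigation_msgs::CollisionObject cyl;
        cyl.id = "right_arm_cylinder";   // same id each cycle: replaces the object
        cyl.operation.operation =
            arm_navigation_msgs::CollisionObjectOperation::ADD;
        cyl.header.frame_id = "base_link";   // a frame the planner already knows
        cyl.header.stamp = ros::Time::now();

        arm_navigation_msgs::Shape shape;
        shape.type = arm_navigation_msgs::Shape::CYLINDER;
        shape.dimensions.resize(2);
        shape.dimensions[0] = 0.05;          // radius [m]
        shape.dimensions[1] = 0.40;          // length [m]

        geometry_msgs::Pose pose;            // in practice, from the Kinect tracker
        pose.position.x = 0.8;
        pose.position.z = 1.0;
        pose.orientation.w = 1.0;

        cyl.shapes.push_back(shape);
        cyl.poses.push_back(pose);
        pub.publish(cyl);
        rate.sleep();
      }
      return 0;
    }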

Solution 2: Similar to the tutorial Adding Virtual Objects to the Planning Scene. The only difference is that I use a service rather than a topic to add the objects; I don't see any other difference from Solution 1.

Result 2: With this solution the cylinders can move continuously, so Solution 1's problem is gone. But there is a bigger problem: it only works when I choose a coordinate frame already present in the URDF, and it fails with a frame generated by the Kinect, such as right_arm_1. To explain: when I add a cylinder, I add it in some coordinate frame (tf), and it's easiest to add it directly in the body part's own frame. With right_arm_1 or openni_depth_frame as the frame, the cylinders do not stay fixed in that frame, although it works perfectly with frames created from the URDF file. I added a link between a URDF frame and openni_depth_frame, but it behaves as if that link didn't exist.

Question 2: Why doesn't the environment_server understand the new Kinect tf frames? How can I fix this?

Solution 3: Add the human body directly to the URDF and update its tf directly by publishing on /tf.

Result 3: It doesn't work, because the simulation keeps publishing all the URDF tf frames via /robot_st_pub; we can only modify the joint states, and computing the joint states of the body is more complicated.

Question 3: Is it possible to use the URDF to add and move mobile objects?

Thank you for your help. If I should add details, ask me.

EDIT1: Hi, Thanks for ... (more)

2014-01-28 17:26:07 -0500 marked best answer How to use the planning description wizard with a COLLADA file?

Hi,

I have a COLLADA file for my robot and no URDF. I want to use the planning_description_wizard so I can use arm_navigation afterwards. I know I can't convert my COLLADA file into a URDF, but is it possible to use the COLLADA file for this?

I want to use this tutorial: http://www.ros.org/wiki/arm_navigation/Tutorials/tools/Planning%20Description%20Configuration%20Wizard but this command does not work with a COLLADA file: $ roslaunch planning_environment planning_description_configuration_wizard.launch urdf_package:=<your_urdf_package_name> urdf_path:=<relative_path>

Thanks,

2013-12-11 23:44:49 -0500 received badge  Notable Question (source)
2013-11-18 19:27:28 -0500 received badge  Notable Question (source)
2013-10-17 21:49:54 -0500 received badge  Notable Question (source)