
mbj's profile - activity

2022-09-13 03:14:41 -0500 received badge  Good Question (source)
2021-03-18 10:29:54 -0500 received badge  Great Question (source)
2020-12-17 02:43:52 -0500 received badge  Good Question (source)
2020-06-09 09:32:56 -0500 received badge  Nice Question (source)
2018-12-24 22:39:37 -0500 received badge  Nice Question (source)
2018-01-18 02:19:53 -0500 answered a question Error when creating snaps for ros subscriber/publisher tutorial

I was experiencing the same problem. Is your "src" folder a symbolic link? In that case, check that it is an absolute one (e.g

2015-09-18 18:44:40 -0500 marked best answer Impossible to set a PlanningScene

Hi all.

I'm trying to call the "environment_server/set_planning_scene_diff" service to get a PlanningScene so I can perform some collision checking. However, after a couple of days of trying and searching ROS Answers and elsewhere, I haven't made any progress. My situation is the same one that jbarry describes in this thread, but that solution doesn't work in my case; I keep getting this:

[ WARN] [1345114122.403491890, 1345114122.395468950]: Incomplete robot state in setPlanningScene
[ERROR] [1345114122.403650694, 1345114122.395468950]: Setting planning scene state to NULL

I tried to debug manually, searching through the different classes involved as Adolfo Rodriguez suggests in this thread (collision_models.cpp, collision_models_interfaces.cpp, model_utils.cpp), but I didn't find anything special, except that if I print joint_names.size() I get 13 joints (I expected 7). I also observed that when I call the service I get 0 joints and the interactive markers in RViz stop working.

I have also checked, as bit-pirate suggested, that /joint_state is published correctly (/world_joint and the 7 joints of my arm).
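For reference, this is roughly how I call the service (a minimal sketch based on the old arm_navigation tutorials; treat the exact service and message type names as assumptions on my part):

#include <ros/ros.h>
#include <arm_navigation_msgs/SetPlanningSceneDiff.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "set_planning_scene_test");
  ros::NodeHandle nh;

  const std::string service_name = "/environment_server/set_planning_scene_diff";
  ros::service::waitForService(service_name);
  ros::ServiceClient client =
      nh.serviceClient<arm_navigation_msgs::SetPlanningSceneDiff>(service_name);

  // An empty diff should simply return the current planning scene
  arm_navigation_msgs::SetPlanningSceneDiff srv;
  if (!client.call(srv))
  {
    ROS_ERROR("Failed to call set_planning_scene_diff");
    return 1;
  }
  ROS_INFO("Got a planning scene with %zu joint states",
           srv.response.planning_scene.robot_state.joint_state.name.size());
  return 0;
}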

Can you tell me what I'm doing wrong? Thanks in advance!

2015-04-23 03:21:20 -0500 marked best answer Problem calling filter_trajectory service

Hello.

I'm writing my own Cartesian controller for my Powerball arm, following these tutorials. After a few days of work I have the control working correctly, and my next step is to get correct motion that avoids collisions with a predefined environment as well as self-collisions.

As you know, there is a wizard in the arm_navigation package that automatically generates the kinematics services and the other files related to arm motion. I ran it and generated the files successfully. After studying the generated files and the packages involved, I concluded that the correct way to avoid collisions is to use the "trajectory_filter_server" node. My idea is to call its service (trajectory_filter_server/filter_trajectory_with_constraints) every time I make a position increment, and to get back an answer saying whether the movement is valid or not.

However, after several tests, I haven't been able to make a correct call to the service. Can anyone tell me which parameters are necessary and essential to call the service correctly? I "only" filled in the stamp, frame_id, trajectory.joint_names, trajectory.points, robot_state.start_state.name, robot_state.start_state.position and the group_name.
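This is more or less how I fill in the request (a sketch written from memory of the arm_navigation_msgs service definition, so treat the field names, the "arm" group name and the helper function as assumptions):

#include <ros/ros.h>
#include <trajectory_msgs/JointTrajectory.h>
#include <arm_navigation_msgs/FilterJointTrajectoryWithConstraints.h>

// Check whether moving from q_current to q_target is accepted by the filter.
bool checkIncrement(ros::ServiceClient& filter_client,
                    const std::vector<std::string>& joint_names,
                    const std::vector<double>& q_current,
                    const std::vector<double>& q_target)
{
  arm_navigation_msgs::FilterJointTrajectoryWithConstraints srv;

  srv.request.group_name = "arm";                  // planning group generated by the wizard
  srv.request.allowed_time = ros::Duration(1.0);

  srv.request.trajectory.header.stamp = ros::Time::now();
  srv.request.trajectory.header.frame_id = "base_link";
  srv.request.trajectory.joint_names = joint_names;

  trajectory_msgs::JointTrajectoryPoint start, goal;
  start.positions = q_current;
  goal.positions = q_target;
  goal.time_from_start = ros::Duration(0.5);
  srv.request.trajectory.points.push_back(start);
  srv.request.trajectory.points.push_back(goal);

  // Start state: same joints, current positions
  srv.request.start_state.joint_state.name = joint_names;
  srv.request.start_state.joint_state.position = q_current;

  if (!filter_client.call(srv))
    return false;
  return srv.response.error_code.val == srv.response.error_code.SUCCESS;
}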

[Added 13/08/2012]

When debugging with GDB I observed that, when I call the service, it fails at line 84 of spline_smoother_utils.cpp ( spline_segment = spline.segments.back(); ).

  if(time >= segment_end_time)
  {
    ROS_DEBUG("Did not find spline segment corresponding to input time: %f",time);
    segment_time = segment_end_time - segment_start_time;
    spline_segment = spline.segments.back();
    return true;
  }

I think that all the necessary services are launched, because beforehand I get the following output:

[INFO] [1344842134.839221386, 118.688000000]: Successfully connected to planning scene action server for /trajectory_filter_server

[INFO] [1344842134.846775320, 118.768000000]: waitForService: Service [/trajectory_filter_server/filter_trajectory_with_constraints] is now available.

[INFO] [1344842134.851041043, 118.799000000]: Sending Planning Scene....

[INFO] [1344842134.851441300, 118.813000000]: Reverting planning scene to default.

[INFO] [1344842134.856230667, 118.827000000]: Planning scene sent.

[INFO] [1344842134.859664450, 118.857000000]: Initialized

[INFO] [1344842134.859908300, 118.864000000]: Selecting planning group 0

[INFO] [1344842134.861566525, 118.864000000]: Planning group selected.

P.S.: Do you think that my "idea" is correct, or is there any node more appropriate for this purpose?

2015-04-09 05:11:32 -0500 received badge  Famous Question (source)
2014-12-17 20:45:14 -0500 received badge  Notable Question (source)
2014-12-17 20:45:14 -0500 received badge  Famous Question (source)
2014-12-17 20:45:14 -0500 received badge  Popular Question (source)
2014-04-20 12:21:44 -0500 marked best answer Applying apply_joint_effort and apply_body_wrench

Hi again.

I'm simulating in Gazebo the R2D2 robot from this tutorial (although there is no link on that page, I downloaded all the urdf tutorial stacks and there is a model called 07-physics.urdf with the correct physics of the robot).

I'm trying to apply some forces to the joints to interact with the robot using /apply_joint_effort and /apply_body_wrench.

First, I spawn the robot:

rosrun gazebo spawn_model -urdf -file 07-physics.urdf -model r2d2 -z 0.5

Next, I run a shell script that executes the following commands:

rosservice call /gazebo/apply_joint_effort '{joint_name: "left_back_wheel_joint", effort: -0.2, start_time: 0, duration: -1 }'
rosservice call /gazebo/apply_joint_effort '{joint_name: "left_front_wheel_joint", effort: -0.2, start_time: 0, duration: -1 }'

I observe one of the following behaviours:

  • The robot turns a little to the right before hitting the ground and then goes straight (the most common).
  • The robot rotates to the right while moving in a "straight" direction.
  • The robot tips over (also quite common xD).

I tried changing the start time (setting it to 5, 10...) and the duration (setting it to 50000, 500, 50...), but the result is always the same. I think the correct behaviour would be for the robot to rotate in place, pivoting on the imaginary axis of the right wheels.
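In case it helps, the same call from roscpp looks roughly like this (a sketch assuming the gazebo_msgs::ApplyJointEffort service definition; I haven't verified it against this exact Gazebo version):

#include <ros/ros.h>
#include <gazebo_msgs/ApplyJointEffort.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "apply_wheel_effort");
  ros::NodeHandle nh;
  ros::ServiceClient client =
      nh.serviceClient<gazebo_msgs::ApplyJointEffort>("/gazebo/apply_joint_effort");

  gazebo_msgs::ApplyJointEffort srv;
  srv.request.joint_name = "left_back_wheel_joint";
  srv.request.effort = -0.2;
  srv.request.start_time = ros::Time(0);       // start immediately
  srv.request.duration = ros::Duration(-1.0);  // keep applying indefinitely

  if (client.call(srv) && srv.response.success)
    ROS_INFO("Effort applied");
  else
    ROS_WARN("apply_joint_effort failed: %s", srv.response.status_message.c_str());
  return 0;
}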

What am I doing wrong? Thanks!

2014-04-20 12:21:28 -0500 marked best answer No transform from [anything] to [/base_link]

Hi.

I'm trying to learn how to build a visual robot model from scratch with this tutorial.

When I run the first shape it works fine, but when I try to run the other shapes RViz shows messages like these:


base_link: Transform OK

box: No transform from [box] to [/base_link]

gripper_pole: No transform from [gripper_pole] to [/base_link]

head: No transform from [head] to [/base_link]

left_back_wheel: No transform from [left_back_wheel] to [/base_link]

...

right_tip: No transform from [right_tip] to [/base_link]

URDF: URDF parsed OK


I have read everything I could find about the topic, but I can't find any solution. Is this a problem with tf? I'm copy-pasting the code of the robots and following the tutorial step by step, so where is the problem?

Sorry for my poor English, and thanks!

2014-04-20 12:21:28 -0500 marked best answer Could not contact the following nodes: * /joint_state_publisher

Hi there!

I'm having a problem when I launch RViz with a joint_state_publisher; for example, when I do the tutorial from this page (Building a Movable Robot Model with URDF) and then close it.

When I execute roswtf, this message appears:

"ERROR: Could not contact the following nodes: * /joint_state_publisher"

If I try to kill it with rosnode kill /joint_state_publisher it seems to be OK, but if I rerun it I get the same message.

I think this is important because on subsequent runs of RViz it doesn't work correctly (the model doesn't appear completely).

Any idea?

2014-04-20 06:55:18 -0500 marked best answer Commanding "MoveIt!" with a pad

Hi all!

After a long time with Fuerte, I decided to migrate to Groovy with the idea of taking advantage of "MoveIt!" and other new packages. I want to point out that I'm quite happy with the new functionality, but I don't understand the catkin change... however, maybe this "problem" will disappear once I get used to it. ;)

Regarding my question: with Fuerte and KDL I implemented Cartesian/Euler control of a 6-DOF arm, controllable via keyboard and gamepad. I get the initial pose of the end effector and move it with the joystick, making little increments and passing them to the IK solver, which returns the joint positions for that end-effector pose.
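Schematically, each control step of that Fuerte/KDL version does something like this (a minimal sketch of the KDL calls; the chain, joint arrays and increment variables are placeholders):

#include <kdl/chain.hpp>
#include <kdl/jntarray.hpp>
#include <kdl/chainfksolverpos_recursive.hpp>
#include <kdl/chainiksolvervel_pinv.hpp>
#include <kdl/chainiksolverpos_nr.hpp>

// One control step: move the end effector by a small (dx, dy, dz) increment.
bool incrementPose(const KDL::Chain& chain, const KDL::JntArray& q_current,
                   double dx, double dy, double dz, KDL::JntArray& q_target)
{
  KDL::ChainFkSolverPos_recursive fk_solver(chain);
  KDL::ChainIkSolverVel_pinv ik_vel_solver(chain);
  KDL::ChainIkSolverPos_NR ik_solver(chain, fk_solver, ik_vel_solver);

  // Current end-effector pose from forward kinematics
  KDL::Frame ee_pose;
  if (fk_solver.JntToCart(q_current, ee_pose) < 0)
    return false;

  // Apply the small Cartesian increment read from the joystick
  ee_pose.p += KDL::Vector(dx, dy, dz);

  // Solve IK for the new pose, seeded with the current joint positions
  q_target.resize(chain.getNrOfJoints());
  return ik_solver.CartToJnt(q_current, ee_pose, q_target) >= 0;
}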

Currently, I'm trying to get the same effect with "MoveIt!", but I can't manage it. I found the "moveit_commander" package and tried to use it as the interface, passing it little increments in each direction, but the solver is too slow (it takes between 0.4 and 1.3 seconds) and builds up long queues of requests, failing about 95% of the time. Could you give me any idea of how to get the desired effect?

Thanks in advance!

2014-04-20 06:47:55 -0500 marked best answer Referencing non-ROS packages dependencies

Hi.

I'm trying to reference a non-ROS package from one of my ROS packages, but I haven't found the correct option in the CMakeLists to do this properly. I want to include the .h files of this non-ROS package in my own files, but I can't find a way to make them visible.

Thanks in advance, and sorry for such a simple question, but I couldn't find any answer in the tutorials.

2014-04-20 06:47:46 -0500 marked best answer Is it allowed to change the update() method frequency for PR2 controllers?

I found the parameters /pr2_controller_manager/mechanism_statistics_publish_rate and /pr2_controller_manager/joint_state_publish_rate, but I didn't find anything that allows me to change the update() method frequency of a controller.

Is there any way to modify this, or does the update() frequency have to be 1000 Hz?

2014-04-20 06:47:45 -0500 marked best answer Gazebo mimic joints property

Hi.

I'm trying to "link" two joints with the mimic joint property. I readed that this property isn't suported officialy yet but there is in the code of Gazebo and can be used.

I read here: http://www.ros.org/doc/api/urdf/html/joint_8h_source.html and found some examples like this:

<joint name="r_gripper_r_finger_joint" type="revolute">
  <axis xyz="0 0 -1"/>
  <origin rpy="0 0 0" xyz="0.07691 -0.01 0"/>
  <limit effort="1000.0" lower="0.0" upper="0.548" velocity="0.5"/>
  <dynamics damping="0.2"/>
  <mimic joint="r_gripper_l_finger_joint" multiplier="1" offset="0"/>
  <parent link="r_gripper_palm_link"/>
  <child link="r_gripper_r_finger_link"/>
</joint>

But when I try to link my joints it has no effect, and when I apply an effort to "joint_left_wheel" the "joint_right_wheel" (presumably connected to it) keeps its position. My code is below (I also tried with the parameters joint_name="joint_left_wheel" multiplier="1" offset="0"); do you see any mistake?

<joint name="joint_right_wheel" type="continuous">
  <origin xyz="0.35 0 0.1" rpy="0 0 0" />
  <axis xyz="0 0 0" />
  <mimic joint="joint_left_wheel" multiplier="1" offset="0"/>
  <parent link="chassis"/>
  <child link="right_wheel"/>
</joint>

Thanks!

2014-04-20 06:47:43 -0500 marked best answer Transform from base_link to /map navigation stack error

Hi!

I'm installing the navigation stack on my own simulated robot, but when I try to launch it I get the following problem:

"Waiting on transform from base_link to /map to become available before running costmap, tf error: Frame id /base_link does not exist!"

Searching a little, I found that the transform broadcaster of a robot normally publishes 3 transforms (odom -> base_link_offset, base_link_offset -> base_link, base_link_offset -> base_laser_link), but I'm not sure what exactly each one is for and which ones are necessary to make the navigation stack run. On the other hand, my robot only publishes one transform, chassis -> base_laser_link (as explained in this tutorial), so I think that is where the problem is.
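Concretely, the only thing my broadcaster publishes right now is roughly this (a sketch following that tutorial; the offset values here are placeholders):

#include <ros/ros.h>
#include <tf/transform_broadcaster.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "robot_tf_publisher");
  ros::NodeHandle n;
  tf::TransformBroadcaster broadcaster;
  ros::Rate r(100);
  while (n.ok())
  {
    // Single static transform: chassis -> base_laser_link
    broadcaster.sendTransform(
        tf::StampedTransform(
            tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0.1, 0.0, 0.2)),
            ros::Time::now(), "chassis", "base_laser_link"));
    r.sleep();
  }
  return 0;
}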

Can anyone help me?

2014-04-20 06:47:41 -0500 marked best answer Obstacles in RViz visualized incorrectly.

Hi everyone.

I'm finishing the setup of the navigation stack on my robot, but I have a problem. As I saw in the tutorial "Using rviz in navigation stack", the obstacles and inflated obstacles should appear as the robot approaches them, whenever the laser can "see" them. If you watch my video, you can see that some obstacles in my RViz are "loaded" from the first moment, while others never appear even though the laser is seeing them (I noticed that if the robot collides with these "invisible" obstacles, it does see them).

Video: http://youtu.be/rBaU9yuqO6k

I searched and found some forum topics from people with a similar problem (although in their case they don't see the obstacles at all), but I tried to apply some of the proposed changes and the result is the same.

Here are some of my configuration files:

costmap_common_params.yaml:

obstacle_range: 0.05
raytrace_range: 0.5
footprint: [[0.5, -0.275], [0.5, 0.275], [-0.5, 0.275], [-0.5, -0.275]]
inflation_radius: 0.5
observation_sources: laser_scan_sensor
laser_scan_sensor: {sensor_frame: /hokuyo_laser_link, data_type: LaserScan, topic: hok_node_topic, marking: true, clearing: true}

global_costmap:
  global_frame: /map
  robot_base_frame: base_link
  update_frequency: 5.0
  static_map: true

local_costmap:
  global_frame: odom
  robot_base_frame: base_link
  update_frequency: 5.0
  publish_frequency: 2.0
  static_map: true
  rolling_window: true
  width: 6.0
  height: 6.0
  resolution: 0.05

Another problem I have is synchronizing the position of the robot between Gazebo and RViz; I have to set it manually in the latter. Is there another method?

Any idea? Thanks in advance!

2014-04-20 06:47:38 -0500 marked best answer Robot not following path in Nav. Stack

Hi community. I'm setting up the navigation stack on my own robot. As I saw in a REP, the tf tree is the following:

map -> odom -> chassis_footprint -> chassis -> others (wheels, lasers, ...)

Well, the robot is represented correctly in RViz (for this, first I launch the robot in Gazebo, then the nav stack, and finally RViz), and furthermore, if I set a 2D goal the robot computes a correct path, although it doesn't follow it.

Correct path

In addition, if I change the fixed frame / target frame in RViz to anything that isn't the chassis (like /odom, /map, /chassis_footprint), the robot looks like this:

Incorrect tf

I think there must be a problem with the tf, but RViz doesn't give any error, although I must say that some frames "jump" in a strange way.

My tf's are the following:

In tf_broadcaster:

// map -> odom (identity)
broadcaster.sendTransform(tf::StampedTransform(tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0.0, 0.0, 0.0)), ros::Time::now(), "map", "odom"));
// chassis_footprint -> chassis (identity)
broadcaster.sendTransform(tf::StampedTransform(tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0.0, 0.0, 0.0)), ros::Time::now(), "chassis_footprint", "chassis"));
// chassis -> hokuyo_laser_link (laser mounted 0.12 m forward, 0.285 m up)
broadcaster.sendTransform(tf::StampedTransform(tf::Transform(tf::Quaternion(0, 0, 0, 1), tf::Vector3(0.12, 0.0, 0.285)), ros::Time::now(), "chassis", "hokuyo_laser_link"));

In the odometry node:

//since all odometry is 6DOF we'll need a quaternion created from yaw
geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(robot_pose_pa_);

//first, we'll publish the transform over tf
geometry_msgs::TransformStamped odom_trans;
odom_trans.header.stamp = current_time;
odom_trans.header.frame_id = "odom";
odom_trans.child_frame_id = "chassis_footprint";

odom_trans.transform.translation.x = robot_pose_px_;
odom_trans.transform.translation.y = robot_pose_py_;
odom_trans.transform.translation.z = 0.0;
odom_trans.transform.rotation = odom_quat;

//send the transform over /tf
odom_broadcaster.sendTransform(odom_trans);

What is happening and what am I doing wrong? I am very new to tf and any idea is appreciated. Thanks!

2014-04-20 06:47:30 -0500 marked best answer Rotation error in Gazebo simulation

Hi everyone. I wrote the following robot model in URDF and then built a controller to teleoperate it.

The result, as you can see in the video, is not what I expected, since the robot rotates on itself doing strange things (which seem to be worse when the spin is somehow aligned with the coordinate axes of the world). Any ideas? I tried changing the values of mu1, mu2, kp, kd and others like selfCollide with no success. I compared it with models such as the Erratic and the PR2 and can't see any difference.

Here is the video that I recorded with the problem.

Thanks!

P.S.: I want to add that the result is the same whether I teleoperate or apply joint efforts directly to the wheels. By this I mean that it is not a problem with the controller.

2014-04-20 06:47:27 -0500 marked best answer Build a controller from scratch

Hi everybody!

After making a "stable" URDF of the robot that I want to simulate and make some tests with it in Gazebo, I'm trying to build a controller for teleop with it. Searching in the web I only found manuals and references from the pr2_controller.

Is it essential to inherit from and implement the PR2 classes like pr2_controller_manager, pr2_controller_interface...? Is there any generic driver base? Are actuators essential?
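To make the question concrete: is the idea that I have to write something like the skeleton below? (A rough sketch of what I understood from the pr2_mechanism tutorials; the package, class and macro names are placeholders/assumptions on my part.)

#include <pr2_controller_interface/controller.h>
#include <pr2_mechanism_model/joint.h>
#include <pluginlib/class_list_macros.h>

namespace my_controller_ns {

class MyTeleopController : public pr2_controller_interface::Controller
{
public:
  // Called once when the controller is loaded; look up the joints you need here.
  bool init(pr2_mechanism_model::RobotState* robot, ros::NodeHandle& n)
  {
    std::string joint_name;
    if (!n.getParam("joint_name", joint_name))
    {
      ROS_ERROR("No joint_name given (namespace: %s)", n.getNamespace().c_str());
      return false;
    }
    joint_state_ = robot->getJointState(joint_name);
    return joint_state_ != NULL;
  }

  void starting() {}   // called just before the first update()

  void update()        // called by the realtime loop (1000 Hz on the PR2)
  {
    joint_state_->commanded_effort_ = 0.0;  // write the command for this cycle here
  }

  void stopping() {}

private:
  pr2_mechanism_model::JointState* joint_state_;
};

}  // namespace my_controller_ns

// Export the controller as a plugin so the controller manager can load it
// (macro from the pluginlib of that era; treat the exact macro as an assumption).
PLUGINLIB_DECLARE_CLASS(my_controller_pkg, MyTeleopController,
                        my_controller_ns::MyTeleopController,
                        pr2_controller_interface::Controller)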

Can anyone put some order into this chaos and explain briefly the steps in this process?