scarlett's profile - activity

2018-06-20 02:39:27 -0500 received badge  Nice Answer (source)
2016-01-18 05:26:32 -0500 received badge  Famous Question (source)
2015-08-23 03:31:11 -0500 received badge  Famous Question (source)
2015-08-22 18:40:13 -0500 answered a question Apply force to a joint using plugins

Hi! I just wrote a plugin, so I might be able to help! I suppose you saw this link: Gazebo_plugin_tutorial. The way I would do it is to call a function from the Gazebo API inside OnUpdate() instead of using a service.

Reuse the same class body as in the Gazebo tutorial, but add the OnUpdate() function, which is called every time Gazebo updates.

The plugin I wrote was to control the angles of the joints, but I found the function SetForce in the API, which should be what you are looking for.

#include <gazebo/gazebo.hh>
#include <gazebo/physics/physics.hh>
#include <gazebo/common/common.hh>
#include <boost/bind.hpp>
#include <string>
#include <vector>

namespace gazebo
{
  class WorldPluginTutorial : public WorldPlugin
  {
    public: WorldPluginTutorial() : WorldPlugin()
            {
              printf("Hello World!\n");
            }

    public: void Load(physics::WorldPtr _world, sdf::ElementPtr _sdf)
            {
              this->model = _world->GetModel("my_robot");

              // joints to control and the efforts to apply to them (example values)
              this->vecteur_str.push_back("head");
              this->vecteur_str.push_back("base");
              this->vecteur_str.push_back("joint3");
              this->vecteur_str.push_back("whatever");
              this->vecteur_effort.resize(this->vecteur_str.size(), 1.0);

              // Connect OnUpdate() to the world update event so that it is
              // called at every simulation iteration.
              this->updateConnection = event::Events::ConnectWorldUpdateBegin(
                  boost::bind(&WorldPluginTutorial::OnUpdate, this, _1));
            }

    public: void OnUpdate(const common::UpdateInfo & /*_info*/)
            {
              // Apply a small linear velocity to the model.
              //this->model->SetLinearVel(math::Vector3(0, 0, 0)); // just in case needed

              // pose of the footprint
              gazebo::math::Pose mapose(this->vect3_pose_model, this->rot_pose_model);
              this->model->SetLinkWorldPose(mapose, "WheelFR_link");

              // force on every joint
              for (unsigned int i = 0; i < this->vecteur_str.size(); i++)
              {
                // what I used to control the joints
                //this->model->GetJoint(vecteur_str[i])->SetAngle(0, vecteur_effort[i]);
                // This is what you want: SetForce(unsigned int _index, double _effort)
                this->model->GetJoint(this->vecteur_str[i])->SetForce(0, this->vecteur_effort[i]);
              }
            }

    private: physics::ModelPtr model;
    private: event::ConnectionPtr updateConnection;
    private: math::Vector3 vect3_pose_model;
    private: math::Quaternion rot_pose_model;
    private: std::vector<std::string> vecteur_str;
    private: std::vector<double> vecteur_effort;
  };
  GZ_REGISTER_WORLD_PLUGIN(WorldPluginTutorial)
}

If you need the "subscriber/publisher" functionality of ROS, just instantiate it in Load() as in any other ROS node.

PS: maybe you'll get more answers on the Gazebo Answers site

2015-08-20 11:04:03 -0500 received badge  Famous Question (source)
2015-07-29 08:19:20 -0500 received badge  Nice Question (source)
2015-07-15 13:52:36 -0500 received badge  Notable Question (source)
2015-06-02 03:20:25 -0500 commented answer broadcast single tf message

I put those tf publish and listener calls in a function that is called in a while loop, aaand it works! Thanks, I missed the importance of the "repeatedly" part while doing the tutorials!
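For the record, the pattern I ended up with looks roughly like this (a minimal sketch with made-up frame names and values, not my actual node):

    #!/usr/bin/env python
    import rospy
    import tf

    rospy.init_node("tf_init_broadcaster")
    broadcaster = tf.TransformBroadcaster()
    rate = rospy.Rate(10)  # re-send the transform at 10 Hz

    while not rospy.is_shutdown():
        broadcaster.sendTransform((1.0, 0.0, 0.0),       # translation (example values)
                                  (0.0, 0.0, 0.0, 1.0),  # quaternion (example values)
                                  rospy.Time.now(),
                                  "my_tf/robot_foot",    # child frame (example)
                                  "map")                 # parent frame (example)
        rate.sleep()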

2015-06-02 02:52:13 -0500 commented answer broadcast single tf message

OK, so the problem must be that my function is called by a service, so the broadcaster.sendTransform calls are not in a loop but only executed a single time... I will try to improve that!

2015-06-01 05:38:37 -0500 received badge  Notable Question (source)
2015-05-31 16:25:58 -0500 received badge  Popular Question (source)
2015-05-29 05:00:52 -0500 asked a question broadcast single tf message

Hello! I am working on an external camera tracking a robot.

To initialize the tf between the robot and a marker, I put the robot in a known place, and from the position where the marker is detected, I deduce the relation between the marker and the robot.

First, I create the frame where I put my robot_foot in my map:

       self.broadcaster.sendTransform(
            (req.x, req.y, 0), quaternion, rospy.Time.now(),
            "/mon_tf/" + ROBOT_FOOT, MAP)

Then I create the frame where the head should be in my map:

 self.broadcaster.sendTransform(
            trans_foot_to_body, rot_foot_to_body, rospy.Time.now(),
            "/my_tf/" + req.robotpart,
            "/my_tf/" + ROBOT_FOOT)

Now that the supposed position of the head is known and in the same tf tree branch as our detected marker, we can deduce the relation between the head and the marker:

        (trans_marker_to_body,
            rot_marker_to_body) = self.listener.lookupTransform(
            marker, "/my_tf/" + req.robotpart,
            rospy.Time(0))

I got the error:

Lookup would require extrapolation at time 1432892532.412094116, but only time 1432892532.912867069 is in the buffer, when looking up transform from frame [mon_tf/HeadTouchFront_frame] to frame [ar_marker_2]

I have found some similar questions on ROS Answers, so I tried replacing rospy.Time.now() with rospy.Time(0), using waitForTransform rather than lookupTransform, and also adding some time.sleep(0.5), because that seems to solve the problem in another similar part of my code...

I would like to know whether I am using the wrong tool for what I want to do (since I am just broadcasting a single message), or how to solve this tf buffer problem. What about tf2?
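For reference, the waitForTransform attempt looks roughly like this (a sketch reusing the listener and frame names from above; it assumes the tf module is imported):

    try:
        self.listener.waitForTransform(
            marker, "/my_tf/" + req.robotpart,
            rospy.Time(0), rospy.Duration(1.0))
        (trans_marker_to_body,
            rot_marker_to_body) = self.listener.lookupTransform(
            marker, "/my_tf/" + req.robotpart,
            rospy.Time(0))
    except tf.Exception as e:
        rospy.logwarn("tf lookup failed: %s", e)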

2015-05-14 11:44:00 -0500 received badge  Critic (source)
2015-05-14 11:36:34 -0500 answered a question Axis Camera. Raw Image

Hello! This is the launchfile that I use, hope it will help:

<node name="axis_connection" pkg="axis_camera" type="axis.py"  required="true">
    <param name="hostname" value="127.0.0.7" />
    <param name="password" value="Mypasssword" />
    <param name="width" value="1280" />
    <param name="height" value="720" />
    <param name="camera_info_url" value="file://$(find axis_cam_modified)/camera_info/10_0_161_201.yaml" />
</node>

<node name="image_transport_decompressed" pkg="image_transport" type="republish" 
args="compressed in:=image_raw raw out:=image_raw //image_transport:=compressed">
</node>
2015-05-11 07:05:40 -0500 received badge  Popular Question (source)
2015-05-05 11:17:01 -0500 asked a question rviz doesn't follow the tf configuration of the .rviz file

Hello! My problem is that I want to display some of the TF frames and hide others. I can do it by selecting what I want by hand, but if I save that in a .rviz file and launch it with rosrun rviz rviz -d my_config.rviz, it displays all my tf frames.

The .rviz file is saved correctly; here is an excerpt of the rviz/TF section in my_config.rviz:

Visualization Manager:
  Class: ""
  Displays:
    - Alpha: 0.5
      Cell Size: 1
      Class: rviz/Grid
      Color: 160; 160; 164
      Enabled: true
      Line Style:
        Line Width: 0.03
        Value: Lines
      Name: Grid
      Normal Cell Count: 0
      Offset:
        X: 0
        Y: 0
        Z: 0
      Plane: XY
      Plane Cell Count: 10
      Reference Frame: <Fixed Frame>
      Value: true
    - Class: rviz/TF
      Enabled: true
      Frame Timeout: 15
      Frames:
        All Enabled: false
        WheelFL_link:
          Value: false
        WheelFR_link:
          Value: false
        ar_marker_1:
          Value: true

"my_config.rviz" is the right one loaded ( I tried to add and remove a display type to be sure ), and I am using the rviz version 1.11.7 (indigo), Compiled against OGRE version 1.8.1 (Byatis).

2015-05-04 04:33:28 -0500 received badge  Notable Question (source)
2015-05-04 03:31:06 -0500 answered a question unwanted rotation of the markers detected with ar_track_alvar

As Mehdi told me in the comment, the problem was simply a calibration file that was not good enough. Thank you Mehdi!

2015-05-01 06:02:45 -0500 answered a question Gmapping and TF frames

It seems that the base_frame chosen to launch gmapping might not be the right one. How do you launch gmapping? In the source code of gmapping:

  v.setValue(0, 0, 1 + laser_pose.getOrigin().z());
  tf::Stamped<tf::Vector3> up(v, scan.header.stamp,
                                      base_frame_);
      [...]

  // gmapping doesnt take roll or pitch into account. So check for correct sensor alignment.
  if (fabs(fabs(up.z()) - 1) > 0.001)
  {
    ROS_WARN("Laser has to be mounted planar! Z-coordinate has to be 1 or -1, but gave: %.5f",
                 up.z());
    return false;
  }

Here laser_pose is the tf between the base_frame (which is "base_link" by default) and the frame of your laser.

2015-05-01 05:01:38 -0500 received badge  Autobiographer
2015-05-01 04:51:05 -0500 commented question unwanted rotation of the markers detected with ar_track_alvar

I am trying to improve the camera calibration; it might solve the problem, I will let you know!

2015-04-28 11:26:25 -0500 received badge  Popular Question (source)
2015-04-27 04:18:46 -0500 asked a question unwanted rotation of the markers detected with ar_track_alvar

Hello! I discovered something weird with ar_track_alvar, but maybe I just forgot something when launching it. I put a marker on the head of a robot and another one on the ground; this is when it is working fine: OK

but if I make the robot turn a little bit more to the left, a pi rotation around the green axis seems to happen:

NOK

I set up a static_transform_publisher between "/axis_camera" and "/map", then I set the "output_frame" argument of ar_track_alvar to "/axis_camera", and the rest of the launchfile is identical to "pr2_indiv_no_kinect.launch".

The problem does not happen with the marker on the ground when I make it turn.

If it can give a clue, here is the rotation data of the marker when my robot makes a complete turn: image description

I am grateful for any help!

2015-04-24 09:44:50 -0500 commented answer make a human-computer interaction interface

Mm, maybe this one?

rqt_launch

You will then need to make launchfiles for your nodes.

2015-04-24 08:34:51 -0500 answered a question make a human-computer interaction interface

If you are looking for something like this:

dynamic_reconfigure.png

this tutorial will help you:

http://wiki.ros.org/dynamic_reconfigu...

It creates a server that publishes on a topic whenever a change is entered in rqt.
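A minimal Python server looks roughly like this (a sketch; my_package and the generated MyParamsConfig come from a hypothetical .cfg file you create by following the tutorial):

    #!/usr/bin/env python
    import rospy
    from dynamic_reconfigure.server import Server
    # MyParamsConfig is generated from a hypothetical cfg/MyParams.cfg in your package
    from my_package.cfg import MyParamsConfig

    def callback(config, level):
        # called every time a value is changed in rqt
        rospy.loginfo("Reconfigure request: %s", config)
        return config

    if __name__ == "__main__":
        rospy.init_node("my_dynamic_reconfigure_server")
        srv = Server(MyParamsConfig, callback)
        rospy.spin()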

2015-04-19 16:00:51 -0500 answered a question gmapping localization information

I don't know if you saw this question: [http://answers.ros.org/question/60523...], maybe it can help.

Also, I used an exploration node working with gmapping, combined with hector_trajectory_server to keep the path of my robot. It worked well for me, but it depends on what you want; here is my launchfile:

<launch>    
  <!-- dynamic map generation -->
  <node name="gmapping_node" pkg="gmapping" type="slam_gmapping" respawn="false" >
    <remap to="base_scan" from="scan"/>
    <param name="odom_frame" value="odom_combined" />
    <param name="map_update_interval" value="1" />
  </node>
 <!-- path --> 
  <node pkg="hector_trajectory_server" type="hector_trajectory_server" name="hector_trajectory_server" output="screen">
    <param name="target_frame_name" type="string" value="/map" />
    <param name="source_frame_name" type="string" value="/base_link" />
    <param name="trajectory_update_rate" type="double" value="0.5" />
    <param name="trajectory_publish_rate" type="double" value="0.15" />
  </node> 
</launch>
2015-04-19 15:20:48 -0500 answered a question how to set-up /tf_static ?

I usually use this command in a launchfile:

<launch>
      <node pkg="tf" type="static_transform_publisher" 
            name="my_tf" args="0 0 1 3.14 0 1.57 /map /my_tf 30"/>
</launch>

The first 3 args are the translation, the next 3 are the rotation (Euler angles in radians, in yaw pitch roll order), followed by the parent frame, the child frame, and the publishing period in milliseconds.

I hope this will help you
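If you specifically want the /tf_static topic (tf2), a rough Python equivalent of the node above could look like this (a sketch using the same translation and angles as in the launchfile; the frame names are just examples):

    #!/usr/bin/env python
    import rospy
    import tf2_ros
    import tf.transformations
    from geometry_msgs.msg import TransformStamped

    rospy.init_node("my_tf_static")
    broadcaster = tf2_ros.StaticTransformBroadcaster()

    t = TransformStamped()
    t.header.stamp = rospy.Time.now()
    t.header.frame_id = "map"    # parent frame
    t.child_frame_id = "my_tf"   # child frame
    t.transform.translation.x = 0.0
    t.transform.translation.y = 0.0
    t.transform.translation.z = 1.0
    # static_transform_publisher takes yaw pitch roll (3.14 0 1.57),
    # quaternion_from_euler takes roll pitch yaw
    q = tf.transformations.quaternion_from_euler(1.57, 0, 3.14)
    t.transform.rotation.x = q[0]
    t.transform.rotation.y = q[1]
    t.transform.rotation.z = q[2]
    t.transform.rotation.w = q[3]

    # a static transform only needs to be sent once; it is latched on /tf_static
    broadcaster.sendTransform(t)
    rospy.spin()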

2015-04-14 04:30:38 -0500 marked best answer can't locate node [naoqi_joint_states_cpp]

Hello, we are trying to run nao_driver on our remote machine, connecting to our real Nao with:

NAO_IP=192.168.1.10 roslaunch nao_driver nao_driver.launch (as said in http://wiki.ros.org/naoqi_driver)

but we get :

ERROR: cannot launch node of type [naoqi_driver/naoqi_joint_states_cpp]: can't locate node [naoqi_joint_states_cpp] in package [naoqi_driver]

We tried to update/upgrade, but nothing changed. We are working under Indigo and NAOqi 1.14, and we set the PYTHONPATH to the pynaoqi path.

What can I try?

2015-04-14 04:30:15 -0500 received badge  Famous Question (source)
2015-04-14 04:30:15 -0500 received badge  Famous Question (source)