
Mahe's profile - activity

2021-11-24 02:47:30 -0500 received badge  Notable Question (source)
2021-11-24 02:47:30 -0500 received badge  Popular Question (source)
2021-11-24 02:47:30 -0500 received badge  Famous Question (source)
2021-04-04 10:07:43 -0500 received badge  Necromancer (source)
2020-06-29 07:55:58 -0500 answered a question empty rqt_image_view display

Have you tried clearing the config using rqt_image_view --clear-config?

2020-02-20 10:47:07 -0500 received badge  Famous Question (source)
2019-05-27 03:52:02 -0500 received badge  Nice Answer (source)
2018-10-31 11:03:47 -0500 received badge  Popular Question (source)
2018-10-31 11:03:47 -0500 received badge  Famous Question (source)
2018-10-31 11:03:47 -0500 received badge  Notable Question (source)
2018-04-21 08:39:25 -0500 asked a question How to find Jacobian using kdl that matches with bullet physics jacobian

How to find Jacobian using kdl that matches with bullet physics jacobian Hi, I am trying to calculate jacobian with kdl

2018-02-14 10:06:13 -0500 commented question Best way to check collisions on trajectory using FCL

@l4ncelot, did you find a way to do this? If so, could you help me do the same? Thank you.

2018-01-17 11:20:39 -0500 edited question How to create a planning scene within fcl using python

How to create a planning scene within fcl Hi, I am trying to work on a custom planner for the robot. I need to create t

2018-01-17 11:20:20 -0500 asked a question How to create a planning scene within fcl using python

How to create a planning scene within fcl Hi, I am trying to work on a custom planner for the robot. I need to create t

2017-05-08 04:58:03 -0500 received badge  Famous Question (source)
2017-01-31 21:07:10 -0500 received badge  Notable Question (source)
2016-10-31 05:46:49 -0500 received badge  Famous Question (source)
2016-10-27 06:04:04 -0500 received badge  Notable Question (source)
2016-10-27 04:14:46 -0500 received badge  Popular Question (source)
2016-10-25 17:20:47 -0500 asked a question how to transform image from one coordinate frame to another coordinate frame using c++

I have two images from two different cameras, for which I have done both intrinsic and extrinsic calibration.

I have already transformed the point clouds (3D) from both cameras into the base_link frame and merged them (point cloud registration).

I want to do the same for the 2D images: transform both images and stitch them together, so that I get one large image as if it were taken by a single camera.

Is my approach correct? If so, how would I do it?

Thanks in advance.
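
The 3D route generalizes to 2D images when per-pixel depth is available (as it is here, from the point clouds): back-project each pixel with the first camera's intrinsics, transform the resulting point with the extrinsic calibration, and re-project with the second camera's intrinsics. A minimal plain-C++ sketch of that per-pixel mapping; the struct and function names are mine, no OpenCV is used, and a real stitcher would also have to handle occlusion and resampling:

```cpp
#include <array>

// Simple pinhole intrinsics (fx, fy, cx, cy); any numeric values used with
// this are hypothetical, not from an actual calibration.
struct Intrinsics { double fx, fy, cx, cy; };

// Rigid transform mapping points from camera 1's frame into camera 2's
// frame (this is what the extrinsic calibration provides).
struct Rigid {
    std::array<std::array<double,3>,3> R;  // rotation
    std::array<double,3> t;                // translation
};

// Back-project pixel (u, v) with depth z into a 3D point in camera 1's frame.
std::array<double,3> backproject(const Intrinsics& K, double u, double v, double z) {
    return { (u - K.cx) / K.fx * z, (v - K.cy) / K.fy * z, z };
}

// Apply the rigid transform: p2 = R * p1 + t.
std::array<double,3> transformPoint(const Rigid& T, const std::array<double,3>& p) {
    std::array<double,3> q{};
    for (int i = 0; i < 3; ++i)
        q[i] = T.R[i][0] * p[0] + T.R[i][1] * p[1] + T.R[i][2] * p[2] + T.t[i];
    return q;
}

// Project a 3D point in camera 2's frame back to pixel coordinates.
std::array<double,2> project(const Intrinsics& K, const std::array<double,3>& p) {
    return { K.fx * p[0] / p[2] + K.cx, K.fy * p[1] / p[2] + K.cy };
}

// Map one pixel of image 1 (with known depth) into image 2.
std::array<double,2> remapPixel(const Intrinsics& K1, const Intrinsics& K2,
                                const Rigid& T12, double u, double v, double z) {
    return project(K2, transformPoint(T12, backproject(K1, u, v, z)));
}
```

With identity extrinsics and equal intrinsics a pixel maps to itself; for a planar scene the whole mapping collapses to a single homography, and cv::warpPerspective-style warping is the usual shortcut.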

2016-10-24 18:39:05 -0500 received badge  Supporter (source)
2016-10-13 15:28:23 -0500 commented answer Detecting Spheres using RANSAC in PCL

Could you share the whole code of what you have done? I am trying to detect a cube here.

2016-10-12 20:57:00 -0500 received badge  Notable Question (source)
2016-10-12 16:02:31 -0500 received badge  Self-Learner (source)
2016-10-12 16:02:31 -0500 received badge  Teacher (source)
2016-10-12 15:35:28 -0500 received badge  Scholar (source)
2016-10-12 15:35:21 -0500 answered a question cant get frame_id of ar_track_alvar pose

I finally sorted it out myself. I need to access req.markers[0].header.frame_id to get the frame_id, not req.header.frame_id.
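
For anyone hitting the same problem: AlvarMarkers carries two headers, one on the message itself and one per marker, and only the per-marker header is populated by the tracker. A plain-C++ mock of that layout, using stand-in structs rather than the real ROS message headers:

```cpp
#include <string>
#include <vector>

// Minimal stand-ins for std_msgs/Header and ar_track_alvar_msgs/AlvarMarkers;
// they only mirror the fields relevant to the frame_id lookup.
struct Header { std::string frame_id; };
struct AlvarMarker { Header header; };
struct AlvarMarkers {
    Header header;                     // top-level header: left empty by the tracker
    std::vector<AlvarMarker> markers;  // each marker carries its own header
};

// Return the frame_id the tracker actually populates: the first marker's.
std::string markerFrameId(const AlvarMarkers& msg) {
    return msg.markers.empty() ? std::string() : msg.markers[0].header.frame_id;
}
```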

2016-09-30 23:40:37 -0500 received badge  Popular Question (source)
2016-09-30 04:07:38 -0500 received badge  Editor (source)
2016-09-30 04:05:39 -0500 commented question cant get frame_id of ar_track_alvar pose

@130s I have updated question

2016-09-29 21:45:42 -0500 received badge  Student (source)
2016-09-29 17:57:28 -0500 asked a question cant get frame_id of ar_track_alvar pose

I am trying to get the pose of the camera and set it with respect to the world frame. I want to read the frame_id from the message, so that I can set up multiple cameras relative to the world frame dynamically.

I am using an Asus Xtion Pro Live, so I launch ar_track_alvar with pr2_indiv_no_kinect.launch.

This is what I have done.

Launch file:

<launch>
    <arg name="marker_size" default="4.4" />
    <arg name="max_new_marker_error" default="0.08" />
    <arg name="max_track_error" default="0.2" />
    <arg name="cam_image_topic" default="/camera1/rgb/image_rect_color" />
    <arg name="cam_info_topic" default="/camera1/rgb/camera_info" />    
    <arg name="output_frame" default="/camera1_link" />

    <node name="ar_track_alvar" pkg="ar_track_alvar" type="individualMarkersNoKinect" respawn="false" output="screen" args="$(arg marker_size) $(arg max_new_marker_error) $(arg max_track_error) $(arg cam_image_topic) $(arg cam_info_topic) $(arg output_frame)" />
</launch>

My ROS node:

#include <ros/ros.h>
#include <tf/transform_datatypes.h>
#include <ar_track_alvar_msgs/AlvarMarkers.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_listener.h>
#include <iostream>
#include <string>

void cb(ar_track_alvar_msgs::AlvarMarkers req) {

  tf::TransformBroadcaster tf_br;
  tf::TransformListener listener;
  static  tf::Transform transform;

  if (!req.markers.empty()) {
    transform.setOrigin( tf::Vector3(req.markers[0].pose.pose.position.x, req.markers[0].pose.pose.position.y, req.markers[0].pose.pose.position.z) );
    transform.setRotation(tf::Quaternion( req.markers[0].pose.pose.orientation.x, req.markers[0].pose.pose.orientation.y, req.markers[0].pose.pose.orientation.z, req.markers[0].pose.pose.orientation.w));

    try{

        // this doesn't print the frame_id.
        ROS_INFO("req.header.frame_id . . . . . .. .  .. ", req.header.frame_id.c_str());

      if(req.header.frame_id.compare("/camera1_link"))
      {
          ROS_INFO("this gets printed . . ");
        // this branch is entered; note that std::string::compare returns
        // non-zero when the strings differ, so this is not a match test.
        // I want to set frame_id to the tf tree.

       //   listener.waitForTransform(req.header.frame_id, "world", ros::Time::now(), ros::Duration(1.0));
      //    tf_br.sendTransform(tf::StampedTransform(transform.inverse(), ros::Time::now(), "world", req.header.frame_id));

      }

    }
    catch (tf::TransformException ex){
      ROS_ERROR("%s",ex.what());
      ros::Duration(1.0).sleep();
    }
  }
}

int main(int argc, char **argv) {
  ros::init(argc, argv, "camera_tf_pose");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("ar_pose_marker", 1, &cb);

  ros::spin();
  return 0;

}

Output of rosrun camera_tf_pose camera_tf_pose

[ INFO] [1475225608.355125575]: req.header.frame_id . . . . . .. .  .. 
[ INFO] [1475225608.355185064]: this gets printed . . 
[ INFO] [1475225608.454772325]: req.header.frame_id . . . . . .. .  .. 
[ INFO] [1475225608.454802236]: this gets printed . . 
[ INFO] [1475225608.555007653]: req.header.frame_id . . . . . .. .  .. 
[ INFO] [1475225608.555137160]: this gets printed . .
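
A side note on why nothing prints after the dots: ROS_INFO uses printf-style formatting, and the call above passes req.header.frame_id.c_str() without a %s conversion in the format string, so the argument is silently ignored (on top of the top-level frame_id being empty anyway). Plain snprintf shows the same behavior; the function names below are mine:

```cpp
#include <cstdio>
#include <cstddef>
#include <cstring>

// Mimic the ROS_INFO call: printf-style formatting into a buffer.
// Without a %s conversion, the extra argument is evaluated but ignored.
int formatWithoutSpecifier(char* buf, std::size_t n, const char* frame_id) {
    return std::snprintf(buf, n, "req.header.frame_id . . ", frame_id);
}

// With %s, the frame_id actually lands in the output.
int formatWithSpecifier(char* buf, std::size_t n, const char* frame_id) {
    return std::snprintf(buf, n, "req.header.frame_id: %s", frame_id);
}
```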

Output of rostopic echo /ar_pose_marker

   markers: 
  - 
    header: 
      seq: 0
      stamp: 
        secs: 1475225585
        nsecs: 290621273
      frame_id: /camera1_link
    id: 14
    confidence: 0
    pose: 
      header: 
        seq: 0
        stamp: 
          secs: 0
          nsecs:         0
        frame_id: ''
      pose: 
        position: 
          x: 0.310138838061
          y: -0.0777276835864
          z: -0.00489581265903
        orientation: 
          x: 0.158053463521
          y: -0.431284842866
          z: 0.021097283333
          w: 0.888013170859

when I uncomment the following lines,

 //   listener.waitForTransform(req.header.frame_id ...
(more)
2016-09-29 10:07:33 -0500 received badge  Enthusiast
2016-09-29 10:07:32 -0500 received badge  Enthusiast
2016-09-12 10:09:23 -0500 received badge  Popular Question (source)
2016-08-27 14:12:05 -0500 asked a question how to set dynamic tf between camera_link and world frame

I am new to ROS, and I am trying to set the tf between the camera_link and world frames dynamically from an Alvar marker, using a TransformBroadcaster.

This is the code I am trying:

#include <ros/ros.h>
#include <tf/transform_datatypes.h>
#include <ar_track_alvar_msgs/AlvarMarkers.h>
#include <tf/transform_broadcaster.h>
#include <tf/transform_listener.h>
#include <iostream>

void cb(ar_track_alvar_msgs::AlvarMarkers req) {

  tf::TransformBroadcaster tf_br;
  tf::TransformListener listener;
  tf::Transform transform;

  if (!req.markers.empty()) {
    tf::Quaternion q(req.markers[0].pose.pose.orientation.x, req.markers[0].pose.pose.orientation.y,       req.markers[0].pose.pose.orientation.z, req.markers[0].pose.pose.orientation.w);
    tf::Matrix3x3 m(q);
    double roll, pitch, yaw;
    m.getRPY(roll, pitch, yaw);

   // this prints the correct pose of the camera with respect to the Alvar marker
    std::cout<<req.markers[0].pose.pose;


   // this prints x, y, z as 0, 0, 0 . . not the correct camera pose
    //  ROS_INFO("x, y, z=%1.2f  %1.2f  %1.2f", req.markers[0].pose.pose.position.x, req.markers[0].pose.pose.position.y, req.markers[0].pose.pose.position.z);


    transform.setOrigin( tf::Vector3(req.markers[0].pose.pose.position.x, req.markers[0].pose.pose.position.y, req.markers[0].pose.pose.position.z) );


      // the orientation of the camera is correct
    transform.setRotation(tf::Quaternion( req.markers[0].pose.pose.orientation.x, req.markers[0].pose.pose.orientation.y, req.markers[0].pose.pose.orientation.z, req.markers[0].pose.pose.orientation.w));

    try{
      listener.waitForTransform("/camera_link", "/world", ros::Time::now(), ros::Duration(1.0));

     // this does not set the correct camera pose translation, but the orientation works
      tf_br.sendTransform(tf::StampedTransform(transform, ros::Time::now(), "world", "camera_link"));
    }
    catch (tf::TransformException ex){
      ROS_ERROR("%s",ex.what());
      ros::Duration(1.0).sleep();
    }


  } // if
}

int main(int argc, char **argv) {
  ros::init(argc, argv, "ar_listener");
  ros::NodeHandle nh;
  //  ROS_INFO("INSIDE main ar_listener . . . . .");

  ros::Subscriber sub = nh.subscribe("ar_pose_marker", 1, cb);
  ros::spin();
  return 0;

}

Output of rosrun tf tf_echo /world /camera_link

At time 1472318991.936
- Translation: [0.000, 0.000, 0.000]
- Rotation: in Quaternion [1.000, -0.000, -0.000, 0.000]
            in RPY (radian) [3.142, 0.000, -0.000]
            in RPY (degree) [180.000, 0.000, -0.000]

Please help me understand how to achieve this.

Thanks in advance.
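
Two hedged observations on the code above: first, constructing the TransformBroadcaster and TransformListener inside the callback recreates them on every message, so they are usually made class members or function statics instead; second, the marker pose is expressed in the camera's frame, so broadcasting it unchanged as the world -> camera_link transform sends it in the wrong direction, and it generally needs inverting first (transform.inverse() in tf). For a rigid transform (R, t) the inverse is (R^T, -R^T t), sketched here in plain C++ with stand-in types:

```cpp
#include <array>

using Vec3 = std::array<double,3>;
using Mat3 = std::array<Vec3,3>;

// A rigid transform: p' = R * p + t.
struct Rigid { Mat3 R; Vec3 t; };

// Apply the transform to a point.
Vec3 apply(const Rigid& T, const Vec3& p) {
    Vec3 q{};
    for (int i = 0; i < 3; ++i)
        q[i] = T.R[i][0] * p[0] + T.R[i][1] * p[1] + T.R[i][2] * p[2] + T.t[i];
    return q;
}

// Invert: if p' = R * p + t, then p = R^T * p' - R^T * t.
Rigid inverse(const Rigid& T) {
    Rigid inv{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            inv.R[i][j] = T.R[j][i];  // a rotation's transpose is its inverse
    for (int i = 0; i < 3; ++i)
        inv.t[i] = -(inv.R[i][0] * T.t[0] + inv.R[i][1] * T.t[1] + inv.R[i][2] * T.t[2]);
    return inv;
}
```

Applying the inverse to a point that was transformed forward recovers the original point, which is a quick sanity check on any hand-rolled inversion.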