
Karan's profile - activity

2022-02-23 13:35:39 -0500 received badge  Good Question (source)
2021-05-02 09:15:55 -0500 received badge  Nice Question (source)
2017-04-12 22:15:39 -0500 received badge  Good Question (source)
2017-03-30 03:23:12 -0500 received badge  Nice Answer (source)
2014-04-20 06:51:27 -0500 marked best answer Do Kinect topics depend on drivers?

I downloaded a bag file of Kinect data that was recorded with the libfreenect driver. If I develop a node for the Kinect that uses the OpenNI driver, will there be differences in the topics created by the two drivers in terms of images and point clouds?

2014-01-29 02:44:00 -0500 received badge  Famous Question (source)
2014-01-28 17:30:21 -0500 marked best answer Problem with coordinate axis of rviz_imu_plugin

I have been trying to use the imu_tools filters with my ArduPilot system. I linked the raw feed of the ArduPilot to the input of the imu_filter_madgwick package. I don't use the magnetometer data from the ArduPilot.

Here is the launch file with the IMU filter settings:

<node pkg="imu_filter_madgwick" type="imu_filter_node" name="imu_filter_node"> 
    <!-- Input data settings-->

    <remap from="imu/data_raw" to="/Imu" />
    <param name="use_mag" value="false"/>
    <param name="fixed_frame" value="/base_link"/>
    <param name="gain" value="0.1"/>
</node>

After this I try to visualize the orientation of the IMU in RViz, with the fixed frame set to "/base_footprint".

After the IMU's axes have converged to a stable value, I move the IMU around and watch the corresponding changes in RViz.

Although the directions of change in the RViz display correspond to the actual changes in the IMU orientation, the axes do not.

That is, if I rotate the IMU to the left, RViz shows the same rotation for each axis, be it X, Y or Z.

The specific problems are:

1. The Z-axis points downwards when it should point upwards. The acceleration vector corresponding to 'g' points upwards, which again should be the opposite.

2. According to the standards, roll, pitch and yaw increase in the clockwise direction, but in RViz the filter outputs yaw and roll that increase in the anti-clockwise direction. The yaw can be explained since the Z-axis points in the opposite direction, but what about the roll?

How can I correct this problem of inverted axes and inverted Euler angles? Any help would be appreciated.

--EDIT--

@tfoote I tried what you suggested. I created another coordinate frame (/base_footprint), so the complete tree becomes /base_footprint -> /base_link -> /IMU.

I published the /tf from /base_footprint to /base_link as (/base_footprint->/IMU)*(/base_link->/IMU).inverse().

The transform /base_footprint->/IMU was calculated by adding pi (3.14) to the roll of the /base_link->/IMU transform to invert the Z-axis.
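A minimal sketch of that composition with the tf C++ API, assuming the /base_link -> /IMU transform has already been obtained elsewhere (for example from a tf listener); the function name and the way the transform is passed in are just for illustration:

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>
    #include <cmath>

    // base_to_imu is assumed to hold the /base_link -> /IMU transform,
    // obtained elsewhere (e.g. via a tf::TransformListener).
    void publishFootprintToBase(tf::TransformBroadcaster& broadcaster,
                                const tf::Transform& base_to_imu)
    {
      // /base_footprint -> /IMU: same as /base_link -> /IMU, but with pi added
      // to the roll so that the Z-axis is flipped.
      double roll, pitch, yaw;
      tf::Matrix3x3(base_to_imu.getRotation()).getRPY(roll, pitch, yaw);
      tf::Transform footprint_to_imu(
          tf::createQuaternionFromRPY(roll + M_PI, pitch, yaw),
          base_to_imu.getOrigin());

      // /base_footprint -> /base_link = (/base_footprint -> /IMU) * (/base_link -> /IMU)^-1
      tf::Transform footprint_to_base = footprint_to_imu * base_to_imu.inverse();

      broadcaster.sendTransform(tf::StampedTransform(
          footprint_to_base, ros::Time::now(), "/base_footprint", "/base_link"));
    }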

If there is a better solution, please let me know.

2014-01-28 17:30:16 -0500 marked best answer Rgbdslam crashes with exit code -6

Hey, I am using the GUI version of rgbdslam. It worked fine earlier, but now I am getting a weird problem.

Whenever I start the node it crashes with the following error:

    Cannot mix incompatible Qt library (version 0x40801) with this library (version 0x40804)
    ================================================================================
    REQUIRED process [rgbdslam-1] has died!
    process has died [pid 3190, exit code -6, cmd /home/karan/ros_workspace/rgbdslam_freiburg/rgbdslam/bin/rgbdslam __name:=rgbdslam __log:=/home/karan/.ros/log/14bc0f78-aca3-11e2-bff4-8ca9824e2ee4/rgbdslam-1.log].
    log file: /home/karan/.ros/log/14bc0f78-aca3-11e2-bff4-8ca9824e2ee4/rgbdslam-1*.log
    Initiating shutdown!
    ================================================================================

The headless launch file works fine and I am able to register frames via service calls, so the problem is only GUI related.

It seems to be a problem of having two versions of the Qt library installed on my system. How can I solve this? Any help would be appreciated.

2014-01-28 17:30:12 -0500 marked best answer Error in writing a Client for RGB-Dslam

I am currently working on a project which requires me to use RGB-D SLAM without a GUI, so I have to use ROS service calls for the various operations performed by the node.

Since rgbdslam has to be used in sync with other nodes, I can't always use the terminal to issue rosservice commands.

So instead I am writing a client for the rgbdslam service. Here is the code, which produces an error: whenever I run the node below, the service is not called and I get the error from the else branch.

    #include <ros/ros.h>
    #include <rgbdslam/rgbdslam_ros_ui.h>  // service definition generated by the rgbdslam package

    int main(int argc, char **argv)
    {
      ros::init(argc, argv, "rgbdslam_client");
      ros::NodeHandle n;

      // Client for the rgbdslam UI service
      ros::ServiceClient client =
          n.serviceClient<rgbdslam::rgbdslam_ros_ui>("rgbdslam_ros_ui");

      rgbdslam::rgbdslam_ros_ui srv;
      srv.request.command = "frame";

      if (client.call(srv))
      {
        ROS_INFO("Capturing a frame");
      }
      else
      {
        ROS_ERROR("Failed to call service Rgbdslam frame capture");
        return 1;
      }

      return 0;
    }

But when I use the rosservice call command in another terminal it works fine.

What could be a possible solution to the above problem?

--edit--

@Felix Endres
I had to change the line to this:

ros::ServiceClient client = n.serviceClient<rgbdslam::rgbdslam_ros_ui>("/rgbdslam/ros_ui");

from

ros::ServiceClient client = n.serviceClient<rgbdslam::rgbdslam_ros_ui>("rgbdslam_ros_ui");

And it works fine.

2014-01-28 17:30:09 -0500 marked best answer Weird TF tree when running RGB-D SLAM

Hey, I have the rgbdslam package installed, the version released at the end of February 2013.

I am running the rgbdslam node with the following frame parameters:

<param name="config/fixed_frame_name"      value="/map"/>
<param name="config/base_frame_name"       value="/base_footprint"/>

Then I send the model from the rgbdslam GUI to the octomap server using the CTRL+M command. The octomap server has the following frame parameters:

<param name="frame_id" type="string" value="/map" />
<param name="base_frame_id" value="/camera_rgb_optical_frame" />

Meanwhile, in another terminal I run "rosrun tf view_frames".

The resulting frame tree is displayed below. [image: screenshot of the tf frame tree]

As seen in the above screenshot, there is a loop in the tf tree which should not be there.

What could be a possible solution to the above problem? Any help would be much appreciated.

--EDIT-- @tfoote I was able to write a tf broadcaster that imitates the faulty tf tree. Here is a screenshot of the imitated faulty tree: [image: screenshot of the imitated tf tree]

The three code snippets that were used were:

      // Publishes "map" as a child of "base_footprint"
      broadcaster.sendTransform(
        tf::StampedTransform(
          tf::Transform(odom_quat, tf::Vector3(0.0, 0.0, 0.0)),
          ros::Time::now(), "base_footprint", "map"));
      r.sleep();

      // Publishes "base_footprint" as a child of "map"; together with the
      // first snippet this creates the loop between the two frames
      broadcaster.sendTransform(
        tf::StampedTransform(
          tf::Transform(odom_quat, tf::Vector3(0.0, 0.0, 0.0)),
          ros::Time::now(), "map", "base_footprint"));
      r.sleep();

      // Publishes "where_mocap_should_be" as a child of "map"
      broadcaster.sendTransform(
        tf::StampedTransform(
          tf::Transform(odom_quat, tf::Vector3(0.0, 1.0, 2.0)),
          ros::Time::now(), "map", "where_mocap_should_be"));
      r.sleep();
2014-01-28 17:29:03 -0500 marked best answer Using robot_pose_ekf and RGBDslam for path planning with Octomaps

I wanted to use robot_pose_ekf and RGB-D SLAM for path planning with octomaps. I have tried to integrate robot_pose_ekf with the rgbdslam package and added the following to the launch file:

  <node pkg="rgbdslam" type="rgbdslam" name="rgbdslam" cwd="node" required="true" output="screen"> 
    <param name="config/fixed_frame_name"      value="/map"/>
    <param name="config/base_frame_name"       value="/odom_combined"/> 
    <param name="config/fixed_camera"          value="false"/>
  </node>
  <node pkg="robot_pose_ekf" type="robot_pose_ekf" name="robot_pose_ekf">
    <param name="output_frame" value="/odom_combined"/>
    <param name="freq" value="300.0"/>
    <param name="sensor_timeout" value="1.0"/>  
    <param name="odom_used" value="false"/>
    <param name="imu_used" value="true"/>
    <param name="vo_used" value="true"/>
      <remap from="imu_data" to="Imu" />
  </node>

The above launch file produces the following tf tree: [image: tf tree]

The console output of the rgbdslam node is:

[ WARN] [1360005388.569379127, 1359982581.866228234]: Frame id /odom does not exist! Frames (10): Frame /camera_link exists with parent /base_footprint.
Frame /base_footprint exists with parent /odom_combined.
Frame /camera_rgb_optical_frame exists with parent /camera_rgb_frame.
Frame /camera_rgb_frame exists with parent /camera_link.
Frame /camera_depth_optical_frame exists with parent /camera_depth_frame.
Frame /camera_depth_frame exists with parent /camera_link.
Frame /odom_combined exists with parent /map.
Frame /map exists with parent /odom_combined.
Frame /where_mocap_should_be exists with parent /map.
 - No odometry available

How should I correct this problem?

Further, the /tf between /map and /odom_combined is only available when I send the model to the octomap server. How can I make it available online, so that I don't have to use the service provided by rgbdslam, and get a fixed publishing rate for the /tf? It should also be realtime, in the sense that it should not slow down as time passes. Any suggestions on a solution to this particular problem?
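For illustration, a rough sketch of the kind of workaround I have in mind: a small node that looks up the most recent /map -> /odom_combined transform whenever one is available and re-broadcasts it at a fixed rate. The node name, the 30 Hz rate, and the assumption that re-publishing the last known transform is acceptable are all placeholders, not a tested solution:

    #include <ros/ros.h>
    #include <tf/transform_listener.h>
    #include <tf/transform_broadcaster.h>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "map_odom_rebroadcaster");
      ros::NodeHandle n;

      tf::TransformListener listener;
      tf::TransformBroadcaster broadcaster;
      tf::StampedTransform last;      // last /map -> /odom_combined transform seen
      bool have_transform = false;

      ros::Rate rate(30.0);           // assumed re-publishing rate
      while (ros::ok())
      {
        try
        {
          // Grab the most recent transform (whenever rgbdslam has published one).
          listener.lookupTransform("/map", "/odom_combined", ros::Time(0), last);
          have_transform = true;
        }
        catch (tf::TransformException& ex)
        {
          // No (new) transform yet; keep re-publishing the last known one.
        }

        if (have_transform)
        {
          broadcaster.sendTransform(tf::StampedTransform(
              last, ros::Time::now(), "/map", "/odom_combined"));
        }
        rate.sleep();
      }
      return 0;
    }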

--------------Edit--------------

I just checked the rgbdslam configuration in the GUI, where all the params were set. I saw there are three frames that can be specified:

  1. Base frame

  2. Fixed frame

  3. Odom frame.

Could I initialize these frames with /base_footprint, /map and /odom_combined respectively to get a consistent tf tree and a good map estimate from the input images?

2014-01-28 17:28:52 -0500 marked best answer robot_pose_ekf coordinate frame

I have a visual odometry source that outputs a ROS odometry message in the camera's coordinate system, which is z-axis forward. I also have an IMU-based odometry source that gives data in the world coordinate frame, which is x-axis forward.
If I feed these two sources into the robot_pose_ekf package, which coordinate system will the published pose be in?

If this could cause a problem, should I convert the visual odometry to the world's x-axis-forward coordinate system? And what about the related published tf?

2014-01-28 17:28:51 -0500 marked best answer How to calculate covariance of visual odometry and Imu data

I have a visual odometry source derived from the Kinect using fovis, with IMU data coming from an Arduino. I combined the two sources using robot_pose_ekf. When I view the data in RViz in the odom_combined frame, I see significant changes when I change the covariance values of the two data sources: the tf computed between odom_combined and base_footprint varies significantly in terms of drift over a given time.

So how can I calculate the covariance values, or just the variances along the diagonal, to get sufficient accuracy in the odom_combined frame?
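For context, this is roughly where those values go; a minimal sketch of filling only the diagonal of the pose covariance in a nav_msgs/Odometry message (the variance numbers here are placeholders, not measured values):

    #include <nav_msgs/Odometry.h>

    // Fill the 6x6 pose covariance (row-major, order: x, y, z, roll, pitch, yaw)
    // with placeholder variances on the diagonal; off-diagonal terms stay zero.
    void fillPoseCovariance(nav_msgs::Odometry& odom)
    {
      const double var[6] = {0.01, 0.01, 0.01, 0.05, 0.05, 0.05};  // assumed values
      for (int i = 0; i < 36; ++i)
        odom.pose.covariance[i] = 0.0;
      for (int i = 0; i < 6; ++i)
        odom.pose.covariance[i * 6 + i] = var[i];
    }

The IMU message used by robot_pose_ekf has analogous orientation covariance fields that can be filled the same way.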

2014-01-28 17:28:48 -0500 marked best answer Tf from optical frame to odometry frame

I am acquiring point cloud and image data from the Kinect sensor. The coordinate frame of the acquired data is:

X axis--> right

Y axis--> downwards

Z axis--> forward

and the odometry (standard /map axes) has its coordinate axes as:

X axis--> forward

Y axis--> left

Z axis--> upwards

What would be the static transform between the two frames?
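For reference, a minimal sketch of how such a static transform could be broadcast with the tf API; the rotation RPY = (-pi/2, 0, -pi/2) is the usual way to go from an x-forward/y-left/z-up frame to an x-right/y-down/z-forward optical frame, while the frame names, zero translation and publishing rate are assumptions for illustration:

    #include <ros/ros.h>
    #include <tf/transform_broadcaster.h>
    #include <cmath>

    int main(int argc, char** argv)
    {
      ros::init(argc, argv, "optical_frame_static_tf");
      ros::NodeHandle n;
      tf::TransformBroadcaster broadcaster;

      // Rotation from x-forward/y-left/z-up to x-right/y-down/z-forward:
      // roll = -pi/2, pitch = 0, yaw = -pi/2.
      tf::Transform transform(
          tf::createQuaternionFromRPY(-M_PI / 2.0, 0.0, -M_PI / 2.0),
          tf::Vector3(0.0, 0.0, 0.0));  // assumed zero translation

      ros::Rate rate(10.0);
      while (ros::ok())
      {
        broadcaster.sendTransform(tf::StampedTransform(
            transform, ros::Time::now(), "base_link", "camera_rgb_optical_frame"));
        rate.sleep();
      }
      return 0;
    }

Equivalently, the static_transform_publisher tool takes the same rotation as its yaw pitch roll arguments, i.e. -1.5708 0 -1.5708.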

2014-01-28 17:28:43 -0500 marked best answer Use of Kinect as a Stereo camera for Visual Odometry

I wanted to use my Kinect camera with a stereo vision node that estimates visual odometry, the viso2_ros package. I use the openni_launch package to get Kinect data. According to the openni_launch package documentation (under the IR projector paragraph, Fuerte tab), I can use Kinect data for stereo nodes.

Now the problem is: what should I take as the left camera and what as the right camera? And accordingly, which input images?

I have tried various frames, but I always get: "[ WARN] [1358249627.907026303]: Visual Odometer got lost!"

2014-01-28 17:28:30 -0500 marked best answer Failed rosmake [pcl_ros] on ubuntu 11.10 ros-electric pandaboard

I have been working on Kinect sensor integration on a PandaBoard. Because I am using the PandaBoard, I removed "pcl_visualization" from the manifest.xml in the pcl directory of the perception_pcl stack in ROS.

When I rosmake the pcl_ros package I get the following error:

    Building CXX object src/tools/CMakeFiles/convert_pcd_to_image.dir/convert_pcd_to_image.o
    make[3]: Leaving directory `/home/panda/ros/perception_pcl/pcl_ros/build'
    [ 29%] Building CXX object CMakeFiles/pcl_ros_tf.dir/src/pcl_ros/transforms.o
    Building CXX object CMakeFiles/pcl_ros_surface.dir/src/pcl_ros/surface/convex_hull.o
    make[3]: Entering directory `/home/panda/ros/perception_pcl/pcl_ros/build'
    [ 31%] Building CXX object CMakeFiles/pcl_ros_features.dir/src/pcl_ros/features/feature.o
    /home/panda/ros/perception_pcl/pcl_ros/src/pcl_ros/features/feature.cpp:50:23: fatal error: pcl/io/io.h: No such file or directory
    compilation terminated.
    make[3]: *** [CMakeFiles/pcl_ros_features.dir/src/pcl_ros/features/feature.o] Error 1
    make[3]: Leaving directory `/home/panda/ros/perception_pcl/pcl_ros/build'
    make[2]: *** [CMakeFiles/pcl_ros_features.dir/all] Error 2
    make[2]: *** Waiting for unfinished jobs....
    [ 33%] Building CXX object CMakeFiles/pcl_ros_surface.dir/src/pcl_ros/surface/moving_least_squares.o
    /home/panda/ros/perception_pcl/pcl_ros/src/pcl_ros/transforms.cpp:38:23: fatal error: pcl/io/io.h: No such file or directory
    compilation terminated.
    In file included from /home/panda/ros/perception_pcl/pcl_ros/include/pcl_ros/surface/convex_hull.h:41:0,
                     from /home/panda/ros/perception_pcl/pcl_ros/src/pcl_ros/surface/convex_hull.cpp:39:
    /home/panda/ros/perception_pcl/pcl_ros/include/pcl_ros/pcl_nodelet.h:51:29: fatal error: pcl/point_types.h: No such file or directory
    compilation terminated.
    In file included from /home/panda/ros/perception_pcl/pcl_ros/include/pcl_ros/surface/moving_least_squares.h:41:0,
                     from /home/panda/ros/perception_pcl/pcl_ros/src/pcl_ros/surface/moving_least_squares.cpp:39:
    /home/panda/ros/perception_pcl/pcl_ros/include/pcl_ros/pcl_nodelet.h:51:29: fatal error: pcl/point_types.h: No such file or directory
    compilation terminated.
    /home/panda/ros/perception_pcl/pcl_ros/src/tools/convert_pcd_to_image.cpp:51:23: fatal error: pcl/io/io.h: No such file or directory
    compilation terminated.
    make[3]: *** [CMakeFiles/pcl_ros_surface.dir/src/pcl_ros/surface/convex_hull.o] Error 1
    make[3]: *** Waiting for unfinished jobs....
    make[3]: *** [CMakeFiles/pcl_ros_tf.dir/src/pcl_ros/transforms.o] Error 1
    make[3]: Leaving directory `/home/panda/ros/perception_pcl/pcl_ros/build'
    make[2]: *** [CMakeFiles/pcl_ros_tf.dir/all] Error 2
    make[3]: *** [CMakeFiles/pcl_ros_surface.dir/src/pcl_ros/surface/moving_least_squares.o] Error 1
    make[3]: Leaving directory `/home/panda/ros/perception_pcl/pcl_ros/build'
    make[2]: *** [CMakeFiles/pcl_ros_surface.dir/all] Error 2
    make[3]: *** [src/tools/CMakeFiles/convert_pcd_to_image.dir/convert_pcd_to_image.o] Error 1
    make[3]: Leaving directory `/home/panda/ros/perception_pcl/pcl_ros/build'
    make[2]: *** [src/tools/CMakeFiles/convert_pcd_to_image.dir/all] Error 2
    make[2]: Leaving directory `/home/panda/ros/perception_pcl/pcl_ros/build'
    make[1]: *** [all] Error 2
    make[1]: Leaving directory `/home/panda/ros/perception_pcl/pcl_ros/build'
    -------------------------------------------------------------------------------
    [ rosmake ] Output from build of package pcl_ros written to:
    [ rosmake ] /home/panda/.ros/rosmake/rosmake_output-20121220-190754/pcl_ros/build_output.log
    [rosmake-0] Finished <<< pcl_ros [FAIL] [ 63.39 seconds ]
    [ rosmake ] Halting due to failure in package ...

2014-01-28 17:28:21 -0500 marked best answer Tf frames of a mobile robot

I am working on a mobile robot which has a Kinect mounted on it along with an IMU unit.

I am confused about the various types of frames. Considering the above example, please tell me what the following frames are: 1. Map frame 2. Odom frame 3. Base frame.

I know that the transformation between the Kinect frame and the base frame is the transform that must be applied to the base frame to get the Kinect frame.

What I don't understand is what the odom frame is, what the /tf between the Kinect and the odom frame is, and how it is generated: is it from IMU readings, or from the SLAM output pose, as in gmapping for instance?

2014-01-28 17:26:57 -0500 marked best answer Kinect crashes on invoking command "rosbag record -a"

I initially had a problem where the Kinect would not start up again with "roslaunch openni_launch openni.launch" after it had crashed once.

For that problem I found a solution: kill XnSensorServer first and then run the above command again. That solved the problem without any conflicts.

Now the weird error that has cropped up is that when I use "rosbag record -a" (while all images supplied by the Kinect are being viewed in the image viewer), the images hang and the Kinect driver crashes, producing the following error:

    Error [1344683523.942576821]: Cannot stream RGB and IR at the same time. Streaming RGB only.
    OpenCV Error: Image step is wrong () in cvInitMatHeader, file /tmp/buildd/ros-fuerte-opencv2-2.4.2-0precise-20120704-1846/modules/core/src/array.cpp, line 162
    terminate called after throwing an instance of 'cv::Exception'
      what(): /tmp/buildd/ros-fuerte-opencv2-2.4.2-0precise-20120704-1846/modules/core/src/array.cpp:162: error: (-13) in function cvInitMatHeader

What could be a possible solution? Should I record a subset, recording only RGB or IR at a time? Does the solution mentioned at the start of this message (killing XnSensorServer) have something to do with this?