K7's profile - activity

2023-02-24 03:52:29 -0500 received badge  Favorite Question (source)
2020-02-19 03:54:24 -0500 received badge  Favorite Question (source)
2019-08-28 06:46:29 -0500 received badge  Good Question (source)
2018-10-23 19:19:19 -0500 received badge  Nice Question (source)
2017-09-25 01:02:23 -0500 received badge  Famous Question (source)
2017-06-15 00:37:53 -0500 received badge  Famous Question (source)
2017-04-20 13:24:18 -0500 marked best answer How to add more AR markers to ar_sys?

I was wondering if anyone knows how to add more AR markers and boards to ar_sys: http://wiki.ros.org/ar_sys

Is this possible?
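For reference, this is the kind of launch snippet I have in mind, based on my reading of the wiki (the single_board node, the board_config parameter, and the board.yml path are my guesses from the docs, not something I have confirmed works):

<node pkg="ar_sys" type="single_board" name="ar_single_board" output="screen">
  <param name="board_config" value="$(find ar_sys)/data/single/board.yml"/>
  <param name="marker_size" value="0.1"/>
</node>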

2017-02-17 15:23:14 -0500 marked best answer ethzasl_sensor_fusion only works first time

The strangest thing happened at the end of the ethzasl_sensor_fusion getting started tutorial: http://wiki.ros.org/ethzasl_sensor_fu...

I followed the instructions and then ran the launch file. I see the graph updating as expected (viewed in rqt_plot). I then let the bag file run to the end. So far so good, I think, but when I try to run it again it no longer works.

I have tried reinitializing everything, over and over. The graph now just flatlines. I can't understand why it would work only once when nothing has changed.

The only thing I can think of that may be different each time is the moment at which I initialize the filter. This is always different, as I stop the bag file after running it for approximately one second and then initialize as per the instructions.

I either get these errors:

=== ERROR === update: NAN at index 0

=== ERROR === prediction p: NAN at index 0

Or I get this message:

[ WARN] [1441884003.214357239]: large time-gap re-initializing to last state

And sometimes this:

[ WARN] [1441884003.679369485]: Negative scale detected: -0.39217. Correcting to 0.1

Which one I get differs depending on where the 'play head' is when I initialize.
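In case it is relevant, this is how I have been replaying the data. I have not tried simulated time yet; using /use_sim_time with the --clock flag is only my guess at something that could interact with the time-gap warning, and dataset.bag stands in for the tutorial bag file:

$ rosparam set use_sim_time true
$ rosbag play dataset.bag --clock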

I am running Indigo on Ubuntu 14.04.

2016-10-20 08:35:42 -0500 received badge  Taxonomist
2016-09-13 05:22:50 -0500 marked best answer robot_localization tf tree structure

I am having trouble getting robot_localization to work.

I have read through the robot_localization wiki, the tf wiki and the REP-105/103 doc but I still don't understand how to structure the tree for robot_localization.

I have 2 frames_ids from IMU and PTAM messages.

Concepts I don't understand:

  1. Where do the map, odom, and base_link frames come from? Do I generate them using tf as a virtual representation of the real-world sensors and their relative positions and orientations?
  2. If so, how do I link the supplied frames to the tree I create? The doc says map -> odom -> base_link. Do I also link base_link -> imu and odom -> ptam?
  3. Finally, do I supply robot_localization with the odom and base_link frames, or with the imu and ptam frames?

The only way I have ever managed to get robot_localization to publish messages on the /odometry/filtered topic is to use the sensor frames directly, but this doesn't seem like the correct way to do it.

Any example code or step-by-step instructions would be great. The docs have not cleared up my confusion; I just don't understand how to piece it all together.
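To make this concrete, here is roughly the launch config I imagine, pieced together from the wiki; the frame values and the imu0/pose0 wiring are my assumptions, not a tested setup (the topic names are the ones from my own sensors):

<node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization">
  <param name="map_frame" value="map"/>
  <param name="odom_frame" value="odom"/>
  <param name="base_link_frame" value="base_link"/>
  <param name="world_frame" value="odom"/>
  <param name="imu0" value="/mavros/imu/data"/>
  <param name="pose0" value="/vslam/pose"/>
</node>

Is something like this what the docs intend, with the tf transforms for the sensor offsets published separately?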

2016-08-04 09:07:44 -0500 received badge  Famous Question (source)
2016-07-06 03:57:20 -0500 received badge  Notable Question (source)
2016-06-27 20:32:59 -0500 marked best answer robot_localization how to set up tf

My understanding of a fundamental concept is way off somewhere, and I was hoping someone could set me straight :)

I am trying to use robot_localization to fuse PTAM+IMU.

I have read through the docs and tutorials for the robot localization package but I am still new to ROS so I am having trouble understanding how tf works with robot_localization.

I have an IMU topic publishing from a pixhawk via mavros: /mavros/imu/data

and I also have a ptam topic: /vslam/pose

Let's say that the orientations of both sensors are aligned, with a positional offset of 50 cm on the y axis.

I am guessing that I am now supposed to set up a tf system in code that represents the physical model (with the 50 cm offset) and then broadcast it so that robot_localization can use it. Is that correct?
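Something like this static transform is what I mean, if that helps clarify the question (a sketch only: I am assuming tf's static_transform_publisher with its x y z yaw pitch roll parent child period_ms argument order, and that fcu should hang off base_link):

<node pkg="tf" type="static_transform_publisher" name="base_link_to_fcu" args="0 0.5 0 0 0 0 base_link fcu 100"/>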

Or am I supposed to use the frame_ids provided by the sensors?

Also, if anyone knows of any step-by-step tutorials for something like this, please let me know. Thanks!

EDIT:

Ok, so I tried using the frame_ids from the sensor messages and put those in the launch file for robot_localization: usb_cam is the frame_id from /vslam/pose and fcu is from /mavros/imu/data. I'm not using a map frame.

<param name="odom_frame" value="usb_cam"/>

<param name="base_link_frame" value="fcu"/>

<param name="world_frame" value="usb_cam"/>

Now robot_localization publishes to the /odometry/filtered topic. When I view the tf tree in rviz it doesn't look right, but I am thinking that I have not aligned the axes correctly?

I've been trying to get this right, but I'm still not sure whether this is even the right way to use robot_localization?!

2016-06-27 08:10:29 -0500 received badge  Popular Question (source)
2016-05-29 20:27:53 -0500 commented answer Test kinect:No devices connected.... waiting for devices to be connected

Sorry, that's about as much as I can help. If I had a Kinect v2 I could troubleshoot; I've been meaning to get one, which is why I am curious to see how this went. ROSkinect has not yet confirmed beyond doubt that freenect v1 works with the Kinect v2. If you could confirm this, it would help.

2016-05-29 06:59:44 -0500 commented answer Test kinect:No devices connected.... waiting for devices to be connected

Should? Have you tested with both?

2016-05-29 05:12:45 -0500 commented answer Test kinect:No devices connected.... waiting for devices to be connected

Is this with Kinect v2?

2016-05-29 04:19:12 -0500 commented answer Test kinect:No devices connected.... waiting for devices to be connected

What do you mean by 'not running perfectly'? If you have not calibrated your Kinect you may get some warnings, but it should still work. Are you able to see a point cloud in rviz?

2016-05-29 03:47:30 -0500 answered a question rosrun rviz rviz segment fault

If you look at the install instructions for Indigo on Ubuntu ARM, you'll see notes for rviz at the end (section 3.11): http://wiki.ros.org/indigo/Installation/UbuntuARM

2016-05-29 03:01:21 -0500 edited answer Test kinect:No devices connected.... waiting for devices to be connected

What do you see when you type lsusb into the terminal? For a Kinect v1 you should see three Microsoft Corp. devices in the list:

Microsoft Corp. Xbox NUI Camera

Microsoft Corp. Xbox NUI Motor

Microsoft Corp. Xbox NUI Audio
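For comparison, on my machine those show up in lsusb roughly like this (the bus and device numbers will differ; the 045e IDs are from my Kinect v1):

Bus 001 Device 021: ID 045e:02b0 Microsoft Corp. Xbox NUI Motor
Bus 001 Device 022: ID 045e:02ad Microsoft Corp. Xbox NUI Audio
Bus 001 Device 023: ID 045e:02ae Microsoft Corp. Xbox NUI Camera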

If they are there, then try: $ roslaunch freenect_launch freenect.launch depth_registration:=true

I'm not sure about the Kinect v2, but have a look here: https://github.com/code-iai/iai_kinect2

2016-05-29 02:58:59 -0500 commented answer Test kinect:No devices connected.... waiting for devices to be connected

If you have a Kinect v1 try it with that and see if it works.

2016-05-29 02:55:47 -0500 commented answer Test kinect:No devices connected.... waiting for devices to be connected

This may be an issue with the Kinect v2. I have not used it before, so I can't help beyond this. I would start by making sure that freenect actually works with the Kinect v2; check whether the lib supports it.

2016-05-28 23:29:12 -0500 commented answer Test kinect:No devices connected.... waiting for devices to be connected

After typing rostopic list, do you see any topics under /camera? There should be a whole bunch of them (e.g. /camera/depth_registered/points).

2016-05-28 23:27:06 -0500 commented answer Test kinect:No devices connected.... waiting for devices to be connected

It looks like it might be there, although it is strange that they do not say Camera or Audio after Microsoft Corp. I have not used the Kinect v2; it could be another device (like a mouse).

Are you sure freenect is not publishing any topics? Type $ rostopic list (after running freenect)