
moody's profile - activity

2019-03-28 10:10:32 -0500 received badge  Famous Question (source)
2017-08-02 08:22:02 -0500 commented answer Duo3d driver on Jetson TK1

Sorry for the really late reply: hope you figured it out yourself by now. Otherwise: downloaded the correct SDK from their webs…

2017-07-13 03:13:23 -0500 received badge  Notable Question (source)
2017-07-13 03:13:23 -0500 received badge  Popular Question (source)
2017-04-13 11:56:59 -0500 received badge  Student (source)
2017-04-13 11:56:57 -0500 received badge  Self-Learner (source)
2017-04-13 08:34:34 -0500 asked a question duo3d camera_info

Hello everybody.

I was just wondering: is it normal that the duo3d ROS driver does not publish anything on camera_info, or is something wrong with my setup?

2017-04-13 08:32:47 -0500 answered a question Duo3d driver on Jetson TK1

OK, I figured it out myself:

The problem was a wrong SDK in the devel folder of my workspace. For some reason the driver had automatically downloaded the SDK for amd64. The solution was to replace the wrong SDK with the one needed for armhf.
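For reference, a quick way to check which architecture a prebuilt library actually targets (the path below is a guess based on the build error above; adjust it to your workspace):

```shell
# Hypothetical path, taken from the linker error; adjust to your workspace.
LIB="$HOME/catkin_ws/src/duo3d_driver/lib/libDUO.so"

# 'file' prints the ELF target: on a Jetson TK1 it must say ARM,
# not x86-64. A mismatch is what produces
# "file not recognized: File format not recognized" at link time.
file "$LIB" 2>/dev/null || echo "library not found at $LIB"

# Architecture the SDK has to match on the build host:
uname -m
```

If the two architectures disagree, swapping the library as described above is the fix.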

2017-04-07 07:14:17 -0500 asked a question Duo3d driver on Jetson TK1

Hello everybody.

I have searched for a really long time but I can't figure out what the problem is.

I am trying to install the ROS driver package for the duo3d camera (http://wiki.ros.org/duo3d-ros-driver) on the Jetson TK1 board. But every time I run catkin_make I get this error output:

...catkin_ws/src/duo3d_driver/lib/libDUO.so: file not recognized: File format not recognized
collect2: error: ld returned 1 exit status
make[2]: *** [...catkin_ws/devel/lib/duo3d_driver/duo3d_driver] Error 1
make[1]: *** [duo3d_driver/CMakeFiles/duo3d_driver.dir/all] Error 2
make: *** [all] Error 2
Invoking "make -j1 -l1" failed

Maybe some of you could help me with this problem :-)

2016-08-02 13:05:34 -0500 received badge  Famous Question (source)
2016-03-22 19:12:05 -0500 received badge  Notable Question (source)
2016-03-15 11:15:18 -0500 received badge  Notable Question (source)
2016-03-15 11:15:18 -0500 received badge  Famous Question (source)
2016-03-15 11:15:18 -0500 received badge  Popular Question (source)
2016-02-12 06:09:52 -0500 received badge  Popular Question (source)
2016-02-11 09:52:31 -0500 asked a question viso2 tf_tree problem

Hello to everybody.

I want to use the mono odometer from viso2 for odometry in robot_localization. The first step, which is already not working, is the visual odometry. I always get the following error message:

[ WARN] [1455205411.959462457]: The tf from 'base_link' to 'camera' does not seem to be available, will assume it as identity!

I have already searched for two days for a solution, and found out that I have to use a static transform, which I already did: I use a static transform for odom->base_link.

My launch file looks like this:

<launch>

<!-- **********Run the ROS package image_proc********** -->
    <node ns="camera" pkg="image_proc" type="image_proc" name="image_proc">
    </node>

<!-- **********Run the viso2_ros package********** -->
<node pkg="viso2_ros" type="mono_odometer" name="mono_odometer" output="screen">
    <remap from="image" to="camera/image_rect"/>
    <param name="odom_frame_id" value="odom"/>
    <param name="base_link_frame_id" value="base_link"/>
    <param name="publish_tf" value="true"/>
    <param name="invert_tf" value="false"/>

    <!-- Matcher params -->
    <param name="nms_n" value="3" /><!-- default:3 Minimum distance between maxima in pixels for non-maxima-suppression 2-->
    <param name="nms_tau" value="50" /><!-- default:50 Interest point peakiness threshold 60-->
    <param name="match_binsize" value="50" /><!-- default:50 Matching width/height (affects efficiency only)  60-->
    <param name="match_radius" value="200" /><!-- default:200 Matching radius (du/dv in pixels)  110-->
    <param name="match_disp_tolerance" value="2" /><!-- default:2 dv tolerance for stereo matches (in pixels). 2-->
    <param name="outlier_disp_tolerance" value="5" /><!-- default:5 Disparity tolerance for outlier removal (in pixels).  10-->
    <param name="outlier_flow_tolerance" value="5" /><!-- default:5 Flow tolerance for outlier removal (in pixels).  5-->
    <param name="multi_stage" value="1" /><!-- default:1 0=disabled, 1=multistage matching (denser and faster).  -->
    <param name="half_resolution" value="1" /><!-- default:1 0=disabled, 1=match at half resolution, refine at full resolution. -->
    <param name="refinement" value="1" /><!-- default:1 0=none, 1=pixel, 2=subpixel.  -->

    <!-- Bucketing params -->
    <param name="max_features" value="2" /><!-- default:2 Maximum number of features per bucket 15-->
    <param name="bucket_width" value="50.0" /><!-- default:50.0 Width of the bucket 60-->
    <param name="bucket_height" value="50.0" /><!-- default:50.0 Height of the bucket 60-->

    <!-- Mono params -->
    <param name="camera_height" value="1.0" /><!-- Height of the camera above the ground in meters. -->
    <param name="camera_pitch" value="0.0" /><!-- Pitch of the camera in radiants, negative pitch means looking downwards.  -->
    <param name="ransac_iters" value="2000" /><!-- Number of RANSAC iterations.  -->
    <param name="inlier_threshold" value="0.00001" /><!-- Fundamental matrix inlier threshold.  -->
    <param name="motion_threshold" value="100.0" /><!-- Threshold for stable fundamental matrix estimation.  -->
</node>
<node pkg="tf" type="static_transform_publisher" name="camera_broadcaster" args="0 0 0 1.5708 -1.5708 0 base_link camera 150" />

</launch>

I need the transform to orient the camera in the right direction. If I don't use the tf and the identity is assumed, I can only use the VO with the camera pointing at the ceiling.

Maybe I am missing or overlooking something.

Thanks for any help.
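In case it helps someone debug the same warning: a quick check whether the transform is actually available on /tf (frame names taken from the warning above; this assumes a sourced ROS environment and a running roscore):

```shell
# Print the base_link -> camera transform for a few seconds.
# If nothing is printed, the static_transform_publisher is not running,
# or the frame names do not match the ones in the warning.
if command -v rosrun >/dev/null 2>&1; then
  timeout 5 rosrun tf tf_echo base_link camera
else
  echo "ROS environment not sourced; run this where roslaunch runs"
fi
```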

2016-02-10 11:24:40 -0500 asked a question viso2 stereo odometry with sample bagfile

Hello and good evening,

I am trying to run the stereo odometer from the viso2 package with the bag file "test.bag" provided on the viso2 ROS wiki page. I remap the topics in rosbag play to match my launch file:

rosbag play -l test.bag /stereo_forward/left/image_raw:=/camera/left/image_raw /stereo_forward/right/image_raw:=/camera/right/image_raw /stereo_forward/left/camera_info:=/camera/left/camera_info /stereo_forward/right/camera_info:=/camera/right/camera_info

After that I start my launch file via

roslaunch stereo_VISO2_test.launch

<launch>

<arg name="camera" default="camera" /> <!-- The namespace where images are published -->

<param name="/use_sim_time" value="false"/>

<!-- **********Run the ROS package stereo_image_proc********** -->
<group ns="$(arg camera)" >
    <node pkg="stereo_image_proc" type="stereo_image_proc" name="stereo_image_proc">
    </node>
</group>

<!-- **********Run the viso2_ros package********** -->
<node pkg="viso2_ros" type="stereo_odometer" name="stereo_odometer" output="screen">
    <remap from="stereo" to="$(arg camera)"/>
    <remap from="image" to="image_rect"/>
    <param name="odom_frame_id" value="odom"/>
    <param name="base_link_frame_id" value="base_link"/>
    <param name="publish_tf" value="true"/>
</node>

</launch>

The problem I am facing now: if I look in rviz at the motion published in the odom->base_link tf, the motion is really twitchy and does not match the movement in the image_rect images at all. I already checked the disparity of the rectified images and it looks good (a really nice camera was probably used :-)).

Can somebody help me with the parameters for the provided bag file, or with another setting I might be missing? Thanks in advance for the help, and excuse my bad spelling/language; I am doing my best :-)
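One setting worth double-checking when replaying a bag (a guess on my side, not a confirmed fix): with /use_sim_time set to false, the nodes stamp and look up data using wall-clock time while the bag carries the recorded timestamps, which can cause exactly this kind of jumpy odometry. The launch file above could instead set:

```xml
<!-- Use the clock published by rosbag instead of wall-clock time -->
<param name="/use_sim_time" value="true"/>
```

and the bag would then be replayed with `rosbag play --clock -l test.bag ...` so that the /clock topic is actually published.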

2015-10-06 06:08:05 -0500 received badge  Enthusiast
2015-09-24 08:04:58 -0500 received badge  Scholar (source)
2015-09-21 04:54:07 -0500 received badge  Famous Question (source)
2015-09-21 02:29:43 -0500 received badge  Teacher (source)
2015-09-21 02:29:43 -0500 received badge  Self-Learner (source)
2015-09-21 01:45:01 -0500 answered a question [SOLVED] xsens imu with robot_localization

Hello to everyone,

I solved the problem myself. In my tf tree the transform was /imu_link->/base_footprint and not, as requested, /imu->/base_footprint. The requested frame was hardcoded in a script that filters the IMU data, and the lookup could not find /imu because that frame was not linked into the tf tree.

So I changed the frame in the script to /imu_link and everything worked nicely.
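For anyone who hits the same warning but cannot edit the script: an alternative sketch (untested; frame names taken from the warning above) is to publish an identity transform so the hardcoded /imu frame resolves in the tf tree:

```xml
<!-- Identity transform aliasing imu_link as imu
     (args: x y z yaw pitch roll parent_frame child_frame period_in_ms) -->
<node pkg="tf" type="static_transform_publisher" name="imu_alias"
      args="0 0 0 0 0 0 imu_link imu 100" />
```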

Thanks for your help though :-)

Please mark the question as solved. I don't have enough karma to do anything on here...

2015-09-21 01:41:29 -0500 received badge  Notable Question (source)
2015-09-18 01:16:02 -0500 received badge  Popular Question (source)
2015-09-17 06:36:41 -0500 received badge  Editor (source)
2015-09-17 02:25:44 -0500 asked a question [SOLVED] xsens imu with robot_localization

Hello everybody. I am still new to ROS and have already tried for a really long time to find a solution to my problem, but nothing I tried really helped.

The problem is that I want to connect an Xsens MTi IMU to my robot but cannot get it to work under amcl together with the odometry.

I tried to integrate the IMU into robot_localization, and also tried commenting out odom0 and using only the imu0 data. The point is that odom0 works nicely and robot_localization publishes data on /odom and /odometry/filtered. But as soon as I comment out the odom0 parts and use only imu0, /odom keeps publishing, but /odometry/filtered, which I think is published by robot_localization.launch, shows no data at all: not from the odometry and not from the IMU.

You might need to know that my robot has a laser attached which, I think, has until now computed a kind of odometry to use together with amcl. If I am wrong, please correct me. The laser is not used under robot_localization, only under amcl. I also tried turning the laser off to see whether the IMU works together with the odometry, but could not find a way to be sure about this, because as far as I can tell my IMU is not used in amcl.

The IMU itself works nicely: I can see the data published by the Xsens on the topic /imu/data. But as soon as I try to use /imu/data as the value for imu0 in my robot_localization configuration, I get this error message:

WARNING: failed to transform from /imu->base_footprint for imu0 message received at 1442471593.559769106. No transform exists from source to target frame.

So my guess now is that the IMU is not used by robot_localization unless I push /imu/data to it, i.e. unless I use /imu/data as the value for imu0. Again, if I am wrong about this, please let me know.

If you need more information to find a solution, please let me know what to collect and how.

Thank you very much for your help, and excuse me if I am confusing you or making my problem hard to understand :-)

2015-09-17 02:25:43 -0500 asked a question Xsens IMU problem
