
nvoltex's profile - activity

2021-01-31 02:58:43 -0500 marked best answer Rosserial_arduino use on an arduino uno with IMU (I2C/Wire library)

Hey! I have been testing rosserial_arduino ( http://wiki.ros.org/rosserial ) in order to run a ROS node on an Arduino. I tested some of the examples and thought I had the hang of it. (In case it's useful: I'm running Ubuntu 14.04 on a MacBook Pro.)

However, I'm unable to get the Arduino node to publish information from the IMU connected to the Arduino Uno. The IMU is the MPU9150 and I'm using an implementation of the I2Cdev library found here: https://github.com/richards-tech/MPU9... (I tried a different library, to check whether the problem was specific to this one, but ended up with the same issue). If I only use the MPU9150 library, that is, if I don't try to use rosserial, I'm able to get the IMU data on the Arduino and print it on the serial monitor. However, if I try to use rosserial, I'm unable to make the node work.

When I run the serial_node from rosserial I get the following output:

nmirod@nmirod:/$ rosrun rosserial_python serial_node.py _port:=/dev/ttyACM1 _baud:=57600
[INFO] [WallTime: 1422735802.267338] ROS Serial Python Node
[INFO] [WallTime: 1422735802.270908] Connecting to /dev/ttyACM1 at 57600 baud
/home/nmirod/catkin_ws/src/rosserial/rosserial_python/src/rosserial_python/SerialClient.py:336: SyntaxWarning: The publisher should be created with an explicit keyword argument 'queue_size'. Please see http://wiki.ros.org/rospy/Overview/Publishers%20and%20Subscribers for more information.
self.pub_diagnostics = rospy.Publisher('/diagnostics', diagnostic_msgs.msg.DiagnosticArray)
[ERROR] [WallTime: 1422735819.375623] Unable to sync with device; possible link problem or link software version mismatch such as hydro rosserial_python with groovy Arduino

Notice that the SyntaxWarning appears even with the tutorial examples. After some testing, the problem seems to be related to the use of the Wire library: if I comment out the functions that perform the initialization and reading of the IMU, I'm able to get a msg on the desired topic (although a constant one). However, if I try to run the node as it is, I get the "Unable to sync with device" error.

Is this problem associated with the "Wire library"/I2C communication? Can you help me out?

EDIT: Is it possible to use Serial.print() in combination with rosserial? When I wrote this sketch I was careful to remove all debug prints, to be sure they wouldn't scramble the communication with ROS. However, when I was out of options I decided to add some Serial.print() calls for debugging, and it seems that calling Serial.begin(57600) solved the problem (same baud rate as the node).

Although the problem seems to be solved, I would still like to understand what's going on, so that if something similar happens down the road I know what to do.

EDIT2: Here is the code:

#include "ros.h"
#include "rospy_tutorials/Floats.h"

#include "freeram.h"
#include "mpu.h"
#include "I2Cdev.h"

ros::NodeHandle  nh;

float aux[] = {9, 9, 9, 9 ...
(more)
2021-01-31 02:58:42 -0500 received badge  Teacher (source)
2021-01-31 02:58:42 -0500 received badge  Self-Learner (source)
2020-11-27 06:58:04 -0500 received badge  Favorite Question (source)
2019-08-26 14:19:51 -0500 received badge  Good Question (source)
2018-12-03 20:06:56 -0500 received badge  Stellar Question (source)
2018-02-08 09:57:18 -0500 received badge  Notable Question (source)
2018-02-08 09:57:18 -0500 received badge  Popular Question (source)
2018-01-24 19:52:41 -0500 marked best answer Optical flow with bottom camera on aerial vehicle

I'm interested in obtaining an estimation of the linear velocity of the vehicle using optical flow, so which are the best packages for optical flow using the bottom camera of an aerial vehicle? (the camera is a simple webcam).

Or is the best approach to implement optical flow directly with OpenCV?

Notice that I am using an Odroid X2 (ARM) on the aerial vehicle, so I am looking for something that won't be too demanding on the processor.
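For context on what such a package does under the hood, here is a minimal, pure-Python sketch (no ROS or OpenCV; all names are illustrative) of translation-only optical flow by brute-force block matching. Real packages use pyramidal Lucas-Kanade or similar, much cheaper methods; this only shows the principle of recovering the inter-frame pixel shift, which is then scaled by altitude/focal length and frame rate into a linear velocity:

```python
def estimate_shift(prev, curr, max_shift=2):
    """Brute-force translation estimate: the (dx, dy) minimizing the
    sum of squared differences between consecutive frames."""
    h, w = len(prev), len(prev[0])
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = 0
            # compare only the region where the shifted frames overlap
            for y in range(max(0, dy), min(h, h + dy)):
                for x in range(max(0, dx), min(w, w + dx)):
                    err += (curr[y][x] - prev[y - dy][x - dx]) ** 2
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

# Synthetic 8x8 frame, and the same frame shifted one pixel to the right.
frame1 = [[x * x + 3 * y for x in range(8)] for y in range(8)]
frame2 = [[frame1[y][x - 1] if x > 0 else 0 for x in range(8)] for y in range(8)]

dx, dy = estimate_shift(frame1, frame2)  # -> (1, 0): one pixel to the right
# linear velocity ~ pixel_shift * (altitude / focal_length_px) * frame_rate
```

The altitude-and-focal-length scaling at the end is the part where a downward-facing camera setup needs the vehicle's height estimate (e.g. from a barometer or rangefinder).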

2017-11-13 14:19:02 -0500 received badge  Nice Question (source)
2017-07-01 05:13:57 -0500 received badge  Nice Question (source)
2017-03-22 14:34:22 -0500 received badge  Favorite Question (source)
2016-07-18 10:18:12 -0500 received badge  Famous Question (source)
2016-07-08 10:59:23 -0500 received badge  Popular Question (source)
2016-05-27 07:15:54 -0500 received badge  Famous Question (source)
2016-04-09 05:51:54 -0500 received badge  Notable Question (source)
2016-03-31 10:43:05 -0500 asked a question webcam on exterior conditions

Hey there,

Firstly, I'm aware this is not a completely ROS-related question; however, I have used this web camera outdoors before with no problems, so I'm wondering if I might be missing something in my setup.

Basically, the problem is the "saturation" of the image outdoors (due to daylight): the image comes out either completely white or really "whitish". I'm using the usb_cam ROS package to get the images from the camera.

The launch file I'm using right now is:

<launch>
        <node name="usb_cam" pkg="usb_cam" type="usb_cam_node">
            <param name="video_device" value="/dev/video1"/>
            <param name="image_width"  value="640"   />
            <param name="image_height" value="480"   />
            <param name="pixel_format" value="mjpeg" />
            <param name="auto_focus"   value="true"  /> 
            <param name="framerate"    value="30"    /> 
            <param name="camera_frame_id" value="/camera" />
        </node>
</launch>

Initially I didn't have auto_focus enabled, but added it in an attempt to solve the problem. The camera is a Creative Live! Cam Socialize HD 1080. I have used this camera outdoors in pretty similar conditions with no problems, but I can't seem to get it to work now. Am I missing something?
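Washed-out outdoor images are usually an auto-exposure problem rather than a focus one. If your usb_cam version exposes the V4L2 exposure controls (the parameter names below come from the usb_cam wiki and may differ between versions), a sketch of a launch file with exposure locked manually would look like:

    <launch>
            <node name="usb_cam" pkg="usb_cam" type="usb_cam_node">
                <param name="video_device" value="/dev/video1"/>
                <param name="image_width"  value="640"   />
                <param name="image_height" value="480"   />
                <param name="pixel_format" value="mjpeg" />
                <!-- disable auto-exposure and fix a short exposure for daylight -->
                <param name="autoexposure" value="false" />
                <param name="exposure"     value="100"   />
                <param name="framerate"    value="30"    />
                <param name="camera_frame_id" value="/camera" />
            </node>
    </launch>

Outside ROS, `v4l2-ctl -d /dev/video1 -l` lists the controls this particular camera actually supports, which is a quick way to find a working exposure value before wiring it into the launch file.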

Once again, sorry for the question not being totally ROS-related, but I'm pretty lost on this one. Thanks in advance.

2016-03-31 10:10:51 -0500 received badge  Popular Question (source)
2016-03-22 21:44:34 -0500 received badge  Famous Question (source)
2016-03-10 15:08:31 -0500 commented question image_view symbol lookup error

You are right, I'm sorry I forgot to add the command. Thanks for warning me.

2016-03-10 06:55:38 -0500 asked a question image_view symbol lookup error

Hi there,

Whenever I try to use image_view to visualize a topic published by usb_cam with the following command:

rosrun image_view image_view image:=/usb_cam/image_raw

I get this:

init done 
opengl support available 
[ INFO] [1457614339.953957839]: Using transport "compressed"
/opt/ros/indigo/lib/image_view/image_view: symbol lookup error: /opt/ros/indigo/lib/image_view/image_view: undefined symbol: _ZN9cv_bridge18cvtColorForDisplayERKN5boost10shared_ptrIKNS_7CvImageEEERKSsbdd

The same happens even with the "raw" transport:

init done 
opengl support available 
[ INFO] [1457614438.464431134]: Using transport "raw"
/opt/ros/indigo/lib/image_view/image_view: symbol lookup error: /opt/ros/indigo/lib/image_view/image_view: undefined symbol: _ZN9cv_bridge18cvtColorForDisplayERKN5boost10shared_ptrIKNS_7CvImageEEERKSsbdd

Am I doing something wrong? Has anyone seen similar errors?

Thanks in advance.
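One diagnostic step (not from the original thread, but generally applicable): the mangled name in the error can be decoded with c++filt, which makes the mismatch easier to read:

```shell
# Demangle the symbol image_view is failing to find in cv_bridge
c++filt _ZN9cv_bridge18cvtColorForDisplayERKN5boost10shared_ptrIKNS_7CvImageEEERKSsbdd
```

It demangles to cv_bridge::cvtColorForDisplay(...), a function image_view expects but the installed cv_bridge does not provide — which suggests image_view was built against a newer cv_bridge than the one installed. Upgrading both packages to matching versions (for example with apt on Indigo: `sudo apt-get update && sudo apt-get install --only-upgrade ros-indigo-cv-bridge ros-indigo-image-view`) is the usual fix; that interpretation is an assumption based on the symbol name alone.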

2016-02-10 11:28:33 -0500 asked a question Correct setup for hector_object_tracker

I'm trying to use hector_object_tracker, to track some targets from aerial images (captured from a down facing camera on an aerial vehicle).

I already have a node for detection of the targets as well as extracting their position to the world reference frame.

From what I understood of the wiki page, I should publish this information to the "worldmodel/pose_percept" topic, which uses messages of the type: "hector_worldmodel_msgs/PosePercept".

My problem is with the PerceptInfo msg: since I only have one type of target, the class information is easy to fill in (using class_support=1). However, I don't understand how I should handle the object information in that message. Furthermore, from inspection of the hector_heat_detection node, that section doesn't seem to be filled in at all.

Thanks in advance

2016-01-28 10:24:24 -0500 commented answer Multiple static target position estimate

One of my problems is that I can't seem to understand how to set up the package properly, and there are no tutorials or extra information. Do you know of a place where I can read more in depth about that package?

2016-01-28 10:12:58 -0500 received badge  Notable Question (source)
2016-01-23 07:28:22 -0500 commented answer Multiple static target position estimate

What I am looking for is more a solution to the position estimation part (noise/outlier filtering and estimation, to be precise), since I already detect the targets and convert the detected positions to the world reference frame. However, hector_object_tracker doesn't seem to give access to the filtering side alone.

2016-01-23 07:21:47 -0500 received badge  Popular Question (source)
2016-01-22 12:11:55 -0500 asked a question Multiple static target position estimate

An aerial vehicle captures images of the ground, using a down facing camera, in order to detect some static ground targets. Using opencv I created a node that detects those targets and converts their position from the image frame to the world frame. Since all targets are on the ground, each sample is a 3D vector (x,y,0). Note: Using robot_localization I also have the attitude and heading of the aerial vehicle.

What's the best approach to estimate the position of the targets from multiple readings?

Is there a package that implements something to this effect? This is basically a problem of noise removal and estimation, so one of the possible strategies is clustering.
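As a reference point for the "noise removal and estimation" part, here is a minimal sketch (plain Python, illustrative names, not from any ROS package) of a robust per-target estimate: once samples are associated with a target, the coordinate-wise median tolerates occasional outlier detections much better than the mean does.

```python
import statistics

def estimate_position(samples):
    """Robust position estimate from repeated noisy (x, y, z) detections
    of the same static target: coordinate-wise median."""
    xs, ys, zs = zip(*samples)
    return (statistics.median(xs),
            statistics.median(ys),
            statistics.median(zs))

readings = [(2.0, 3.1, 0.0), (2.1, 2.9, 0.0), (1.9, 3.0, 0.0),
            (2.0, 3.0, 0.0), (9.5, -4.0, 0.0)]  # last one is an outlier
print(estimate_position(readings))  # -> (2.0, 3.0, 0.0)
```

With a sensor covariance available (e.g. from robot_localization), a per-target Kalman filter would be the more principled version of the same idea; the median is just the cheapest robust baseline.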

2016-01-19 09:46:42 -0500 asked a question Clustering of ground targets detected using opencv (within ROS)

Using OpenCV I created a node that detects ground targets and converts their positions to the world reference frame. Each sample is a 3D vector (x, y, 0).

Initially, I was using a really simple strategy where I assumed that if a new sample was within a certain range of another, it belonged to the same ground target.

However, I would like to test some clustering algorithms to group the samples into detected targets. How should I do it inside the ROS environment? Are there any packages available that do something similar to what I want to achieve? Any kind of direction would be appreciated. Thanks
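For reference, the simple range-based strategy described above can be written as a greedy clustering in a few lines (plain Python, hypothetical names): each new sample joins the first cluster whose centroid is within `radius`, otherwise it starts a new cluster.

```python
import math

def cluster(samples, radius=1.0):
    """Greedy online clustering of (x, y, z) detections by centroid distance."""
    clusters = []  # each entry: [sum_x, sum_y, sum_z, count]
    for x, y, z in samples:
        for c in clusters:
            cx, cy, cz = c[0] / c[3], c[1] / c[3], c[2] / c[3]
            if math.dist((x, y, z), (cx, cy, cz)) <= radius:
                c[0] += x; c[1] += y; c[2] += z; c[3] += 1
                break
        else:  # no cluster close enough: start a new one
            clusters.append([x, y, z, 1])
    return [(c[0] / c[3], c[1] / c[3], c[2] / c[3]) for c in clusters]

detections = [(0.0, 0.0, 0.0), (0.2, 0.1, 0.0),   # target A
              (5.0, 5.0, 0.0), (5.1, 4.9, 0.0)]   # target B
print(cluster(detections))  # two cluster centroids
```

Density-based algorithms such as DBSCAN (available in scikit-learn, which can run inside a Python ROS node) are a natural next step, since they also reject isolated noise samples instead of turning each one into a cluster.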

2016-01-19 09:34:54 -0500 received badge  Notable Question (source)
2015-09-25 10:25:56 -0500 asked a question imu_filter_madgwick with magnetometer

I'm trying to use imu_filter_madgwick while feeding it magnetometer data (use_mag=true). However, even though the filter seems to work properly before adding the magnetometer data, afterwards it stops behaving as expected.

I even calibrated the data for hard- and soft-iron effects using some ellipsoid-fitting algorithms, but it still doesn't work. It's as if the magnetometer information is making the orientation stay "locked" around a certain angle.

I tested the magnetometer by trying to get a heading using the information from the x and y axis, and the heading I obtained was pretty close to the correct one.

Any idea what might be causing this problem? Since I need a global orientation, it's pretty important I'm able to use the magnetometer data.
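The heading check mentioned above (x/y axes only, sensor held level) can be sketched in a few lines of plain Python. Note the sign convention depends on how the sensor axes are mounted, so treat the minus sign below as an assumption to verify against your own frame:

```python
import math

def heading_deg(mx, my, offset=(0.0, 0.0)):
    """Planar heading from magnetometer x/y with the sensor level.
    `offset` is the hard-iron bias estimated during calibration."""
    hx, hy = mx - offset[0], my - offset[1]
    return math.degrees(math.atan2(-hy, hx)) % 360.0

print(heading_deg(1.0, 0.0))   # field along +x -> 0.0 degrees
print(heading_deg(0.0, -1.0))  # -> 90.0 degrees
```

For a true-north heading you would additionally add the local magnetic declination, which is what robot_localization's magnetic_declination_radians parameter accounts for; a wrong declination or a flipped axis is a common cause of the filter "locking" onto a wrong yaw.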

2015-07-10 10:37:50 -0500 commented answer Setting robot_localization properlly - 2nd odom not working

What I meant when I asked whether I should fuse roll and pitch is that, because the vehicle is kept in the air by a balloon, changes in roll and pitch will be small and due to noise from the rotors (or wind). That being said, those changes won't have much effect on the overall movement.

2015-07-10 05:08:29 -0500 marked best answer ROS nodes for aerial vehicle localization

Hey there,

I'm developing an aerial vehicle which is equipped with a GPS, IMU and bottom camera (pointed at the ground for target tracking). I was looking for a way to fuse the data, with two objectives:

  • Obtaining a coordinate for the "drone" localization (I could assume it starts at (0,0))

  • Extracting the position of the detected targets (in the "local coordinate system" and as GPS information)

Which nodes will prove useful for this application?

Thanks in advance

2015-07-09 05:26:36 -0500 commented answer Setting robot_localization properlly - 2nd odom not working

One of the problems I see is that I don't have the heading associated with that velocity, and I'm feeding it as linear.x. However, if wind conditions make the vehicle move sideways, that won't be compensated for by the filter.

2015-07-09 04:01:11 -0500 commented answer Setting robot_localization properlly - 2nd odom not working

Oh OK, thanks! Could you give me some insight into the doubts I posted in the question? Another thing: the GPS I'm using is a Garmin 18x, which besides longitude, latitude and altitude also outputs a "speed" parameter. Is it a bad idea to use this as the velocity in the new odometry msg?

2015-07-09 03:48:58 -0500 received badge  Famous Question (source)
2015-07-08 19:55:37 -0500 commented question Setting robot_localization properlly - 2nd odom not working

The IMU does have a magnetometer and I think I have set both yaw_offset and magnetic_declination_radians properly. (I haven't changed them since my last question)

2015-07-07 12:53:46 -0500 commented question Setting robot_localization properlly - 2nd odom not working

I updated the question with some new info and corrected some things. I still have the doubts from before, but since I corrected the odometry msg I'm getting a strange behavior from the filter (explained in the question). Thanks for the help!

2015-07-07 12:44:47 -0500 edited question Setting robot_localization properlly - 2nd odom not working

Since this post I have added a magnetometer to the system, in order to get a world-referenced yaw/heading estimate, as well as a barometer for better altitude estimates.

First problem [solved]: I'm still doing ground tests, so what I have been doing lately is feeding a fixed altitude (1.2 m) as a new odometry msg (nav_msgs/Odometry), instead of using the altitude calculated from the barometer. After correcting a small problem with the msg stamp, the altitude started to be used by the filter. Since then I have added a velocity to the same odometry msg and it seems to be working. I'm going to edit the post to reflect these changes.

Right now the launch file used to test the "fake" odometry msg is:

<launch>

<node pkg="tf" type="static_transform_publisher" name="imu_tf" args="0 0 0 0 0 0 1 /base_link /imu 20"/>

<node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization" clear_params="true">
  <param name="frequency" value="20"/>  
  <param name="sensor_timeout" value="5"/>  
  <param name="two_d_mode" value="false"/>

  <param name="map_frame"       value="map"/>
  <param name="odom_frame"      value="odom"/>
  <param name="base_link_frame" value="base_link"/>
  <param name="world_frame"     value="odom"/>

  <param name="odom0" value="/imu/altitude"/>
  <param name="imu0"  value="/imu/data"/> 


  <!-- The 15 config flags are, in order:
       x, y, z, roll, pitch, yaw,
       vx, vy, vz, vroll, vpitch, vyaw,
       ax, ay, az -->
  <rosparam param="odom0_config">[false, false, true,
                                  false, false, false, 
                                  true , false, false, 
                                  false, false, false,
                                  false, false, false]</rosparam>

  <rosparam param="imu0_config">[false, false, false, 
                                 true , true , true, 
                                 false, false, false, 
                                 true , true , true ,
                                 true , true , true ]</rosparam>

  <param name="odom0_differential" value="false"/>
  <param name="imu0_differential"  value="false"/>

  <param name="imu0_remove_gravitational_acceleration" value="true"/>

  <param name="odom0_relative" value="false"/>
  <param name="imu0_relative"  value="false"/>

  <param name="print_diagnostics" value="true"/>

  <!-- ======== ADVANCED PARAMETERS ======== -->

  <param name="odom0_queue_size" value="2"/>
  <param name="imu0_queue_size"  value="10"/>


       <rosparam param="initial_estimate_covariance">[1e-9, 0,    0,    0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                                                      0,    1e-9, 0,    0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                                                      0,    0,    1e-9, 0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                                                      0,    0,    0,    1e-9, 0,    0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                                                      0,    0,    0,    0,    1e-9, 0,    0,    0,    0,    0,     0,     0,     0,    0,    0,
                                                      0,    0,    0,    0,    0,    1e-9, 0,    0,    0,    0,     0,     0,     0,    0,    0,
                                                      0,    0,    0,    0,    0,    0,    1e-9, 0,    0,    0,     0,     0,     0,    0,    0,
                                                      0,    0,    0,    0,    0,    0,    0,    1e-9, 0,    0,     0,     0,     0,    0,    0,
                                                      0,    0,    0,    0,    0,    0,    0,    0,    1e-9, 0,     0,     0,     0,    0,    0,
                                                      0,    0,    0,    0,    0,    0,    0,    0,    0,    1e-9,  0,     0,     0,    0,    0,
                                                      0,    0,    0,    0,    0,    0,    0,    0,    0,    0,     1e-9,  0,     0,    0,    0,
                                                      0,    0,    0,    0,    0,    0,    0,    0,    0,    0,     0,     1e-9,  0,    0,    0,
                                                      0,    0,    0,    0,    0,    0,    0,    0,    0,    0,     0,     0,     1e-9, 0,    0,
                                                      0,    0,    0,    0,    0,    0,    0,    0,    0,    0,     0,     0,     0,    1e-9, 0 ...
(more)