
Tirgo's profile - activity

2019-04-25 18:12:42 -0500 marked best answer Localization based on a laserscan

Hello,

I have a question regarding the localization of a mobile robot. Unfortunately my robot does not provide wheel odometry. I am looking for an approach to localize my robot in a given map based on a laser scan alone. I read that hector_mapping can be used in this case, as it includes a laser scan matcher that works well stand-alone. Is something similar available without the whole SLAM algorithm behind it? I just need the "L" ;). My robot is constrained to relatively low computing power, which is also why I can't use visual odometry.

Thx for your answers
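
For context, here is a minimal rospy sketch of how the output of such a standalone scan matcher could be consumed. Assumptions not from the original post: a node such as laser_scan_matcher from the scan_tools stack is running and publishes a geometry_msgs/Pose2D on a topic named pose2D; verify the actual topic name against the package documentation or rostopic list.

    #!/usr/bin/env python
    # Minimal sketch: consume the pose estimated by a standalone laser scan matcher.
    # Assumption: a scan-matcher node (e.g. laser_scan_matcher) publishes
    # geometry_msgs/Pose2D on "pose2D"; check the real topic name with `rostopic list`.
    import rospy
    from geometry_msgs.msg import Pose2D

    def pose_cb(msg):
        # msg.x, msg.y are in the matcher's fixed frame; msg.theta is yaw in radians.
        rospy.loginfo("scan-matcher pose: x=%.3f y=%.3f theta=%.3f", msg.x, msg.y, msg.theta)

    if __name__ == "__main__":
        rospy.init_node("scan_matcher_pose_listener")
        rospy.Subscriber("pose2D", Pose2D, pose_cb)
        rospy.spin()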

2018-02-01 23:57:58 -0500 marked best answer Hector_Slam Imu How to set up the tf tree

Hi all

I'm currently working with hector_slam to create maps. I have written a package for my IMU that provides a rotation matrix, from which I can calculate roll, pitch and yaw.

My question is: how should I connect these angles to the hector_slam tf tree?

Roll and pitch to base_stabilized->base_link, and yaw for orientation to base_footprint->base_stabilized?

Or can I put them all together into base_stabilized->base_link? Does it make any difference?

Thx for your response
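
As an illustration of the first option (yaw on base_footprint->base_stabilized, roll/pitch on base_stabilized->base_link), here is a minimal rospy/tf sketch. The roll/pitch/yaw values are placeholders for whatever the IMU package provides, and the frame split is simply the arrangement discussed above, not a confirmed hector_slam requirement.

    #!/usr/bin/env python
    # Minimal sketch: broadcast the IMU attitude as two stacked transforms.
    import rospy
    import tf
    from tf.transformations import quaternion_from_euler

    if __name__ == "__main__":
        rospy.init_node("imu_tf_broadcaster")
        br = tf.TransformBroadcaster()
        rate = rospy.Rate(50)
        while not rospy.is_shutdown():
            roll, pitch, yaw = 0.0, 0.0, 0.0  # replace with the values from the IMU package
            now = rospy.Time.now()
            # yaw only: base_footprint -> base_stabilized
            br.sendTransform((0.0, 0.0, 0.0),
                             quaternion_from_euler(0.0, 0.0, yaw),
                             now, "base_stabilized", "base_footprint")
            # roll/pitch only: base_stabilized -> base_link
            br.sendTransform((0.0, 0.0, 0.0),
                             quaternion_from_euler(roll, pitch, 0.0),
                             now, "base_link", "base_stabilized")
            rate.sleep()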

2018-01-11 20:19:21 -0500 marked best answer How to control pitch, roll, yaw with mavros

Hello,

I am quite new to the Pixhawk, but quite familiar with ROS. I am searching for information on how to control the Pixhawk via mavros. I am using the PX4 firmware. My plan is to use the altitude hold mode, so that the multicopter hovers at a specific height. Additionally, I want to control the pitch, roll and yaw angles of the multicopter via ROS. A bit like 2D control in 3D space, but at a specific height.

I am a bit lost on how to set up my Pixhawk with the external computer (an Odroid mounted on the multicopter).

I found this tutorial [ https://pixhawk.org/dev/ros/mavros_of... ]. So I can control the copter with the help of standard ROS messages. More specifically, I can control position, attitude, velocity and acceleration. Here are my questions: in order to control the copter via mavros, it has to be in OFFBOARD mode. Would this conflict with the altitude hold mode? And which message should I use to control roll/pitch/yaw? In my little project the copter has to fly with specific "angles".

Thx for your help.

EDIT1: The connection between the Odroid and the Pixhawk is working. I can handle the streams (baud rate, Hz) as well. I am a bit confused about the whole mavros setpoint part. I will try to structure my questions a bit ;).

  1. If I use setpoint_position, how is the actual position of the copter calculated? I think it depends on my sensor setup, right? For example distance sensor + GPS. Unfortunately I can't use GPS or optical flow, so this method will not work, will it?

  2. setpoint_attitude seems more like something that would fit, but it can't offer a stable z position, right? So I would have to read out the distance sensor and regulate the throttle myself, correct? (See the sketch after this list.)

  3. I wonder if it is easier to use the standard altitude hold mode of the Pixhawk. Additionally I would read out the IMU data of the Pixhawk and regulate pitch, roll and yaw over mavros/rc/override. Would this also be a possibility?
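
Regarding question 2, a minimal sketch of streaming attitude setpoints through mavros is shown below. Assumptions not confirmed by the post: the setpoint_attitude plugin is loaded and accepts geometry_msgs/PoseStamped on /mavros/setpoint_attitude/attitude (topic and message type vary between mavros versions, so verify with rostopic info), OFFBOARD mode is engaged separately, and throttle/altitude is handled elsewhere.

    #!/usr/bin/env python
    # Minimal sketch: stream roll/pitch/yaw attitude setpoints to mavros.
    # Throttle / altitude is NOT handled here.
    import rospy
    from geometry_msgs.msg import PoseStamped
    from tf.transformations import quaternion_from_euler

    if __name__ == "__main__":
        rospy.init_node("attitude_setpoint_streamer")
        pub = rospy.Publisher("/mavros/setpoint_attitude/attitude", PoseStamped, queue_size=10)
        rate = rospy.Rate(50)  # OFFBOARD needs a steady setpoint stream
        roll, pitch, yaw = 0.0, 0.05, 0.0  # example angles in radians
        while not rospy.is_shutdown():
            sp = PoseStamped()
            sp.header.stamp = rospy.Time.now()
            q = quaternion_from_euler(roll, pitch, yaw)
            sp.pose.orientation.x, sp.pose.orientation.y = q[0], q[1]
            sp.pose.orientation.z, sp.pose.orientation.w = q[2], q[3]
            pub.publish(sp)
            rate.sleep()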

2017-10-18 02:45:38 -0500 received badge  Famous Question (source)
2017-07-31 10:12:09 -0500 received badge  Famous Question (source)
2017-03-02 04:17:17 -0500 received badge  Notable Question (source)
2016-09-14 14:39:47 -0500 marked best answer husky bringup on an ARM platform

Hello everybody,

I am currently trying to get my Husky Indigo setup working on an ARM machine. My problem is the husky_node of the husky_base package. Running it, I get the following error:

EXCEPTION: TransportException 3: Unacknowledged send
[ERROR] [1438708451.166274559]: Error configuring velocity and accel limits: Unacknowledged send

The port is set correctly and the port permissions are r+w.

Thank you for your help.

2016-09-14 14:38:55 -0500 received badge  Famous Question (source)
2016-05-02 22:02:02 -0500 received badge  Taxonomist
2016-04-18 04:00:18 -0500 commented question Can moveite cope with mimic joint (parallel kinematic) and/or liniear constraints for planning and trajectory in IK

Hi, were you able to set it up? I am trying to do something similar. I have a robot with similar kinematics, but I only need one mimic joint. I am getting the "has a mimic joint" warning and a message that the dynamics solver is not initialized. It would be nice if you could share your experience.

2016-04-08 04:49:00 -0500 commented answer Hector_Slam mapping Hokuyo

It depends on your hardware and the environment, but yes, it can also handle rotations. If you have a short-range laser scanner or an environment without many distinctive features, you will run into problems.

2016-03-28 14:04:25 -0500 received badge  Notable Question (source)
2016-03-17 13:57:29 -0500 marked best answer SICK Scanner PLS200-113

Hello all,

I'm currently trying to get the SICK PLS200 laser scanner to work. Unfortunately I'm not able to do it. The ROS-provided sicktoolbox_wrapper only supports the LMS200 and LMS291, right? I'm asking because I'm getting an error when following the tutorial.

So my question is:

Has someone already worked with the SICK PLS200 under ROS, and where can I find a working solution for it?

I would be happy about some help

Best Regards

2016-03-12 16:24:06 -0500 received badge  Famous Question (source)
2016-03-06 12:40:40 -0500 received badge  Famous Question (source)
2016-01-22 19:55:00 -0500 received badge  Popular Question (source)
2016-01-04 07:02:25 -0500 received badge  Notable Question (source)
2016-01-04 07:02:25 -0500 received badge  Popular Question (source)
2015-12-17 22:02:05 -0500 marked best answer Hector_Slam mapping Hokuyo

Hello

I'm trying to create something similar to this http://www.youtube.com/watch?feature=player_embedded&v=F8pdObV_df4, but I'm not able to create consistently good maps.

I'm using hector_slam with the Hokuyo URG-04LX and an IMU for roll/pitch (so no odometry). Some (indoor) maps are perfect, but if I try it again the map sometimes drifts and the position in rviz becomes weird.

So my question is: hector_slam creates its own odometry, right? Or do I have to add odometry for proper map results? Or is my laser scanner too weak for hector, so that I need something like the Hokuyo UTM-30LX?

Thx for your response

2015-12-09 11:41:50 -0500 received badge  Self-Learner (source)
2015-12-09 11:41:50 -0500 received badge  Necromancer (source)
2015-12-09 09:09:14 -0500 received badge  Famous Question (source)
2015-12-09 02:34:53 -0500 answered a question nao walker indigo

Works fine now. It was fixed!

2015-12-09 02:34:39 -0500 received badge  Famous Question (source)
2015-12-09 02:33:55 -0500 answered a question nao_teleop not working

Everything works fine now. It was fixed!

2015-12-09 02:30:16 -0500 answered a question avt_vimba_camera frame rate issue

It turns out that the Odroid XU3 only includes a Fast Ethernet controller. I also tested it with an Odroid XU4, which is limited to an MTU packet size of 4078.

2015-12-09 02:29:05 -0500 answered a question camera_calibration circle grid pattern

Solved it: I was just too impatient. It takes around 1 to 2 minutes to find my complete circle grid pattern. I just aborted it too early ;-).

2015-12-09 02:00:52 -0500 received badge  Self-Learner (source)
2015-12-09 02:00:52 -0500 received badge  Teacher (source)
2015-12-09 02:00:52 -0500 received badge  Necromancer (source)
2015-12-09 01:58:19 -0500 received badge  Notable Question (source)
2015-12-09 01:56:25 -0500 commented question avt_vimba_camera frame rate issue

Ok, good to know. Unfortunately I don't have enough karma to reopen it. But I will remember it for the future.

2015-12-09 01:53:47 -0500 received badge  Notable Question (source)
2015-12-09 01:53:13 -0500 commented answer prosilica_camera on an ARM plattform

ANSWER: It turns out that the Ethernet controller of the Odroid is the bottleneck. It is only capable of an MTU packet size of around 4078.

2015-12-07 02:37:10 -0500 received badge  Popular Question (source)
2015-12-07 02:18:42 -0500 received badge  Famous Question (source)
2015-12-07 02:18:13 -0500 edited question avt_vimba_camera frame rate issue

Hello everybody,

I am currently using the avt_vimba_camera package to operate a Manta G609B. My ROS Indigo system is running on an Odroid XU3. I had some trouble getting everything running, but now the driver seems to work. I installed it from source and added some needed ARM libraries. The only issue is that the driver automatically lowers the frame rate of the camera depending on the chosen resolution.

[ INFO] [1440425646.889163949]: New PixelFormat config (manta) : 
    PixelFormat : Mono8 was Mono8
[ INFO] [1440425646.899412146]: Asking for feature AcquisitionFrameRateLimit with datatype float and value 8.86603
[ WARN] [1440425646.900022564]: Max frame rate allowed: 8.86603. Setting 8...
[ INFO] [1440425646.943016690]: New Acquisition and Trigger config (manta) : 
    AcquisitionMode         : Continuous was Continuous
    AcquisitionFrameRateAbs : 8 was 12
    TriggerMode             : On was On
    TriggerSource           : FixedRate was FixedRate
    TriggerSelector         : FrameStart was FrameStart
    TriggerActivation       : RisingEdge was RisingEdge
    TriggerDelayAbs         : 0 was 0
[ INFO] [1440425646.961006784]: [camera]: Starting continuous image acquisition...(manta)

Even though it is automatically tuned down to 8 frames per second, a test showed a frame rate of only around 4 Hz. Is that an issue with working on an ARM architecture? I never had such a problem on my normal PC. I would appreciate some help with that.

I tested it again with the same launch file (same resolution, frame rate, etc.):

PC -> 16 fps
Odroid XU3 -> 2 fps

ANSWER: It turns out that the Odroid XU3 only has a Fast Ethernet controller. I also tested it with an Odroid XU4, which is limited to an MTU packet size of 4078.
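
A rough bandwidth calculation is consistent with these numbers. Assuming the camera streams frames of roughly 6 MB each (Mono8, about one byte per pixel; the exact resolution is not stated here, so adjust as needed), Fast Ethernet simply cannot carry more than about 2 fps:

    # Back-of-the-envelope check of the observed frame rates.
    # Assumption: ~6 MB per Mono8 frame; adjust bytes_per_frame to your resolution.
    bytes_per_frame = 6.0e6
    fast_ethernet = 100e6 / 8    # ~12.5 MB/s at best (Odroid XU3)
    gigabit = 1000e6 / 8         # ~125 MB/s at best

    print("Fast Ethernet limit: %.1f fps" % (fast_ethernet / bytes_per_frame))  # ~2 fps
    print("GigE limit:          %.1f fps" % (gigabit / bytes_per_frame))        # ~21 fps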

2015-12-07 02:15:40 -0500 edited question prosilica_camera on an ARM plattform

Hello,

I am currently trying to get a Manta G609B from Allied Vision Tech to work on an ARM platform. Unfortunately no binary install is available yet. If I install it from source, the whole compilation process works well, but as soon as I try to start the camera driver I get the following error:

[ERROR] [1439993319.314367411]: Failed to load nodelet [/prosilica_driver] of type [prosilica_camera/driver]: Failed to load library /home/odroid/test_ws/devel/lib//libprosilica_nodelet.so. Make sure that you are calling the PLUGINLIB_EXPORT_CLASS macro in the library code, and that names are consistent between this macro and your XML. Error string: Could not load library (Poco exception = libPvAPI.so: cannot open shared object file: No such file or directory)

I checked the libraries. Both libprosilica_nodelet.so and libPvAPI.so can be found in the /devel/lib folder.

Thank you for your answers.

EDIT1: I changed to the avt_vimba_camera package and added the needed ARM libraries. Result:

/home/odroid/test_ws/src/avt_vimba_camera/src/avt_vimba_camera.cpp: In constructor 'avt_vimba_camera::AvtVimbaCamera::AvtVimbaCamera()':
/home/odroid/test_ws/src/avt_vimba_camera/src/avt_vimba_camera.cpp:87:84: warning: delegating constructors only available with -std=c++11 or -std=gnu++11 [enabled by default]
 AvtVimbaCamera::AvtVimbaCamera() : AvtVimbaCamera(ros::this_node::getName().c_str()) {
                                                                                    ^
g++-4.8.real: internal compiler error: Killed (program cc1plus)
Please submit a full bug report,
with preprocessed source if appropriate.
See <file:///usr/share/doc/gcc-4.8/README.Bugs> for instructions.
make[2]: *** [avt_vimba_camera/CMakeFiles/stereo_camera_node.dir/src/nodes/stereo_camera_node.cpp.o] Error 4
make[2]: *** Waiting for unfinished jobs....
Linking CXX executable /home/odroid/test_ws/devel/lib/avt_vimba_camera/mono_camera_node
make[1]: *** [avt_vimba_camera/CMakeFiles/stereo_camera_node.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
[100%] Built target mono_camera_node
make: *** [all] Error 2
Invoking "make -j8 -l8" failed

EDIT2: I installed the AVT software Vimba v1.3 ARM_Hard-Float. Now I can use my Manta G609B with the avt_vimba_camera package on my Odroid XU3. Unfortunately there are some frame rate restrictions. Depending on the resolution it will lower the frame rate automatically:

[ INFO] [1440425646.889163949]: New PixelFormat config (manta) : 
    PixelFormat : Mono8 was Mono8
[ INFO] [1440425646.899412146]: Asking for feature AcquisitionFrameRateLimit with datatype float and value 8.86603
[ WARN] [1440425646.900022564]: Max frame rate allowed: 8.86603. Setting 8...

Any ideas?

EDIT3: So I realized that the Odroid XU3 has just a Fast Ethernet controller... But no problem, I switched to an XU4, which has a Gigabit Ethernet controller. But as always I ran into more trouble: I am using the same launch file as on the XU3, even the same microSD card!! So basically the same system! The camera node crashes after some time and shows really strange behavior... If I don't subscribe to the camera topic it runs forever. If I subscribe, for example with rostopic hz to check the frame rate, it crashes after some time. If I use image_view, I get the following error message from time to time:

[ERROR] [1356999010.655579398]: Unable to convert 'mono8' image to bgr8: 'Image is wrongly ...
(more)
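
To narrow down whether the driver or image_view is at fault in a case like EDIT3, one option is a bare subscriber that only reports what arrives on the image topic. This is a sketch under an assumption not in the post: the image topic is /camera/image_raw (adjust to the actual topic name).

    #!/usr/bin/env python
    # Minimal sketch: report encoding and size of incoming images without converting them.
    import rospy
    from sensor_msgs.msg import Image

    def cb(msg):
        expected = msg.height * msg.step
        note = "" if len(msg.data) == expected else "  <-- size mismatch!"
        rospy.loginfo("encoding=%s %dx%d step=%d data=%d bytes%s",
                      msg.encoding, msg.width, msg.height, msg.step, len(msg.data), note)

    if __name__ == "__main__":
        rospy.init_node("image_sanity_check")
        rospy.Subscriber("/camera/image_raw", Image, cb, queue_size=1)
        rospy.spin()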
2015-12-07 02:10:59 -0500 received badge  Famous Question (source)
2015-12-01 03:23:18 -0500 asked a question CANopen interface to read out an encoder

Hello,

I would like to know whether there is a package available to read out a CANopen-based sensor such as an encoder via ROS. I found a couple of packages regarding CANopen, but they seem to be more focused on motors. Does someone have experience in reading out CANopen-based sensors via ROS and could help me with a short introduction?

Thanks for your help.
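
One low-level way to verify that the encoder is actually transmitting, independent of any ROS CANopen package, is to read raw frames over SocketCAN. The sketch below uses the python-can library and assumes the adapter appears as interface can0 and that the encoder sends its position cyclically as PDOs; decoding the payload is device-specific and comes from the encoder's CANopen documentation.

    #!/usr/bin/env python
    # Minimal sketch: dump raw CAN frames from the bus to confirm the encoder is alive.
    import can

    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    for _ in range(10):
        msg = bus.recv(timeout=1.0)  # returns None on timeout
        if msg is None:
            print("no frame received")
            continue
        print("id=0x%X data=%s" % (msg.arbitration_id,
                                   " ".join("%02X" % b for b in msg.data)))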

2015-11-06 04:31:05 -0500 received badge  Notable Question (source)
2015-11-06 04:31:05 -0500 received badge  Popular Question (source)
2015-10-13 03:19:10 -0500 received badge  Notable Question (source)
2015-10-13 03:19:05 -0500 received badge  Popular Question (source)
2015-10-13 03:19:05 -0500 received badge  Notable Question (source)
2015-09-24 07:35:16 -0500 edited question USB Cam ImagingSource Problem

Hello, I am currently trying to get the DMK 42BUC03 link_to_camera from ImagingSource to work with a ROS package.

I tried usb_cam:

[ERROR] [1443095249.460258490]: Webcam: expected picture but didn't get it...

I tried uvc_cam:

opening /dev/video0
pixfmt 0 = 'GREY' desc = 'Greyscale 8-bit (Y800)'
  discrete: 1280x960:   1/30 1/25 1/15 1/10 
  discrete: 720x480:   1/60 1/30 1/25 1/15 
  discrete: 640x480:   1/60 1/30 1/25 1/15 
  int (Gain, 0, id = 980913): 34 to 255 (1)
  int (Exposure (Absolute), 0, id = 9a0902): 1 to 300000 (1)
Segmentation fault (core dumped)

I tried the normal VLC player for Ubuntu:

I get an image but it says:

[0x7fcd5401b198] main blend error: blending RGBA to GREY failed
[0x7fcd5401b198] blend blend error: no matching alpha blending routine (chroma: RGBA -> GREY)

The camera is set to UVC mode and it is a USB 2.0 camera. It is detected correctly by my Linux system. Thx for your help.