Ask Your Question

fabian77's profile - activity

2018-01-20 00:35:30 -0600 received badge  Favorite Question (source)
2018-01-11 20:37:49 -0600 received badge  Famous Question (source)
2017-01-13 10:09:22 -0600 received badge  Good Question (source)
2017-01-13 10:09:17 -0600 received badge  Famous Question (source)
2017-01-13 10:09:17 -0600 received badge  Notable Question (source)
2017-01-13 10:09:17 -0600 received badge  Popular Question (source)
2016-08-15 18:16:30 -0600 received badge  Good Answer (source)
2016-06-29 10:56:25 -0600 received badge  Nice Answer (source)
2016-06-28 15:05:19 -0600 received badge  Teacher (source)
2016-06-28 14:58:50 -0600 answered a question Motors and RoS

In short:

Yes, you can use an Arduino as a ROS node. There are good examples at wiki.ros.org/rosserial_arduino/Tutorials showing how an Arduino can subscribe and publish to topics.

What you want to do, if your motors are driven by an Arduino, is to program the Arduino as what I would call a "base_controller". This way the Arduino (running as a ROS node) can subscribe to the topic /cmd_vel and "translate" the subscribed messages into commands that steer the drives connected to it.
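To sketch what that "translation" could look like for a differential-drive base: the core math is shown below in Python for readability (on an Arduino it would be C++, and wheel_separation is a hypothetical robot parameter, not something from rosserial):

```python
# Sketch of the /cmd_vel -> wheel-speed translation such a base_controller
# would do, assuming a differential-drive base. wheel_separation (the
# distance between the wheels, in meters) is a made-up parameter here.

def cmd_vel_to_wheel_speeds(linear_x, angular_z, wheel_separation):
    """Convert a Twist's linear.x (m/s) and angular.z (rad/s)
    into left/right wheel speeds (m/s) for a differential drive."""
    left = linear_x - angular_z * wheel_separation / 2.0
    right = linear_x + angular_z * wheel_separation / 2.0
    return left, right

# Driving straight at 0.2 m/s: both wheels turn equally.
print(cmd_vel_to_wheel_speeds(0.2, 0.0, 0.3))   # (0.2, 0.2)
# Turning in place: the wheels turn in opposite directions.
print(cmd_vel_to_wheel_speeds(0.0, 1.0, 0.3))   # (-0.15, 0.15)
```

On the Arduino you would then map these wheel speeds to whatever command format your motor driver expects.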

What I would expect you to do first is to go through the basic ROS tutorials, so that you understand the ROS infrastructure: e.g. what nodes are and what they do (subscribe to and publish messages, in general), and so on.

Hope this helps a bit, Fabian

2016-05-08 10:45:12 -0600 received badge  Famous Question (source)
2016-04-04 15:44:41 -0600 commented answer Openni or Freenect?

I will have a look at OpenNI too. Do you have a clue whether both drivers work in parallel? In my opinion, installing freenect on Ubuntu is better documented, or easier, than OpenNI, or I haven't researched that well enough. I must have a look, because openni_camera and openni_launch seem better supported.

2016-04-04 13:13:08 -0600 answered a question Openni or Freenect?

I will quote an answer belonging to that question which I found on Stack Overflow when I stood before the same decision myself:

Libfreenect is mainly a driver which exposes the Kinect device's features:

  • depth stream
  • IR stream
  • color (RGB) stream
  • motor control
  • LED control
  • accelerometer

It does not provide any advanced processing features like scene segmentation, skeleton tracking, etc.

On the other hand, OpenNI allows generic access to the Kinect's features (mainly the image streams), but also provides rich processing features such as:

  • scene segmentation
  • skeleton tracking
  • hand detection and tracking
  • gesture recognition
  • user interface elements
  • etc.

However, it offers no low-level control of device features like the motor, LED, or accelerometer.

As opposed to libfreenect, which AFAIK works only with the Kinect sensor, OpenNI works with the Kinect but also with other sensors, like the Asus Xtion Pro, Carmine, etc.

I am using freenect at the moment with success and haven't tried out OpenNI so far. You can find some info on my project website here (in German) and see what I have done so far with my Kinect and ROS. The project is still under development.

One point I struggle with at the moment is that the ROS freenect stack is only supported up to ROS Indigo, and I want to update to Jade or Kinetic soon. If someone could help out here, please contact me!

Fabian

2016-01-11 21:09:39 -0600 received badge  Famous Question (source)
2016-01-08 02:26:02 -0600 received badge  Notable Question (source)
2015-12-21 12:36:22 -0600 received badge  Popular Question (source)
2015-12-21 08:04:44 -0600 received badge  Nice Question (source)
2015-12-21 06:19:06 -0600 asked a question Question about contributing a package to ROS

I just finished a package for the MD49 serial dual motor driver board, designed for use with two EMG49 motors.

The package can be used as a base_controller for a robot driven by that board: it publishes the encoder values and other information from the driver board and subscribes to /cmd_vel to steer the drives.

So first I created an upstream repo (the one I develop in) and a release repo for the package, as suggested. Then I "bloom-released" the package as described in this tutorial. The pull request has already been merged.

The first thing I stumbled over was the following section in the README of the release repo, which was changed automatically by bloom:

Version of package(s) in repository md49_base_controller:

As you can see above, the release repo is unknown, although in the pull request it shows up correctly:

Increasing version of package(s) in repository md49_base_controller to 0.1.1-0:

I am afraid that something went wrong. Or is there no reason to be worried?

2015-12-09 10:08:33 -0600 answered a question Complete n00b: Getting Started via ROS or MoveIt!?

Hi, I can only tell you how I did it. In the beginning, ROS was a closed book to me. I am also a hobbyist, have been using ROS for nearly two years now, and am still learning every time I do something new. I started with simple examples like a publisher and a subscriber, then moved on to writing my own base_controller for my robot. The base_controller simply listens to the /cmd_vel topic and "translates" the messages into serial commands to drive a motor driver board with attached motors. The next task was to publish the encoder values from the drives; I did this in the base_controller node/package as well.

The next task was to teleoperate the robot with a USB gamepad. For that I used the ROS package joystick_drivers, which publishes to the topic /joy. Then I wrote a node that subscribes to that topic and "translates" the messages into corresponding /cmd_vel commands.
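The joy-to-cmd_vel "translation" is essentially just scaling two stick axes into velocities. A minimal sketch (the axis indices and maximum speeds are made-up values; they depend on your gamepad and robot, not on anything in joystick_drivers):

```python
# Sketch of a /joy -> /cmd_vel translation: scale two gamepad axes into
# velocities. Axis indices and max speeds are hypothetical assumptions.

MAX_LINEAR = 0.5    # m/s at full stick deflection (assumed)
MAX_ANGULAR = 1.5   # rad/s at full stick deflection (assumed)

def joy_to_cmd_vel(axes):
    """axes: list of floats in [-1.0, 1.0], as in sensor_msgs/Joy.
    Returns (linear_x, angular_z) for a geometry_msgs/Twist."""
    linear_x = axes[1] * MAX_LINEAR     # forward/backward stick
    angular_z = axes[0] * MAX_ANGULAR   # left/right stick
    return linear_x, angular_z

print(joy_to_cmd_vel([0.0, 1.0]))   # full forward -> (0.5, 0.0)
print(joy_to_cmd_vel([-1.0, 0.0]))  # full sideways -> (0.0, -1.5)
```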

In the meantime, I wrote a node that computes odometry data from the encoder values and publishes it to the /tf and /odom topics. With a URDF model of my robot, I can now visualize the robot and its movements in RViz... and there is so much more to do with the project, and possible with ROS, that I am still excited and don't worry if I sometimes struggle with it.
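The encoder-to-odometry computation mentioned above is classic dead reckoning. A minimal sketch for a differential drive (wheel_separation is a hypothetical robot parameter; a real node would additionally stamp the result and publish it on /odom and /tf):

```python
import math

# One dead-reckoning step for a differential-drive robot: integrate the
# distance each wheel travelled into a new (x, y, theta) pose.

def update_pose(x, y, theta, d_left, d_right, wheel_separation):
    """d_left / d_right: distance (m) each wheel travelled since the
    last update. Returns the new pose (x, y, theta)."""
    d_center = (d_left + d_right) / 2.0          # distance of robot center
    d_theta = (d_right - d_left) / wheel_separation  # change in heading
    # Integrate using the heading at the middle of the step.
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Both wheels move 0.1 m: the robot drives 0.1 m straight ahead.
print(update_pose(0.0, 0.0, 0.0, 0.1, 0.1, 0.3))  # (0.1, 0.0, 0.0)
```

Calling this on every encoder update, with the tick deltas converted to meters, gives the pose that goes into the odom -> base_link transform.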

In parallel, I am now working on web tools for my robot using rosbridge_suite and rosjs. That way I stumbled into web programming as well, thanks to my project and ROS :-)

There are many tutorials around the internet and on the ROS web pages themselves; after reading them several times, I understand more and more as I get deeper into it. Just begin with them...

My setup is an embedded Linux board (pcDuino) on the robot with ROS Indigo, and my Ubuntu desktop PC as workstation, also with ROS Indigo installed. The ROS master runs on the desktop PC, connected to the robot via the local wireless network.

You can find my ROS workspace in my ROS repo on GitHub. Maybe you can find some inspiration there.

Fabian

2015-12-09 07:07:47 -0600 received badge  Notable Question (source)
2015-11-26 14:00:56 -0600 received badge  Popular Question (source)
2015-11-01 09:13:16 -0600 commented question transform from map to base_link

Regarding Q3: the way I implemented base_footprint: I added base_footprint as the parent link of base_link in the URDF, and in my base_odometry node I now publish the transform from odom to base_footprint instead of to base_link. Visualization in RViz seems OK again. But I don't know if that is the recommended way.

2015-11-01 08:09:01 -0600 asked a question transform from map to base_link

Hi, I am still working as a hobbyist on my self-made tracked robot, using ROS as middleware, and I am a bit confused at the moment.

What I have managed so far:

  • There is a base_controller node subscribing to /cmd_vel and publishing wheel-encoder data in a custom message. Depending on /cmd_vel, the robot moves around.
  • I can teleoperate my robot with a gamepad connected to a workstation, using joystick_drivers / the /joy topic and computing the joy messages into /cmd_vel.
  • I have a node base_odometry which computes odometry from the wheel encoders and publishes to the topic /odom (pose and velocities) and to /tf (the transform from frame odom to frame base_link).
  • There is a package with a URDF describing my robot, with the frame base_link as parent.
  • That way the robot is simulated in RViz, and I can see the model moving according to the commands I send. base_link moves relative to the odom frame, so everything seems correct so far.

The tf tree at the moment is the following:

odom -> base_link -> all other links from the URDF (wheels and so on, which are more for visualization than really needed)

  1. One question is now a bit broadly asked: how should I go on with setting up the navigation stack in general?
  2. One idea was to set up a node that publishes the map frame. As I read in REP 105, "The transform from map to base_link is computed by a localization component". As I understood it, this component should first receive the transform from odom to base_link (which is published by my base_odometry node), and use this information to broadcast the transform from map to odom. Is there a premade ROS package already released for that task? Or do I have to write it myself? And is it necessary at all, or am I on the wrong track right now?
  3. The next thing I am trying at the moment is setting up the package robot_pose_ekf, although I have no real use for it yet, lacking sensor data beyond the wheel encoders. But the idea was to add a GPS sensor in the future. It is really confusing, though, because there is a new frame, base_footprint. Should I just put that link into my URDF as the parent of base_link? But when I do so, it messes up my current setup for RViz, and visualization in RViz isn't working right anymore because of the missing transform from base_footprint to odom. Should I implement that transform in my base_odometry node instead of the transform from odom to base_link, or as a separate transform there or elsewhere?
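To make the transform arithmetic in point 2 concrete: a localization component that estimates map -> base_link and receives odom -> base_link can derive map -> odom as map -> base_link composed with the inverse of odom -> base_link. A minimal 2D sketch (planar (x, y, theta) transforms and made-up example numbers, not the actual tf API):

```python
import math

# 2D rigid transforms as (x, y, theta) tuples, for illustration only.

def inverse(t):
    """Invert a 2D transform."""
    x, y, th = t
    c, s = math.cos(th), math.sin(th)
    return (-c * x - s * y, s * x - c * y, -th)

def compose(a, b):
    """Apply transform a, then b (a * b)."""
    ax, ay, ath = a
    bx, by, bth = b
    c, s = math.cos(ath), math.sin(ath)
    return (ax + c * bx - s * by, ay + s * bx + c * by, ath + bth)

map_to_base = (2.0, 1.0, 0.0)   # hypothetical localization estimate
odom_to_base = (1.0, 0.0, 0.0)  # hypothetical odometry
# map->odom = map->base_link * inverse(odom->base_link)
map_to_odom = compose(map_to_base, inverse(odom_to_base))
print(map_to_odom)  # (1.0, 1.0, 0.0)
```

Broadcasting map -> odom this way keeps the tf tree a proper tree (map -> odom -> base_link) instead of giving base_link two parents.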

As you (and I) can see, I think I have reached a point beyond which a hobbyist like me with no academic background will have a hard time going on... maybe someone could have a look at my code repository on GitHub and give me tips on how I should proceed.

2015-10-14 10:28:35 -0600 received badge  Notable Question (source)
2015-09-30 09:41:39 -0600 answered a question [solved] Couldn't open Joystick /dev/input/js0

Maybe your joystick is on another port, not js0. You can test this with

ls /dev/input

Plug and unplug your joystick and repeat the command, so you can see on which port (jsX) your joystick is detected. Then change the default port in the launch file.

2015-09-26 03:37:05 -0600 received badge  Notable Question (source)
2015-09-24 17:11:15 -0600 commented question How to connect to rosbridge_suite/roslibjs via WAN IP

It seems to be working now with no changes. I am not even sure whether I really have to forward port 9090 in the router settings. It even works with the static address I got via DynDNS (no-ip.com). Thanks for your attention so far!

2015-09-24 17:07:42 -0600 received badge  Popular Question (source)
2015-09-24 13:14:47 -0600 asked a question How to connect to rosbridge_suite/roslibjs via WAN IP

Hi there,

I was trying out the following tutorial: http://wiki.ros.org/roslibjs/Tutorials/BasicRosFunctionality

roslaunch rosbridge_server rosbridge_websocket.launch started with no errors.

I have nginx as web server, and everything works fine from within the local network after changing

var ros = new ROSLIB.Ros({
  url : 'ws://localhost:9090'
});

to my local IP in the corresponding HTML file from the tutorial:

var ros = new ROSLIB.Ros({
  url : 'ws://192.168.2.114:9090'
});

That way everything is fine and working from within the local network. The HTML page from the example works: it listens to the topic /listener and publishes to /cmd_vel.

So I changed the IP in the HTML file to my WAN IP (the one provided by my ISP) and tried to reach the HTML file over the WAN address from outside the local network instead. But that doesn't work with the HTML file from the tutorial. (The web server is still reachable, so the WAN IP must be correct.)

I already tried to forward port 9090 in my router settings, like I did for port 80 for the web server, with no success.

I also know that my router doesn't support NAT loopback, which is why I am connecting to the web server from another network.

Does anybody know how I can connect to ROS with that tutorial from an external network via the internet, or what has to be changed so that it works?

Thanks Fabian

2015-08-25 09:00:04 -0600 commented question Start roscore on a remote machine via launchfile

No, sorry... at the moment I have just switched over to launching roscore on the machine I start the launch files on. That was the easiest way for me for the moment. I just had to change all declarations (ROS_IPs) in the environment files of all the machines.

2015-08-24 16:31:34 -0600 received badge  Nice Question (source)
2015-07-21 10:08:46 -0600 received badge  Student (source)
2015-07-21 06:56:17 -0600 asked a question Start roscore on a remote machine via launchfile

Hi,

I have managed to start a node remotely via roslaunch. My question now: if I do that, roscore is started automatically, but locally on the machine I start the launch file on. Is it possible to tell ROS to start roscore not on that local machine, but on the remote computer instead?

Greets

2015-07-11 03:15:56 -0600 received badge  Popular Question (source)
2015-07-10 13:45:42 -0600 commented answer how to pass argument to executable in launchfile

Maybe I'll just try writing a script file which passes the argument correctly and try to execute that script instead...

2015-07-10 13:45:26 -0600 commented answer how to pass argument to executable in launchfile

Hi, thanks for having a look. I already tried with args, but with no success. I still get the error:

/home/robot/ROS-Groovy-Workspace/src/base_controller/scripts/setuart2: either 'on' or 'off' argument is required

2015-07-10 12:38:56 -0600 asked a question how to pass argument to executable in launchfile

I want to start an executable from within a launch file which needs an argument (it's not a node). Normally I would launch that executable like this from the command line:

setuart2 on

where "on" is the argument.

I've put the executable into a scripts folder in the workspace and am starting it like this in the launch file:

<node name="setuart2" pkg="base_controller" type="setuart2" required="true" output="screen" machine="robotOS"/>

It seems that the executable is found through the launch file, but I get the following error (which is clear, because the argument "on" is missing):

killing on exit remote[robotOS.local-0]: /home/robot/ROS-Groovy-Workspace/src/base_controller/scripts/setuart2: either 'on' or 'off' argument is required

So far I've found no way to pass the argument; I tried it through parameters, but with no success.

Can somebody help?

Thx a lot Fabian

2014-10-19 15:02:58 -0600 received badge  Famous Question (source)
2014-10-12 02:33:56 -0600 received badge  Enthusiast
2014-10-08 11:36:12 -0600 received badge  Notable Question (source)
2014-10-08 10:11:44 -0600 commented answer must ROS be installed on a remote system to run nodes on it?

Thanks! The moment before you answered, I decided to go with Groovy for now :-) I am installing Ubuntu 12.04 in a VM at the moment, because I stumbled into problems installing it on 14.04. Later I will have a look at installing Groovy on my RPi; that should be the easier way for me for now :-)

2014-10-08 09:36:37 -0600 received badge  Popular Question (source)
2014-10-08 09:19:41 -0600 answered a question must ROS be installed on a remote system to run nodes on it?

As I read some more, I'm thinking rosserial_embeddedlinux could be the right thing for my task... but is it the same here: does ROS have to be installed on both machines, the workstation and the target embedded Linux system?

From what I've seen, it should be easy to install ROS Groovy on the RPi from the standard repositories, but would Groovy on the Pi work together with Indigo on the workstation?

Thanks so far