
OWI-535 USB Robotic Arm

asked 2011-07-09 02:13:17 -0600 by xalu

updated 2014-01-28 17:10:00 -0600 by ngrennan

I am looking for information on how I can get this arm to work with ROS. I am new to C and C++, but have programmed a little in Java and PHP. I'd like to get the arm working with some of the existing ROS arm scripts if possible. Could someone point me in the right direction?

From the manufacturer:

The RAI-Pro USB robotic arm interface kit connects OWI's 535 Robotic Arm Edge to a Windows PC's USB port. The interface software allows real-time interactive control and contains a built-in interactive script writer. A user may write a script that contains up to 99 individual robotic arm functions (including pauses) in a single script file. Script files may be saved to and loaded from disk just like any other standard computer file.


2 Answers


answered 2011-07-10 23:58:24 -0600 by Martin Günther

Just a warning ahead: this is going to be a lot of work.

Step 1: controlling the arm from code

You need a way to control the motors from code, ideally under Linux (since ROS is best supported on Linux). Luckily for you, somebody has already reverse-engineered the USB protocol, so you will probably be able to build on that.

First, I would write a small test program that can move each motor of the arm to a desired joint angle and that publishes JointState messages for each joint on the /joint_states topic (ignore the gripper for now; that's going to be easy later). I'm not sure how well that's going to work, since the arm doesn't measure its joint angles, and the only available commands seem to be "start/stop movement in some direction". You'll probably have to time the commands and compute the resulting joint angles.
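To make the dead-reckoning idea concrete, here is a minimal sketch (Python, ROS 1 / rospy) of a node that integrates commanded joint velocities over time and publishes the estimated angles as JointState messages. The send_usb_command() helper, the joint names, and the speed constant are hypothetical placeholders; you would replace them with the reverse-engineered USB protocol and speeds you measured yourself.

```python
#!/usr/bin/env python
# Sketch: dead-reckoning joint state publisher for an arm without encoders.
import rospy
from sensor_msgs.msg import JointState

JOINT_NAMES = ['base', 'shoulder', 'elbow', 'wrist']  # assumed names
JOINT_SPEED = 0.5  # rad/s; a guess, calibrate each joint with a stopwatch

class DeadReckoningArm(object):
    def __init__(self):
        self.pub = rospy.Publisher('/joint_states', JointState, queue_size=10)
        self.angles = [0.0] * len(JOINT_NAMES)      # assume a known start pose
        self.velocities = [0.0] * len(JOINT_NAMES)  # +speed, -speed or 0
        self.last_time = rospy.Time.now()

    def command(self, joint, direction):
        """direction: +1, -1 or 0 (stop)."""
        self.velocities[joint] = direction * JOINT_SPEED
        # send_usb_command(joint, direction)  # hypothetical USB call

    def update(self):
        now = rospy.Time.now()
        dt = (now - self.last_time).to_sec()
        self.last_time = now
        for i, v in enumerate(self.velocities):
            self.angles[i] += v * dt  # integrate, since there are no encoders
        msg = JointState()
        msg.header.stamp = now
        msg.name = JOINT_NAMES
        msg.position = list(self.angles)
        self.pub.publish(msg)

if __name__ == '__main__':
    rospy.init_node('owi535_dead_reckoning')
    arm = DeadReckoningArm()
    rate = rospy.Rate(20)
    while not rospy.is_shutdown():
        arm.update()
        rate.sleep()
```

The estimate will drift, of course; that is exactly the inaccuracy discussed in step 3 below.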

Step 2: create a URDF model of the arm

This can be done in parallel with the first step; follow the URDF tutorials. It would be great if you had CAD models of your arm, but to begin with, simple cylinders/boxes should do. Ideally, after completing both steps, you should be able to control the joints of the arm and visualize the result in RViz. You could even point a Kinect camera at your arm, add that point cloud to RViz, and check how well the actual arm movements and the URDF model match up.
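As a starting point, a URDF of this kind can be very small. The sketch below shows one joint; all dimensions and limits are made up and would need to be measured from the real arm. Note that the joint name matches the names published on /joint_states in step 1, so robot_state_publisher can animate the model in RViz.

```xml
<?xml version="1.0"?>
<!-- Minimal URDF sketch; all dimensions and limits are made up. -->
<robot name="owi535">
  <link name="base_link">
    <visual>
      <geometry><box size="0.10 0.10 0.05"/></geometry>
    </visual>
  </link>
  <link name="shoulder_link">
    <visual>
      <geometry><cylinder radius="0.02" length="0.09"/></geometry>
    </visual>
  </link>
  <joint name="base" type="revolute">
    <parent link="base_link"/>
    <child link="shoulder_link"/>
    <origin xyz="0 0 0.05" rpy="0 0 0"/>
    <axis xyz="0 0 1"/>
    <limit lower="-2.36" upper="2.36" effort="1.0" velocity="1.0"/>
  </joint>
  <!-- add shoulder, elbow, wrist and gripper the same way -->
</robot>
```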

Step 3: arm navigation

The first two steps should have kept you busy for a while. :-)

By now, just by using the joint angles, you should already be able to do some interesting things with the arm, and you should have a feeling for what can and cannot be done with it. If you want to go further, you could try to follow this tutorial to get arm navigation running. That would enable you to pick and place objects recognized in a point cloud (e.g., from a Kinect) while avoiding obstacles. However, this is going to be a lot of work, and I'm not sure if your arm is maybe too inaccurate (especially when the joint angles can only be computed using some dead reckoning) to do that. Perhaps you need to go for some simpler solution.

Feel free to ask more questions. I'm really looking forward to seeing how your project goes.


Comments

Well, I will have to try it. The arm is just too cheap not to at least try: $70 including the USB controller. I'll update when/if I've made progress and then add it to the wiki.

xalu (2011-07-12 03:01:53 -0600)

Has anyone had any success creating a Gazebo simulation for the robot or URDFs for the robot?

Thanks!

Abhinavgandhi09 (2020-11-05 11:09:26 -0600)

answered 2011-07-15 08:05:25 -0600 by qweeg

Hi, I recently bought one of those robotic arms too. I attached a PS3 Sixaxis controller to the section of the arm that controls the gripper angle and used its accelerometer to provide position feedback for the motors.

As Martin Günther wrote above: "I'm not sure if your arm is maybe too inaccurate (especially when the joint angles can only be computed using some dead reckoning) to do that."

He is completely right. The arm is very inaccurate, and the software provided to program the arm's movements highlights that. I ran a few commands I had saved, and most of the time the robot was way off its original position when the script was reversed, and the joints tended to hit their maximum angles and grind their gears.

To use the PS3 controller, I used a gamepad keyboard-mapping program called MotionJoy. This let me read movement when the controller tilted in different directions and map those tilts to the keys that control the robot in the software provided with the kit. I was then able to control the robot by tilting the controller in the corresponding direction (controlling the main arm motor and the base rotation motor).

I then changed the mapped keys to control the middle arm section. I mapped the "move up" command to a downward tilt of the controller and vice versa, so that the arm held a given position.

In a previous project I used threads in C++ to catch key presses for a game. If you want a quick, kind of bodged solution, you could use the same technique to control the arm: catch input from the controller and simulate the keys that control the robot, basically using the controller just to tell you how far the motors have gone. This would give much more accurate movement of the robot for other tasks.
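A rough sketch of that idea in Python, using pygame (which can read joystick axes under Linux; under Windows you would keep MotionJoy). The send_arm_command() helper is a hypothetical stand-in for the step-1 USB code, and the axis index and threshold depend on your driver and need tuning:

```python
# Sketch: read controller tilt with pygame and issue start/stop
# arm commands directly, instead of simulating keypresses.
import pygame

TILT_THRESHOLD = 0.3  # normalized axis value; tune experimentally

def send_arm_command(motor, direction):
    print('motor %s -> %+d' % (motor, direction))  # placeholder

pygame.init()
pygame.joystick.init()
stick = pygame.joystick.Joystick(0)
stick.init()

while True:
    pygame.event.pump()                # refresh joystick state
    tilt = stick.get_axis(3)           # pick the axis that reports tilt
    if tilt > TILT_THRESHOLD:
        send_arm_command('elbow', -1)  # tilt down -> arm up, as described
    elif tilt < -TILT_THRESHOLD:
        send_arm_command('elbow', +1)
    else:
        send_arm_command('elbow', 0)   # inside dead zone: stop and hold
    pygame.time.wait(50)               # ~20 Hz polling
```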

I am also trying to take it one step further by attaching a webcam to the head of the gripper section and using OpenCV to process the images for edge detection and colour blob tracking. This would be useful if you wanted your robot to automatically find certain objects and pick them up, sorting piles of objects for instance.
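The colour blob part is quite short with OpenCV's Python bindings. A minimal sketch (the HSV range below is a rough guess for a blue object and needs tuning for yours):

```python
# Sketch: track the largest coloured blob in a webcam image with OpenCV.
import cv2

cap = cv2.VideoCapture(0)
lower, upper = (100, 120, 70), (130, 255, 255)  # rough HSV range for blue

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)
    # Note: OpenCV 3.x returns (image, contours, hierarchy) here instead.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        blob = max(contours, key=cv2.contourArea)
        x, y, w, h = cv2.boundingRect(blob)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow('blob', frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```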

And even one step further... lol. After all that, I think I am going to use OpenGL to render a 3D model of the robot that moves in real time using the controller's feedback. And for the icing on the cake: by adding another webcam next to the first one, I could use OpenCV feature detection on both images to work out how far an object is from the cameras, using the difference in feature positions between the images: http://en.wikipedia.org/wiki/Triangulation_%28computer_vision%29
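For the two-camera depth idea, OpenCV's block matcher gives a dense disparity map from a calibrated, rectified pair; depth then follows from depth = focal_length * baseline / disparity. A sketch, assuming left.png and right.png are already rectified (the file names are placeholders):

```python
# Sketch: depth from two webcams via block matching, assuming the pair has
# already been calibrated and the images rectified (see cv2.stereoRectify).
import cv2

left = cv2.imread('left.png', cv2.IMREAD_GRAYSCALE)
right = cv2.imread('right.png', cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)  # fixed-point result, scaled by 16

# Depth per pixel: depth = focal_length_px * baseline_m / (disparity / 16.0);
# focal length and baseline come out of the stereo calibration.
vis = cv2.convertScaleAbs(disparity, alpha=255.0 / (64 * 16))
cv2.imshow('disparity', vis)
cv2.waitKey(0)
```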

So, using a couple of webcams, a PS3 controller, and the software to interface it all, you could end up with a robot that can avoid objects and map objects' positions into ...


Comments

Hey qweeg, I would love it if you shared your progress and code. Sounds like a lot of great ideas. Have you thought about using one of those cheaper 3D cameras? As I understand it, they can give you stereo vision, which might replace the dual cameras on the arm. Which is a great idea!

xalu (2011-07-27 02:44:48 -0600)

xalu and qweeg, I wonder if you or anybody else made progress with OWI's 535 Robotic Arm Edge in ROS and OpenCV with two cameras. Would you share the code?

sd (2014-04-24 10:53:06 -0600)
