How to use a ViSP example for visual servoing?

asked 2018-02-02 17:44:32 -0600

Abdu

Hi ROS users,

I have been struggling with this issue for a while.

I want to do visual servoing with a real UR5 arm. I can control the UR5 properly with ur_modern_driver, and I can use visp_auto_tracker to detect a QR code and publish its pose. So far everything works fine with ROS Indigo.

I prepared a ROS node that subscribes to the published pose, but I do not know how to use the PBVS or IBVS ViSP examples and then publish the velocity to the UR5 driver. I can run the PBVS example on its own (it computes the error and the camera velocity), but I don't know how to feed it with my real QR code. I also spent time with the pioneer example, but that is only for a mobile robot and I couldn't use it for the UR5 arm: it gives linear and angular velocity, whereas I need the position and orientation of the UR5 end-effector.

The part where I am stuck: I don't know how to feed the PBVS example with my real QR-code pose (see below) so that it calculates the camera velocity from my real input.

Thanks in advance

This is my ROS node:

// C++ example that subscribes to the QR-code pose published by visp_auto_tracker

#include <vector>

#include "ros/ros.h"
#include "geometry_msgs/PoseStamped.h"

std::vector<geometry_msgs::PoseStamped::ConstPtr> poses;

void handle_poses(const geometry_msgs::PoseStamped::ConstPtr& msg)
{
  ROS_INFO_STREAM("Position X " << msg->pose.position.x);
  ROS_INFO_STREAM("Position Y: " << msg->pose.position.y);
      ROS_INFO_STREAM("Position Z: " << msg->pose.position.z);

  ROS_INFO_STREAM("Orientation X: " << msg->pose.orientation.x);
  ROS_INFO_STREAM("Orientation Y: " << msg->pose.orientation.y);
  ROS_INFO_STREAM("Orientation Z: " << msg->pose.orientation.z);
  ROS_INFO_STREAM("Orientation W: " << msg->pose.orientation.w);
  // Use the msg object here to access the pose elements,
  // like msg->pose.pose.position.x
  poses.push_back(msg);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "listenerccp");

  ros::NodeHandle n;

  ros::Subscriber sub = n.subscribe("/visp_auto_tracker/object_position",
    1000, handle_poses);

  ros::spin();

  return 0;
}

1 Answer


answered 2018-02-04 20:18:49 -0600

Hansondon

updated 2018-02-07 05:14:25 -0600

First of all, I would suggest going through the example here (under the Detailed Description section), because it is a minimal working example. We got our UR3 to work with ViSP based on it. If your camera is not on the end-effector, you have to change the task type to one of the EYETOHAND types; otherwise, use an EYEINHAND type. In our case we use EYETOHAND_L_cVe_eJe, so we need to build features from the desired end-effector pose, supply the current end-effector pose, and read the current Jacobian so that ViSP can calculate the velocity command for all the joints (not the end-effector velocity). A rough sketch of such a task follows.
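This is only an untested sketch, assuming ViSP 3 headers and PBVS-style translation/theta-u features built from the desired-to-current transform cdMc; the gain, the servo type and the way you obtain cVe and eJe all depend on your own setup and are assumptions here:

#include <visp3/core/vpColVector.h>
#include <visp3/core/vpHomogeneousMatrix.h>
#include <visp3/core/vpMatrix.h>
#include <visp3/core/vpVelocityTwistMatrix.h>
#include <visp3/visual_features/vpFeatureThetaU.h>
#include <visp3/visual_features/vpFeatureTranslation.h>
#include <visp3/vs/vpServo.h>

// cdMc: transform from the desired frame to the current camera frame
// cVe : twist transform from the end-effector frame to the camera frame
// eJe : robot Jacobian expressed in the end-effector frame
vpColVector computeJointVelocities(const vpHomogeneousMatrix &cdMc,
                                   const vpVelocityTwistMatrix &cVe,
                                   const vpMatrix &eJe)
{
  vpServo task;
  task.setServo(vpServo::EYETOHAND_L_cVe_eJe);     // eye-to-hand, as in our case
  task.setInteractionMatrixType(vpServo::CURRENT);
  task.setLambda(0.4);                             // control gain (tune for your robot)

  // PBVS features: translation and theta-u rotation between desired and current pose.
  // With the cdMc/cdRc selectors the desired feature value is zero.
  vpFeatureTranslation t(vpFeatureTranslation::cdMc);
  vpFeatureThetaU tu(vpFeatureThetaU::cdRc);
  t.buildFrom(cdMc);
  tu.buildFrom(cdMc);
  task.addFeature(t);
  task.addFeature(tu);

  task.set_cVe(cVe);
  task.set_eJe(eJe);

  // For an EYETOHAND_L_cVe_eJe task the control law returns joint velocities.
  return task.computeControlLaw();
}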

Next, keep in mind that ViSP is a general-purpose library that is not designed only for ROS, so it is up to you to bridge ViSP and ROS. We used tf to get both the desired and current end-effector poses and converted them to ViSP homogeneous matrices with visp_bridge (by the way, we are not using any ViSP visual tracking algorithm, so in your case you may not need to get the desired end-effector pose from tf). For the Jacobian, you can get it from MoveIt!'s getJacobian function. There might be other ways to feed this information, but that was enough for us to get joint velocities from ViSP.
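As a rough illustration of that bridging step (untested; the group name "manipulator", the link name "ee_link" and the surrounding MoveIt! setup are assumptions that depend on your configuration), the conversion and the Jacobian query could look like this:

#include <Eigen/Dense>
#include <geometry_msgs/Pose.h>
#include <moveit/robot_state/robot_state.h>
#include <visp3/core/vpHomogeneousMatrix.h>
#include <visp3/core/vpMatrix.h>
#include <visp_bridge/3dpose.h>   // visp_bridge::toVispHomogeneousMatrix

// Convert a geometry_msgs pose (e.g. from visp_auto_tracker or tf) to a ViSP matrix.
vpHomogeneousMatrix toVisp(const geometry_msgs::Pose &pose)
{
  return visp_bridge::toVispHomogeneousMatrix(pose);
}

// Read the current Jacobian from a MoveIt! RobotState and copy it into a vpMatrix.
// Note: getJacobian() expresses the Jacobian in the group's root frame; you may still
// have to transform it into the end-effector frame before using it as eJe.
vpMatrix currentJacobian(const moveit::core::RobotStatePtr &state)
{
  const moveit::core::JointModelGroup *group = state->getJointModelGroup("manipulator");
  Eigen::MatrixXd J;
  state->getJacobian(group, state->getLinkModel("ee_link"),
                     Eigen::Vector3d::Zero(),  // reference point on the link
                     J);

  vpMatrix eJe(J.rows(), J.cols());
  for (unsigned int r = 0; r < eJe.getRows(); ++r)
    for (unsigned int c = 0; c < eJe.getCols(); ++c)
      eJe[r][c] = J(r, c);
  return eJe;
}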

Finally, since you are also using ur_modern_driver, you can send the velocity command to the robot simply by publishing it to the /ur_driver/joint_speed topic.
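A minimal sketch of that last step, assuming the ur_modern_driver behaviour where only the velocities of the first trajectory point on /ur_driver/joint_speed are used (check your driver version):

#include <ros/ros.h>
#include <trajectory_msgs/JointTrajectory.h>
#include <visp3/core/vpColVector.h>

// Publish the joint velocities computed by ViSP to ur_modern_driver.
void publishJointSpeed(ros::Publisher &pub, const vpColVector &qdot)
{
  trajectory_msgs::JointTrajectory msg;
  trajectory_msgs::JointTrajectoryPoint point;
  for (unsigned int i = 0; i < qdot.getRows(); ++i)
    point.velocities.push_back(qdot[i]);
  msg.points.push_back(point);
  pub.publish(msg);
}

// Typical setup in main():
//   ros::Publisher pub =
//       n.advertise<trajectory_msgs::JointTrajectory>("/ur_driver/joint_speed", 1);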


Comments

Thank you for your reply, @Hansondon. I only want to make the UR5 move with the webcam attached to the end-effector, using IBVS or PBVS; either is fine for now. Do you think there is any example I can run and feed with my own input?

Abdu ( 2018-02-06 08:59:17 -0600 )

I think it would be difficult to find an example you can use directly, unless you have exactly the same setup. But the example I mentioned in the post is really easy to adapt.

Hansondon ( 2018-02-07 05:22:29 -0600 )

I checked the example, but I don't yet know how to implement it with ROS. My concern right now is to run any visual servoing example with ROS and publish the joint velocities.

Abdu ( 2018-02-07 11:11:05 -0600 )

Hello, I am also stuck using ViSP to control an arm. Abdu, have you been successful so far in doing that, i.e. to "run any visual servoing example with ros and publish the joints velocities"?

KARIM ( 2018-06-05 07:09:07 -0600 )

@KARIM, no, I haven't done it yet; I was busy with something else. I could only run the ViSP object tracker, publish a target position in ROS, and subscribe to it. But I don't know how to run any ViSP example that calculates the six joint velocities from my input. What robot do you have? Where are you stuck?

Abdu ( 2018-06-05 08:44:05 -0600 )

@Abdu, to publish joint velocities using ur_modern_driver, you need to publish to the topic /joint_group_vel_controller/command.
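A minimal sketch of publishing to that topic, assuming the ros_control joint_group_vel_controller is loaded and expects a std_msgs/Float64MultiArray with one velocity per joint, in the controller's joint order:

#include <vector>

#include <ros/ros.h>
#include <std_msgs/Float64MultiArray.h>

// Publish one velocity per joint to the velocity group controller.
void publishGroupVelocity(ros::Publisher &pub, const std::vector<double> &qdot)
{
  std_msgs::Float64MultiArray msg;
  msg.data = qdot;   // joint order must match the controller configuration
  pub.publish(msg);
}

// ros::Publisher pub =
//     n.advertise<std_msgs::Float64MultiArray>("/joint_group_vel_controller/command", 1);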

Prem Raj ( 2019-11-07 00:57:24 -0600 )

Hi everyone, I am working on a servoing application, but with a different robot. I have implemented the control laws with ViSP and now I want to close the feedback loop through a JointVelocityController. How can I implement this kind of controller, and to which topics should I publish the velocity? Thanks in advance.

lucamark ( 2021-07-14 15:55:03 -0600 )
