Is it possible to track a custom ROI with ViSP?
Hi community!
I'm new to visual servoing,
and I'm trying to get my robot to track me. That sounds cool!
So far I've developed a tracking system (in the image itself) that gives me a ROI as output: it can now draw a bounding box around me in each frame.
But how do I convert what it sees into motion commands? Theoretically, the control law
is something like:
if I'm not in the image center, the robot should rotate to get me centered, and if I appear too small it should move forward until I reach a pre-defined desired size, which corresponds to a desired distance between me and the robot, and so on.
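To make the idea concrete, this is the kind of naive proportional controller I have in mind, written as a plain ROS node. The topic names (person_roi, cmd_vel), the image width, the desired ROI width and the gains are just placeholders for my setup:

```cpp
// Sketch only: ROI-following node. "person_roi" is the (hypothetical) topic where my
// tracker publishes the bounding box as a sensor_msgs/RegionOfInterest.
#include <ros/ros.h>
#include <sensor_msgs/RegionOfInterest.h>
#include <geometry_msgs/Twist.h>

ros::Publisher cmd_pub;

const double image_width = 640.0;        // my camera resolution (placeholder)
const double desired_roi_width = 200.0;  // ROI width when I am at the desired distance (placeholder)
const double k_rot = 0.002;              // gain: centering error in pixels -> rad/s
const double k_fwd = 0.003;              // gain: size error in pixels -> m/s

void roiCallback(const sensor_msgs::RegionOfInterest::ConstPtr &roi)
{
  geometry_msgs::Twist cmd;

  // Rotate until the ROI center reaches the image center (sign depends on camera mounting).
  double x_center = roi->x_offset + roi->width / 2.0;
  cmd.angular.z = -k_rot * (x_center - image_width / 2.0);

  // Move forward/backward until the ROI has the desired width,
  // which stands in for the desired distance to the person.
  cmd.linear.x = k_fwd * (desired_roi_width - static_cast<double>(roi->width));

  cmd_pub.publish(cmd);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "roi_follower");
  ros::NodeHandle nh;
  cmd_pub = nh.advertise<geometry_msgs::Twist>("cmd_vel", 1);
  ros::Subscriber sub = nh.subscribe("person_roi", 1, roiCallback);
  ros::spin();
  return 0;
}
```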
But how should I really get this done in code? How can I use the ViSP software tools, and especially the ROS package, to achieve this?
I've been looking at the package code and at the demo_pioneer example, but I'm still confused about how to do this when tracking not a QR code or a template but an arbitrary ROI coming from my own tracking module!?
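From reading the Pioneer tutorials, my best guess is something like the sketch below, where I replace the QR code by the center and width of my own ROI. I'm not at all sure this is the intended way to use the API: the camera parameters, the desired ROI width, the depth approximation Z = Z* * w*/w and the use of vpPioneer for cVe/eJe are all rough guesses on my side.

```cpp
// Sketch only: building a ViSP visual-servoing task from a custom ROI, in the spirit
// of the Pioneer tutorials (x-coordinate feature for centering, log(Z/Z*) depth
// feature for the distance). All numeric values below are placeholders.
#include <visp3/core/vpCameraParameters.h>
#include <visp3/core/vpColVector.h>
#include <visp3/core/vpMatrix.h>
#include <visp3/core/vpPixelMeterConversion.h>
#include <visp3/core/vpVelocityTwistMatrix.h>
#include <visp3/robot/vpPioneer.h>
#include <visp3/visual_features/vpFeatureDepth.h>
#include <visp3/visual_features/vpFeaturePoint.h>
#include <visp3/vs/vpServo.h>

#include <cmath>

int main()
{
  vpCameraParameters cam(800, 800, 320, 240); // replace with the real calibration (px, py, u0, v0)

  double Zd = 1.5;             // desired distance to the person [m]
  double desired_width = 200.; // ROI width in pixels when the person is at that distance

  // Current and desired features: x-coordinate of the ROI center (centering)
  // and log(Z/Z*) (distance keeping).
  vpFeaturePoint s_x, s_xd;
  vpFeatureDepth s_Z, s_Zd;
  s_xd.buildFrom(0, 0, Zd);    // desired: ROI centered in the image
  s_Zd.buildFrom(0, 0, Zd, 0); // desired: Z = Z*, i.e. log(Z/Z*) = 0

  // Camera mounted on a unicycle base: cVe and eJe taken from ViSP's Pioneer model.
  vpPioneer pioneer;
  vpVelocityTwistMatrix cVe = pioneer.get_cVe();
  vpMatrix eJe = pioneer.get_eJe();

  vpServo task;
  task.setServo(vpServo::EYEINHAND_L_cVe_eJe);
  task.setInteractionMatrixType(vpServo::DESIRED, vpServo::PSEUDO_INVERSE);
  task.setLambda(0.5);
  task.set_cVe(cVe);
  task.set_eJe(eJe);
  task.addFeature(s_x, s_xd, vpFeaturePoint::selectX());
  task.addFeature(s_Z, s_Zd);

  bool tracking = true;
  while (tracking) {
    // --- here: grab an image and run my own tracker to get the current ROI ---
    double roi_u = 320., roi_v = 240., roi_width = 200.; // placeholders for the tracker output

    // ROI center from pixels to normalized image coordinates.
    double x = 0, y = 0;
    vpPixelMeterConversion::convertPoint(cam, roi_u, roi_v, x, y);

    // Very rough depth estimate from the apparent width of the ROI.
    double Z = Zd * desired_width / roi_width;

    // Update the current features and recompute the command.
    s_x.buildFrom(x, y, Z);
    s_Z.buildFrom(x, y, Z, std::log(Z / Zd));

    vpColVector vel = task.computeControlLaw(); // vel[0]: forward velocity, vel[1]: rotational velocity
    // --- here: send vel to the robot, e.g. publish it as a geometry_msgs/Twist ---
    (void)vel;
    tracking = false; // a real node would loop at the camera frame rate
  }
  task.kill();
  return 0;
}
```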
I would be grateful if someone could give me an overview of the package and how it works. Is it better to use it for this task, or to start from scratch? Any other suggestions of tools, links, and packages that might help are welcome!
Thanks,
Jasmine.
Asked by Jasmin on 2017-10-30 06:21:23 UTC
Comments
In any case, I'm not willing to wear a T-shirt with a huge QR code on it, no way :D!
Asked by Jasmin on 2017-10-30 11:07:24 UTC
Did you first try to make it move based on a QR code? If so, please share it, and can you tell me what kind of robot you are using?
Asked by Abdu on 2018-01-26 10:57:42 UTC
I was using the Mini-Lab robot, but I didn't carry on with ViSP. Since I had a laser on the robot, the leg_tracker package made the person-following task much easier.
Asked by Jasmin on 2018-01-27 11:43:36 UTC
If you want your robot to track a QR code, you may be interested in the demo_pioneer package. It is a complete example of how to use ViSP to detect a QR code and send the appropriate velocity commands to the robot so that it tracks the code.
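If I remember correctly, the detection side boils down to something like this rough sketch (it assumes ViSP was built with zbar support; image acquisition and the servoing loop are left out):

```cpp
// Sketch only: detect a QR code with ViSP and get its image position.
#include <visp3/core/vpImage.h>
#include <visp3/core/vpImagePoint.h>
#include <visp3/core/vpRect.h>
#include <visp3/detection/vpDetectorQRCode.h>

int main()
{
  vpImage<unsigned char> I(480, 640); // grayscale image, normally filled by the camera grabber

  vpDetectorQRCode detector;
  if (detector.detect(I) && detector.getNbObjects() > 0) {
    vpImagePoint cog = detector.getCog(0); // center of gravity of the first detected code
    vpRect bbox = detector.getBBox(0);     // its bounding box in the image
    // cog and bbox then feed the visual-servoing task that computes the velocity commands.
    (void)cog;
    (void)bbox;
  }
  return 0;
}
```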
Asked by Jasmin on 2018-01-27 11:53:00 UTC
I am using an ArUco marker to control the motion of a UR5 arm. I can detect the marker's pose and the end-effector in RViz, but I don't know how to make the end-effector track the marker. I mean, the end-effector should subscribe to the pose of the marker.
Asked by Abdu on 2018-01-27 15:23:27 UTC
I've just seen your question. I've never worked with similar hardware and software before, but I'll try to look around and let you know if I figure out how to do it, because I'm a little bit curious about it :).
Asked by Jasmin on 2018-01-28 03:46:24 UTC