Robotics StackExchange | Archived questions

Object Tracker, /roi and spiky /cmd_vel/angular/z

Hi,

I hope I can get a hint on how to proceed with my problem. I am working on a custom DIY 4WD robot running Linorobot on a Raspberry Pi. The robot detects an object and should simply turn toward it to center the object in the camera frame.

Following the book "ROS By Example Vol. 1", I have an object detector that publishes a region of interest on the /roi topic as a sensor_msgs/RegionOfInterest message each time an object is detected.

My object detector uses TensorFlow and publishes to /roi only about once per second. Using the object_tracker.py code, my robot turns/tracks the object very slowly. It looks like a /roi rate of once per second is not sufficient to make my robot turn.

The object follower from that book subscribes to /roi and publishes geometry_msgs/Twist commands on /cmd_vel, setting angular.z to turn the robot.

The rate ("# How often should we update the robot's motion?") is set to 10 Hz:

self.rate = rospy.get_param("~rate", 10)

Plotting /cmd_vel/angular/z in rqt_plot, I see that the angular velocity is published in a pattern of short spikes, comparable to a PWM duty cycle of, for example, 10%.

This spike-like pattern does not make my robot turn.

My robot only turns when I send /cmd_vel commands at a higher frequency, e.g. when using the teleop keyboard and holding the turn key for some time, which produces a duty-cycle-like pattern of 100% for a short period, i.e. /cmd_vel is published constantly.

Changing max_rotation_speed, min_rotation_speed, gain, etc. does not solve the problem.
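For context, the kind of fix I have been considering is to cache the last /roi-derived angular command and keep republishing it from the 10 Hz loop until the detection goes stale. This is a plain-Python sketch of that latching logic only (rospy omitted; the class and parameter names are my own, not from the book's code):

```python
import time


class CommandLatch:
    """Hold the last angular command so a fixed-rate (~10 Hz) loop can
    keep republishing it between slow (~1 Hz) /roi detections."""

    def __init__(self, timeout=1.5):
        self.timeout = timeout    # seconds before a detection counts as stale
        self.last_cmd = 0.0       # last angular.z computed from /roi
        self.last_stamp = None    # time of the last detection

    def update(self, angular_z, stamp=None):
        """Call from the /roi callback with the newly computed command."""
        self.last_cmd = angular_z
        self.last_stamp = time.time() if stamp is None else stamp

    def get(self, now=None):
        """Call from the fixed-rate loop: returns the latched command while
        the last detection is fresh, otherwise 0.0 (stop turning)."""
        now = time.time() if now is None else now
        if self.last_stamp is None or now - self.last_stamp > self.timeout:
            return 0.0
        return self.last_cmd
```

In the actual node, update() would be called from the RegionOfInterest callback and get() from the rospy.Rate(self.rate) loop that fills in Twist.angular.z, so /cmd_vel is published continuously instead of only when a detection arrives. I am not sure this is the intended approach, which is why I am asking below.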

My questions are:

Can somebody point me in the right direction: what is the best way to track an object, and how should I implement it?

In ros2opencv2.py the /roi is published: https://github.com/pirobot/rbx1/blob/indigo-devel/rbx1_vision/src/rbx1_vision/ros2opencv2.py

link1: http://www.theconstructsim.com/ros-qa-135-how-to-rotate-a-robot-to-a-desired-heading-using-feedback-from-odometry/

Raspbian Buster - ROS Melodic - Linorobot

Asked by arminf82 on 2019-10-03 05:49:24 UTC

Comments

Hi, is this related to image-based visual servoing? Do you know how to do it in ROS-Gazebo? Or could you share the code used to get the ROI of the image using TensorFlow object detection? It would be very helpful. Thanks!

Asked by Anukriti on 2021-10-12 03:16:25 UTC

Answers