Object Tracker, /roi and spiky /cmd_vel/angular/z

asked 2019-10-03 05:49:24 -0500


I hope I can get a hint on how to proceed with my problem. I am working on a custom DIY 4WD robot (Linorobot on a Raspberry Pi) that detects an object and should simply turn toward it to center the object in the camera frame.

Following the book "ROS By Example Vol. 1", I have an object detector that publishes a region of interest on the topic /roi as a sensor_msgs/RegionOfInterest message each time an object is detected.

My object detector uses TensorFlow and publishes to /roi only about once per second. Using the object_tracker.py code, my robot turns/tracks the object very slowly. It seems that a /roi rate of once per second is not sufficient to make my robot turn.

The object follower from that book subscribes to /roi and publishes geometry_msgs/Twist commands on /cmd_vel, setting the angular z component to turn the robot.
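For context, the kind of proportional controller such a follower uses can be reduced to a small pure function. This is my own sketch, not the book's code; the gain and speed limit values are assumptions, and the ROS plumbing (subscriber callback, Twist message) is only indicated in the docstring:

```python
def angular_z_from_roi(roi_x_offset, roi_width, image_width,
                       gain=2.0, max_speed=1.0):
    """Proportional turn command from a sensor_msgs/RegionOfInterest.

    roi_x_offset, roi_width: x_offset and width fields of the ROI message.
    image_width: camera frame width in pixels.
    Returns angular.z in rad/s; positive (left turn, REP 103 convention)
    when the object is left of the image center.
    """
    roi_center = roi_x_offset + roi_width / 2.0
    # Normalized horizontal error in [-0.5, 0.5]; 0 means centered.
    error = (image_width / 2.0 - roi_center) / image_width
    speed = gain * error
    # Clamp to the configured maximum rotation speed.
    return max(-max_speed, min(max_speed, speed))
```

In the node, the /roi callback would call this and write the result into `twist.angular.z` before publishing on /cmd_vel.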

The rate ("How often should we update the robot's motion?") is set to 10 Hz:

self.rate = rospy.get_param("~rate", 10)

Checking rqt_plot /cmd_vel/angular/z, I see that the angular velocity is published in a pattern of short spikes, comparable to a PWM duty cycle of, say, 10%.

This spike-like pattern does not make my robot turn.

My robot only turns when cmd_vel commands arrive at a higher frequency, e.g. when using the teleop keyboard and holding the turn key for some time, which publishes cmd_vel continuously (a duty-cycle-like pattern of 100% for a short time).
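One workaround consistent with this observation is to keep republishing the last command at the loop rate until either a new /roi arrives or a timeout expires, so a 1 Hz detection still produces a continuous /cmd_vel stream. A minimal sketch of the hold/timeout logic (the function name and the timeout value are my own assumptions, not from the book's code):

```python
def latched_angular_z(last_cmd, time_since_roi, timeout=1.5):
    """Hold the last turn command between sparse /roi updates.

    last_cmd: angular.z computed from the most recent ROI.
    time_since_roi: seconds elapsed since that ROI arrived.
    timeout: how long to keep turning without a fresh detection;
    after that, command zero so the robot never spins on stale data.
    """
    if time_since_roi < timeout:
        return last_cmd
    return 0.0
```

In the node, a 10 Hz `rospy.Rate` loop would call this every cycle and publish the result, instead of publishing only from inside the /roi callback; that turns the 10% "duty cycle" into a sustained command between detections.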

Changing max_rotation_speed, min_rotation_speed, gain, etc. does not solve the problem.

My questions are:

  • What is the best way to make the robot turn at least a few degrees even when a /roi message arrives only once per second? At the moment it takes around 15-20 cycles to make a 15° turn.

  • Is there an easy way to adapt the current code to turn to a heading using /odom or /imu? (Something like in link1?) e.g.

  • Set heading of the robot to 0°

  • An object is found in the image at 15°
  • Turn robot to 15° (regardless of whether /roi keeps being published?)

Can somebody point me in the right direction: what would be the best way to track an object, and how should I implement it?

In ros2opencv2.py the /roi is published: https://github.com/pirobot/rbx1/blob/...

link1: http://www.theconstructsim.com/ros-qa...

Raspbian Buster - ROS Melodic - Linorobot



Hi, is this related to image-based visual servoing? Do you know how to do it in ROS-Gazebo? Or could you share the code used to get the ROI of the image using TensorFlow object detection? It would be very helpful. Thanks!

Anukriti  ( 2021-10-12 03:16:25 -0500 )