
transformed coordinates negated and in wrong order

asked 2015-08-26 08:29:06 -0500

lteacy

I'm trying to get the example code here: ... to project object positions from gazebo into an image frame.

To get it to work correctly, I had to introduce a horrible hack - negating and rearranging the order of the coordinates on line 63:

    // original:
    cv::Point3d pt_cv(pt.x(), pt.y(), pt.z());
    // my hack:
    cv::Point3d pt_cv(-pt.y(), -pt.z(), pt.x());

Obviously, this is less than satisfactory, and I'm sure I'm doing something wrong upstream. Any suggestions as to how I might identify the source of the problem would be very much appreciated.



2 Answers


answered 2015-08-26 10:20:03 -0500

dornhege

updated 2015-08-26 10:20:24 -0500

This looks very much like you're mixing up the camera frame and the optical frame. They are two different coordinate systems. Usually a driver launch script (e.g. for the Kinect) provides the transformation between the two. If you apply that transform, it will perform your horrible hack in a controlled manner.



works now - thanks!

lteacy (2015-08-28 13:34:07 -0500)

answered 2015-08-26 10:23:29 -0500

This sounds a lot like a mix-up between the ROS standard frame convention and the special convention used for optical frames, as described in REP-0103 (see the "Suffix Frames" section). Normally, the setup with two frames per camera allows easy conversion via tf.



Many thanks to both - this does appear to be the issue. Works perfectly now (without the hack ;-)

lteacy (2015-08-28 13:33:38 -0500)
