
Transformation between two coordinate frames (tf)

asked 2023-05-08 14:22:27 -0500

infraviored

updated 2023-05-19 07:59:36 -0500

PROBLEM SOLVED

The previous issue arose because the tracepen matrix was configured incorrectly: its frame was not right-handed. I resolved this by extracting the rotation and adjusting the orientation of the x and y axes accordingly. As a result, the correct origin is now determined, and it is unaffected by the position of the calibration socket.

Moreover, this updated function incorporates improved error handling mechanisms to ensure smoother operation.

The function:

def tracepenMatrix(self):
    """
    Return the pose of the tracepen device as a homogeneous 4x4 matrix.

    This method retrieves the 3x4 pose matrix from the controller, converts it
    into a numpy array, and embeds it in the first three rows of a 4x4 identity
    matrix. The resulting matrix represents the pose of the tracepen in
    homogeneous coordinates.

    The transformation matrix consists of a 3x3 rotation matrix and a 3x1
    translation vector. The x and y columns of the rotation are negated (a
    180 degree rotation about the z axis) to align the axes with the expected
    frame.

    Returns:
        np.ndarray or None: A 4x4 homogeneous transformation matrix representing
                            the pose of the tracepen, or None if the tracepen
                            device is not available.
    """
    try:
        # Fetch the 3x4 pose matrix from the controller device
        raw_tracepen_matrix = np.array(list(self.v.devices["controller_1"].get_pose_matrix()))
    except KeyError:
        print("Controller not available. Are you sure the tracepen is turned on?")
        return None
    except TypeError:
        print("Tracepen not available. Is SteamVR running?")
        return None

    # Start from a 4x4 identity matrix
    tracepen_matrix = np.eye(4)

    # Set the translation vector in the matrix
    tracepen_matrix[:3, 3] = raw_tracepen_matrix[:3, 3]

    # Negate the x and y columns (a 180 degree rotation about the z axis)
    flip_xy_matrix = np.diag([-1, -1, 1])

    # Extract the rotation block of the raw matrix, apply the flip, and store it
    tracepen_matrix[:3, :3] = raw_tracepen_matrix[:3, :3] @ flip_xy_matrix

    return tracepen_matrix
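As a quick sanity check on the matrix this function returns, the rotation block should be orthonormal with determinant +1 and the bottom row should be [0, 0, 0, 1]. A minimal sketch, assuming numpy; is_valid_transform is a hypothetical helper, not part of the code above:

```python
import numpy as np

def is_valid_transform(T, tol=1e-6):
    """Check that a 4x4 homogeneous transform has an orthonormal,
    right-handed rotation block and a proper bottom row."""
    R = T[:3, :3]
    orthonormal = np.allclose(R @ R.T, np.eye(3), atol=tol)
    right_handed = np.isclose(np.linalg.det(R), 1.0, atol=tol)
    bottom_row_ok = np.allclose(T[3], [0, 0, 0, 1], atol=tol)
    return orthonormal and right_handed and bottom_row_ok

# The identity pose passes; a mirrored (det = -1, left-handed) block fails.
assert is_valid_transform(np.eye(4))
bad = np.eye(4)
bad[0, 0] = -1  # mirror along x -> det(R) = -1, left-handed
assert not is_valid_transform(bad)
```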

Many thanks for your help! I am so glad that this is finally solved.


PROGRESS. 90% THERE!

The problem is so stupid I am ashamed to share where it came from.

Additionally: is the triad_vr you are using a local customised version? Searching for getPoseQuaternion() only returns your two posts on SO and this ROS Answers post.

Yep. That is it. In my defense, I got the workspace from my supervisor and never questioned whether the base was faulty. But it is.

getPoseQuaternion() should not exist. The name alone is suspicious.

A pose has 6 DOF: translation AND rotation. A quaternion only encodes the 3 rotational DOF. The implementation was complete nonsense and also not necessary at all.

triad_vr comes with get_pose_matrix(), which I want anyway for further calculation of the calibration matrix.
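The 3x4 matrix that get_pose_matrix() describes carries the full 6 DOF: the 3x3 rotation block (3 DOF of orientation) and the last column (3 DOF of translation). A small sketch with a hypothetical sample matrix, assuming numpy, showing the decomposition and the embedding into a homogeneous 4x4:

```python
import numpy as np

# Hypothetical 3x4 [R | t] pose, shaped like the output of get_pose_matrix():
raw = np.array([[1.0, 0.0, 0.0, 0.5],
                [0.0, 1.0, 0.0, 0.2],
                [0.0, 0.0, 1.0, 1.0]])

rotation = raw[:3, :3]    # 3 rotational DOF
translation = raw[:3, 3]  # 3 translational DOF

# Embed both into a homogeneous 4x4, as tracepenMatrix() does:
T = np.eye(4)
T[:3, :3] = rotation
T[:3, 3] = translation
```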


With this, things look much better.

I changed the socket orientation to support the tracepen as I want it. The tracepen axes are:

  • z axial, positive towards the tip
  • x towards the right
  • y down (right-handed)
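Those axes can be mapped onto the ROS body-frame convention (REP-103: x forward, y left, z up) with a fixed rotation. A sketch, assuming numpy; R_ros_pen is a hypothetical name, and the mapping follows from the axis list above (pen z/tip -> ROS x, pen x/right -> -ROS y, pen y/down -> -ROS z):

```python
import numpy as np

# Columns are the tracepen axes expressed in the ROS body frame.
R_ros_pen = np.array([[0, 0, 1],
                      [-1, 0, 0],
                      [0, -1, 0]], dtype=float)

# det +1 confirms the remap is a pure rotation (handedness preserved).
assert np.isclose(np.linalg.det(R_ros_pen), 1.0)

# A motion along the pen's tip axis maps to ROS "forward":
v_pen = np.array([0.0, 0.0, 1.0])  # along the pen tip
v_ros = R_ros_pen @ v_pen          # -> [1, 0, 0]
```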

Now I have the following behaviour:

  • Moving the (real) pen in z direction moves it in z direction. Rotations around z are also as expected.
  • Moving in +x (real) gives ...

Comments

Is there any approach I can try? Anything related? I must make progress.

infraviored  ( 2023-05-09 03:55:32 -0500 )

Same Q asked on https://robotics.stackexchange.com/qu.... Actually, reviewing https://wiki.ros.org/Support it's unclear whether duplicating the same Q on different domains "violates" any rule. Just FYI, active maintainers on this site have preferred not duplicating, even on different sites.

130s  ( 2023-05-12 05:10:00 -0500 )

@infraviored: have you taken a step back and verified the pose of your 'tracepen' is actually correctly reported to your ROS application? I'd agree with the comments by @Mike Scheutzow and verify chirality of the tracepen's system (should be in the documentation somewhere I'd assume) in addition to making sure the data reported by the tracepen is compatible with how TF represents poses and rotations.

So: right handed, X+ forward, Y+ left, Z+ up (body local frame). Orientations are quaternions, with xyzw order. Distances are metres, rotations in radians. See also REP-103: Standard Units of Measure and Coordinate Conventions.

It'd be better to start with known good data than to try and hack your way around it later.

You should be able to check all this with some simple experiments combined with the documentation of your device(s), which would not depend on any kind of calibration.
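One such check is building a known quaternion by hand and comparing it against what the device reports. A sketch, assuming only the standard library; the xyzw field order matches geometry_msgs/Quaternion, and SciPy's Rotation.as_quat() also uses xyzw by default, while some VR libraries use wxyz:

```python
import math

# A +90 degree yaw (rotation about z) in ROS xyzw order:
half = math.radians(90.0) / 2.0
qx, qy, qz, qw = 0.0, 0.0, math.sin(half), math.cos(half)

# geometry_msgs/Quaternion fields would be filled as x=qx, y=qy, z=qz, w=qw.
```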

gvdhoorn  ( 2023-05-13 01:43:14 -0500 )

In addition to what @130s reported, it seems there are multiple concurrent discussions about this going on. Even on SO there are already duplicates: ROS: Transform one CS into another one.

Cross-posting is not very nice, even if you are "desperate".

gvdhoorn  ( 2023-05-13 01:46:54 -0500 )

Additionally: is the triad_vr you are using a local customised version? Searching for getPoseQuaternion() only returns your two posts on SO and this ROS Answers post.

The function doesn't exist in TriadSemi/triad_openvr. It does have get_pose_quaternion() though. This calls convert_to_quaternion(..), which seems to return [x,y,z,r_w,r_x,r_y,r_z]. Provided that's all correct, the quaternion order it returns is wxyz, which would seem to make your conversion correct, at least when it comes to the indices.

Whether the chirality of triad_vr matches ROS I can't say.
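If convert_to_quaternion(..) does return [x,y,z,r_w,r_x,r_y,r_z] as described above, the rotation part has to be reordered from wxyz to ROS' xyzw before publishing. A minimal sketch with a hypothetical sample pose:

```python
# Hypothetical sample in the [x, y, z, r_w, r_x, r_y, r_z] layout
# (translation first, then a wxyz quaternion -- here the identity rotation):
pose = [0.1, 0.2, 0.3, 1.0, 0.0, 0.0, 0.0]

x, y, z = pose[0:3]
r_w, r_x, r_y, r_z = pose[3:7]

# ROS (geometry_msgs) expects xyzw:
quat_xyzw = [r_x, r_y, r_z, r_w]
```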

gvdhoorn  ( 2023-05-13 01:59:47 -0500 )

Something else to check: even if the chirality matches, that doesn't mean the axes of ROS and triad_vr match. Unexpected translations -- and rotations -- could be explained by mismatched axes (e.g. triad_vr's X is actually ROS' Y).

gvdhoorn  ( 2023-05-13 02:07:58 -0500 )

Thank you so much for all of these responses! I can't wait to meet my robot again on Tuesday. If I could, I would skip the weekend and go there directly. I cannot wait to finally figure out this bug; it's haunting me in my sleep.

infraviored  ( 2023-05-14 12:46:44 -0500 )

Oh my god. You saved me. The triad_vr was customized / broken.

Now the only thing remaining is inverting the x and y axes.

infraviored  ( 2023-05-16 11:55:42 -0500 )

1 Answer


answered 2023-05-09 09:45:29 -0500

Mike Scheutzow

The approach you are using is not valid. Your transform tree could look like:

world->tracepen

To be clear, this is the transform from the world-origin to the tracepen-origin. The pose of the device is reported in the tracepen frame. Based on the information you provided, this device pose does not need a frame of its own.

During calibration, you need to calculate the world->tracepen transform. To calculate this, you need the transform from world to the socket (in the world frame), and you need the current pose of the device (in the tracepen frame).

Once you are publishing the world->tracepen transform, you can then use the tf2_ros.Buffer.transform() method to convert any pose of the device into the world frame.
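The calibration step described above amounts to one matrix composition. A sketch using numpy homogeneous matrices; all names and values are hypothetical, and the socket pose would come from the robot model while the plugged pose comes from the device:

```python
import numpy as np

def invert_homogeneous(T):
    """Invert a rigid 4x4 transform analytically: R -> R.T, t -> -R.T @ t."""
    Tinv = np.eye(4)
    R = T[:3, :3]
    Tinv[:3, :3] = R.T
    Tinv[:3, 3] = -R.T @ T[:3, 3]
    return Tinv

# Known pose of the socket in the world frame (hypothetical values):
T_world_socket = np.eye(4)
T_world_socket[:3, 3] = [1.0, 0.0, 0.5]

# Pose reported by the device while plugged into the socket (hypothetical):
T_penorigin_penplugged = np.eye(4)
T_penorigin_penplugged[:3, 3] = [0.2, 0.1, 0.0]

# world -> tracepen_origin = (world -> socket, where the plugged pen sits)
#                          @ inverse(tracepen_origin -> plugged pen)
T_world_penorigin = T_world_socket @ invert_homogeneous(T_penorigin_penplugged)
```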


Comments

Thank you so much for answering me! Really, SO much. You read all this and apparently already know my issue.

However, I still struggle a bit to fully understand your response.

So I understand why my old approach is wrong. But still not how to get tf(world, tracepen) from tf(world, socket) and tf(tracepen_origin, tracepen_now).

I added a structured answer below. Maybe you can tell me in which step the error is.

infraviored  ( 2023-05-11 04:51:47 -0500 )

I think I almost did what you suggested now; the systems coincide right after calibration, but moving causes unexpected behaviour.

infraviored  ( 2023-05-11 10:04:37 -0500 )

matrix_world_tracepenOrigin = matrix_world_tracepenPlugged @ matrix_tracepenPlugged_tracepenOrigin

Is this "@" a matrix multiply? If yes, then this equation is correct assuming you have correctly calculated the second term (i.e. the transform inverse of rawPluggedPose.)
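The transform inverse of a rigid pose can be computed in closed form and checked against a generic matrix inverse. A sketch with a hypothetical sample transform, assuming numpy:

```python
import numpy as np

# Sample rigid transform: a 90 degree rotation about z plus a translation.
T = np.eye(4)
T[:3, :3] = np.array([[0, -1, 0],
                      [1, 0, 0],
                      [0, 0, 1]], dtype=float)
T[:3, 3] = [1.0, 2.0, 3.0]

# Analytic inverse of a rigid transform: transpose the rotation,
# then rotate-and-negate the translation.
T_inv = np.eye(4)
T_inv[:3, :3] = T[:3, :3].T
T_inv[:3, 3] = -T[:3, :3].T @ T[:3, 3]

# It agrees with the generic matrix inverse -- no TF round trip needed.
assert np.allclose(T_inv, np.linalg.inv(T))
assert np.allclose(T @ T_inv, np.eye(4))
```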

Mike Scheutzow  ( 2023-05-11 11:05:59 -0500 )

Yes, it is a matrix multiply. I calculated it by publishing tf(tracepenOrigin, tracepenPlugged), then looking up tf(tracepenPlugged, tracepenOrigin), then converting to a matrix. I compared it with directly calculating the inverse; it is identical.

I also think the approach is valid now, but how can it be that after calibration the tracepen frame still moves in the wrong direction?

infraviored  ( 2023-05-11 11:21:31 -0500 )

That sounds correct, although it's completely unnecessary to publish a transform just to get its inverse. See the answer to #q229329.

Please update the code at the top of your post, and also show the code where you convert a tracepen pose to the world frame.

Mike Scheutzow  ( 2023-05-11 11:52:41 -0500 )

You are right, I will change this later on.

What do you mean by "at the top of my post"?

I just prettied up the full code, kicked out everything unnecessary, uploaded it to GitHub, and also appended it to the post.

infraviored  ( 2023-05-11 13:25:23 -0500 )

after moving the socket x+0.2 (tracepen still plugged), the tracepen moves in another direction. It should still be in the socket.

I don't understand this experiment. Why would you expect the transform world->socket to change just because you physically move the socket somewhere else?

Have you verified the orientation returned by tracepenPose() is accurate in the tracepen frame? The quaternion value shuffle you do in there looks questionable to me. I would try to verify this by printing the roll/pitch/yaw euler angles while I physically change the pen orientation.
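One way to do that roll/pitch/yaw check without any ROS machinery is a small stand-alone conversion. A sketch, assuming the standard library; quat_to_rpy is a hypothetical helper using the common ZYX convention on an xyzw quaternion:

```python
import math

def quat_to_rpy(qx, qy, qz, qw):
    """Convert an xyzw quaternion to roll/pitch/yaw (ZYX convention, radians)."""
    roll = math.atan2(2 * (qw * qx + qy * qz), 1 - 2 * (qx * qx + qy * qy))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (qw * qy - qz * qx))))
    yaw = math.atan2(2 * (qw * qz + qx * qy), 1 - 2 * (qy * qy + qz * qz))
    return roll, pitch, yaw

# Physically yaw the pen by ~90 degrees; the printed yaw should follow.
s, c = math.sin(math.pi / 4), math.cos(math.pi / 4)
print(quat_to_rpy(0.0, 0.0, s, c))  # roughly (0, 0, pi/2)
```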

Mike Scheutzow  ( 2023-05-12 07:17:49 -0500 )

Another thought: have you checked that the tracepen frame is a right-handed coordinate frame? If it's not, you must rename the axes so it is, and make the proper adjustments to the reported translation and orientation values.
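The handedness check is mechanical: in a right-handed frame, x cross y must give +z (equivalently, det(R) = +1). A minimal sketch, assuming numpy; is_right_handed is a hypothetical helper:

```python
import numpy as np

def is_right_handed(R, tol=1e-6):
    """A 3x3 frame is right-handed iff x cross y equals +z."""
    x, y, z = R[:, 0], R[:, 1], R[:, 2]
    return np.allclose(np.cross(x, y), z, atol=tol)

assert is_right_handed(np.eye(3))
# Negating a single axis flips the handedness:
assert not is_right_handed(np.diag([1.0, 1.0, -1.0]))
```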

Mike Scheutzow  ( 2023-05-12 07:54:00 -0500 )

Question Tools

2 followers

Stats

Asked: 2023-05-08 14:22:27 -0500

Seen: 349 times

Last updated: May 19 '23