Sensor Frame offset for robot_localization - Absolute odometer offset
EDIT - TL;DR: If I have a single source of absolute odometry providing x, y, z, roll, pitch and yaw (visual odometry), how do I account for the camera's offset from base_link? Simply setting up a base_link -> zed_frame transform (zed_frame is my camera) doesn't work, because the odometry data is published in the odom frame. E.g.: my camera is mounted at the centre of my robot at a 45 degree pitch incline. If I move the robot 10 units in the x direction, my odometry reports an offset of (10·sin 45°, 0, 10·cos 45°), which gets fed to the filter and misplaces my odom -> base_link transform.
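To make the geometry concrete, here is a minimal numpy sketch of what I believe is happening (my own illustration, not code from any of the packages; pitch convention per REP-103):

```python
import numpy as np

# The camera is pitched 45 degrees, so a pure +x motion of base_link
# shows up in the camera's own frame with both x and z components.
pitch = np.deg2rad(45.0)

# Rotation taking vectors from the camera frame into base_link
# (rotation about +y by the mount pitch).
R_base_cam = np.array([
    [ np.cos(pitch), 0.0, np.sin(pitch)],
    [ 0.0,           1.0, 0.0          ],
    [-np.sin(pitch), 0.0, np.cos(pitch)],
])

motion_base = np.array([10.0, 0.0, 0.0])   # robot drives 10 units forward

# What the camera's odometry reports (the same motion expressed in its
# own frame): both non-zero components equal 10/sqrt(2) at 45 degrees.
motion_cam = R_base_cam.T @ motion_base

# Recovering the true base_link motion requires rotating back by the
# mount rotation -- this is the correction I don't know who should apply.
recovered = R_base_cam @ motion_cam
```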
I have tried adding an intermediate frame, zed_initial_frame. In this scenario my visual odometry is published in zed_initial_frame, and a static tf with a 45 degree pitch rotation links odom to zed_initial_frame. Now moving the robot 10 units in x results in an odometry pose of (10, 0, 0); HOWEVER, the base_link frame now gets reported in line with zed_initial_frame, i.e. at a 45 degree pitch. This is because at time t = 0 the odometry reports a zero (identity) Eigen transform, which is then transformed into the odom frame, so the initial odometry reading shows a 45 degree pitch. This is where I possibly need to point the finger at robot_localization, because even setting the odom0_relative parameter to true doesn't fix it. Here is the potentially offending source code; my theory is that with the parameter set to true, the first measurement is zeroed in section 7g and only then transformed in 7h, resulting in a non-zero initial measurement after all, due to the rotation.
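To illustrate my theory with plain rotation matrices (this mirrors my reading of the code, not the actual robot_localization implementation):

```python
import numpy as np

pitch = np.deg2rad(45.0)

# Static odom -> zed_initial_frame rotation (45 deg pitch) as a
# homogeneous 4x4 transform.
Ry = np.array([
    [ np.cos(pitch), 0.0, np.sin(pitch), 0.0],
    [ 0.0,           1.0, 0.0,           0.0],
    [-np.sin(pitch), 0.0, np.cos(pitch), 0.0],
    [ 0.0,           0.0, 0.0,           1.0],
])

# Step "7g": odom0_relative zeroes the first measurement (identity).
first_measurement = np.eye(4)

# Step "7h": the zeroed measurement is THEN transformed into odom.
initial_pose_in_odom = Ry @ first_measurement

# The result is not identity -- it carries the full 45 deg pitch,
# which would explain base_link starting out tilted.
is_identity = np.allclose(initial_pose_in_odom, np.eye(4))  # False
```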
Original Question: Hello, I have been stuck on this for days and am sure I am fundamentally missing something. I have read:
http://www.ros.org/reps/rep-0103.html
http://www.ros.org/reps/rep-0105.html
http://wiki.ros.org/robot_localization
http://answers.ros.org/question/23522...
http://answers.ros.org/question/23919...
http://answers.ros.org/question/23729...
http://answers.ros.org/question/21675...
http://library.isr.ist.utl.pt/docs/ro...
Yet I am unable to produce the results I want. My scenario:
- ZED Stereo Camera node ( https://github.com/stereolabs/zed-ros... ) - Provides pointcloud AND visual odometry
- elevation_mapping node ( https://github.com/ethz-asl/elevation... ) - Outputs a filtered 2.5D cloud
- robot_localization node ( http://wiki.ros.org/robot_localization ) - Filters the visual odometry (apparently) and outputs odom
- A few minor nodes that generate the robot model, convert odometry to pose messages, and so on.
Physically, my setup is the ZED camera mounted on a tripod at a 45 degree downward inclination (pitch = 45°), so that it can see the floor in front of the robot. It sits roughly 1.1 m off the ground.
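For reference, the static transform I would expect to describe this mount boils down to a pure pitch quaternion plus the 1.1 m height (frame names here are my own; quaternion order is x, y, z, w):

```python
import math

# Quaternion for a pure 45 deg pitch (rotation about +y), e.g. for a
# static base_link -> zed_frame transform. For a rotation about a single
# axis, q = (axis * sin(theta/2), cos(theta/2)).
pitch = math.radians(45.0)
qx, qy, qz, qw = 0.0, math.sin(pitch / 2.0), 0.0, math.cos(pitch / 2.0)

x, y, z = 0.0, 0.0, 1.1  # camera roughly 1.1 m above base_link

# Argument string in static_transform_publisher order
# (x y z qx qy qz qw parent child):
args = f"{x} {y} {z} {qx} {qy:.6f} {qz} {qw:.6f} base_link zed_frame"
```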
What I am trying to achieve: get everything to line up in the visualizer in terms of transforms and frames. The ZED outputs all data in the frame zed_tracked_frame, except for the zed_tracked_frame itself, which I have ...
I think I have found the cause of the issue, but am unsure who (which package) should be responsible for correcting it. I will edit my question accordingly.
If my absolute odometry source is offset from my base_link, how, and by which package, should this offset be accounted for?