# Revision history

### Visp_auto_tracker: how to create a TF

Hi, I want to use visp_auto_tracker with 2 cameras so that we get better positioning. But what I don't understand is how to create a TF for this. Can someone help me out with writing a TF for it? For example: https://www.youtube.com/watch?v=0cQXU5X0Yyk, but then with ViSP; I don't understand where he is creating his TF.
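For reference, this is roughly what "creating a TF" looks like in code on ROS1: a small rospy node that re-broadcasts a tracker's pose as a transform. This is only an untested sketch; it assumes visp_auto_tracker publishes the object pose as a `geometry_msgs/PoseStamped` on `/visp_auto_tracker/object_position`, and the frame names `front_camera` and `object` are placeholders you would replace with your own.

```python
#!/usr/bin/env python
# Sketch (assumptions: ROS1 with rospy + tf; topic name and frame
# names "front_camera" / "object" are placeholders, not confirmed).
import rospy
import tf
from geometry_msgs.msg import PoseStamped

def on_pose(msg, broadcaster):
    # Re-publish the tracker's pose as a TF: camera frame -> object frame.
    p, q = msg.pose.position, msg.pose.orientation
    broadcaster.sendTransform((p.x, p.y, p.z),
                              (q.x, q.y, q.z, q.w),
                              msg.header.stamp,
                              'object',        # child frame
                              'front_camera')  # parent frame

def main():
    rospy.init_node('object_tf_broadcaster')
    br = tf.TransformBroadcaster()
    rospy.Subscriber('/visp_auto_tracker/object_position', PoseStamped,
                     on_pose, callback_args=br)
    rospy.spin()

if __name__ == '__main__':
    main()
```

With a node like this running per camera (each with its own parent frame), RViz can resolve the object relative to either camera through TF.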

GitHub: StereoColorTracking

Comment on jayess:

So what I've tried: to understand TF, I read some tutorials. I looked into the GitHub repo of the example I showed, but I don't understand where he is using a TF function. There is also a localization_tf in visp_tracker, but it is w.r.t. the map and not the camera. I also found a package called external_camera_tf, but it won't work on Indigo. I'm very new to ROS.

So I created a new launch file that starts visp_auto_tracker with 2 cams. As you can see, I made 2 static transforms for the 2 webcams. But when I go to RViz, the object moves w.r.t. both the map and the cameras...

```xml
<!-- -*- xml -*-
     This launch file launches visp_auto_tracker with 2 cameras.
     Created and used by Fontys Hogescholen Engineering project group 10 S7.
-->
<launch>
  <!-- Camera transforms -->
  <node pkg="tf" type="static_transform_publisher" name="front_cam_tf"
        args="0 0 1.0 3.14159265359 3.14159265359 1.570796326795 /map /front_camera 100"/>
  <node pkg="tf" type="static_transform_publisher" name="top_cam_tf"
        args="0.20 0 1.0 3.14159265359 3.14159265359 1.570796326795 /map /top_camera 100"/>

  <!-- Second, launch the localization node. Parameters define the object position w.r.t. the world frame -->
  <node pkg="visp_tracker" type="tf_localization.py" name="tf_localization">
    <param name="object_translation_x" value="1.8442106300000001" />
    <param name="object_translation_y" value="-0.0083684399999999996" />
    <param name="object_translation_z" value="0.52310595000000004" />

    <param name="object_translation_qx" value="0.04655744" />
    <param name="object_translation_qy" value="-0.12974845" />
    <param name="object_translation_qz" value="-0.45632887" />
    <param name="object_translation_qw" value="0.87906866" />
  </node>

  <!-- Launch the "front" tracking node -->
  <node pkg="visp_auto_tracker" type="visp_auto_tracker" name="visp_auto_tracker1" output="screen">
    <param name="model_path" value="$(find visp_auto_tracker)/models" />
    <param name="model_name" value="pattern" />
    <param name="debug_display" value="True" />
    <remap from="/visp_auto_tracker1/camera_info" to="/usb_cam1/camera_info"/>
    <remap from="/visp_auto_tracker1/image_raw" to="/usb_cam1/image_raw"/>
  </node>

  <!-- Launch the "top" tracking node -->
  <node pkg="visp_auto_tracker" type="visp_auto_tracker" name="visp_auto_tracker2" output="screen">
    <param name="model_path" value="$(find visp_auto_tracker)/models" />
    <param name="model_name" value="pattern" />
    <param name="debug_display" value="True" />
    <remap from="/visp_auto_tracker2/camera_info" to="/usb_cam2/camera_info"/>
    <remap from="/visp_auto_tracker2/image_raw" to="/usb_cam2/image_raw"/>
  </node>

  <!-- Launch the "front" usb camera acquisition node -->
  <node pkg="usb_cam" type="usb_cam_node" name="usb_cam1" output="screen">
    <param name="image_width" value="640" />
    <param name="image_height" value="480" />
    <param name="video_device" value="/dev/video2" />
    <param name="pixel_format" value="yuyv" />
    <param name="auto_focus" type="bool" value="False" />
    <param name="camera_name" value="/camera/image_raw" />
    <param name="camera_info_url" value="package://visp_auto_tracker/models/calibration_top.ini" type="string" />
  </node>

  <!-- Launch the "top" usb camera acquisition node -->
  <node pkg="usb_cam" type="usb_cam_node" name="usb_cam2" output="screen">
    <param name="image_width" value="640" />
    <param name="image_height" value="480" />
    <param name="video_device" value="/dev/video1" />
    <param name="pixel_format" value="yuyv" />
    <param name="auto_focus" type="bool" value="False" />
    <param name="camera_name" value="/camera/image_raw" />
    <param name="camera_info_url" value="package://visp_auto_tracker/models/calibration_front.ini" type="string" />
  </node>
</launch>
```
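One possible explanation for the object jumping around in RViz (a guess, not a confirmed diagnosis): each tracker reports the object pose in its own camera frame, so the map-frame pose only comes out consistent if the static map→camera transform is composed with the tracker's camera→object pose. The `static_transform_publisher` args above use the order `x y z yaw pitch roll` (radians). A minimal numpy sketch of that composition, with made-up numbers in place of real tracker output:

```python
import numpy as np

def transform(x, y, z, yaw, pitch, roll):
    """4x4 homogeneous transform from x y z yaw pitch roll
    (the same argument order tf's static_transform_publisher uses)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])   # yaw about z
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pitch about y
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])   # roll about x
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# map -> front_camera: the static transform (numbers made up for illustration)
T_map_cam = transform(0.0, 0.0, 1.0, 0.0, 0.0, 0.0)
# front_camera -> object: what a tracker would report (also made up)
T_cam_obj = transform(0.5, 0.2, 0.0, 0.0, 0.0, 0.0)
# map -> object: compose the two, exactly as the TF tree does
T_map_obj = T_map_cam @ T_cam_obj
print(T_map_obj[:3, 3])  # object position in the map frame
```

If each tracker's output is broadcast as a TF child of its own camera frame, TF performs this chaining automatically and the object should stay put in the map frame (up to calibration error between the two cameras).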


RViz screenshot: https://imgur.com/a/37fkn