# MATLAB 16UC1 or 32FC1 Conversion

I have some image processing code in MATLAB and am attempting to use it with ROS. I read in a PointCloud2 object and execute `readRGB(ptcloud)` and `readXYZ(ptcloud)`, which give me two MATLAB images, each of size 480x640x3.

**Q1: I assume that a depth image of size NxMx3 gives the X, Y, and Z distances as the three channels. Is this correct?**

I then run my processing and need to send a depth image back, encoded as either 16UC1 or 32FC1. I have not found a built-in MATLAB function for this. I could write a conversion script myself, but I am unfamiliar with these formats.

**Q2: What is the general algorithm to convert my NxMx3 matrix into either format?**
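To make Q2 concrete, here is a minimal NumPy sketch of what I think the conversion would look like: take the Z channel as the depth, then serialize it either as float32 meters (32FC1) or as uint16 millimeters (16UC1). The 1000 mm/m scale factor and the variable names are my assumptions, not something I have confirmed:

```python
import numpy as np

# Hypothetical 480x640x3 XYZ array in meters, as readXYZ might return.
xyz = np.random.rand(480, 640, 3).astype(np.float32)

# Depth is just the Z channel (distance along the optical axis).
depth_m = xyz[:, :, 2]

# 32FC1: one float32 per pixel, depth in meters.
depth_32fc1 = depth_m.astype(np.float32)
data_32fc1 = depth_32fc1.tobytes()  # 480*640*4 = 1228800 bytes

# 16UC1: one uint16 per pixel, depth in millimeters (assumed scale).
depth_16uc1 = np.round(depth_m * 1000.0).astype(np.uint16)
data_16uc1 = depth_16uc1.tobytes()  # 480*640*2 = 614400 bytes

print(len(data_32fc1), len(data_16uc1))
```

Is this roughly the right idea, or is there more to the encoding than a per-pixel scale and dtype change?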

My depth camera node publishes a depth image in the 16UC1 format. When I subscribe to that topic in MATLAB and execute `readImage(ros_image)`, I get an image of size 480x640x1. But when I execute `rostopic echo /ros_depth_image_topic`, I see the data given as `data: array: type uint8, length: 614400`.

**Q3: I notice this is 480 × 640 × 2, so why does MATLAB only read in a 480x640x1 image? How do the 614400 data points translate to only 307200 data points in MATLAB?**
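To show the reinterpretation I suspect is happening in Q3, here is a NumPy sketch that views pairs of raw bytes as single 16-bit pixels. The little-endian byte order is my assumption (I believe it corresponds to `is_bigendian: 0` in the message header):

```python
import numpy as np

# 614400 raw bytes, as echoed from the topic (dummy data here).
raw = np.zeros(614400, dtype=np.uint8)
raw[0], raw[1] = 0x34, 0x12  # first pixel, little-endian byte order

# Reinterpret each pair of uint8 values as one little-endian uint16 pixel.
depth = raw.view('<u2').reshape(480, 640)

print(depth.shape)       # (480, 640) -> 307200 values
print(hex(depth[0, 0]))  # 0x1234
```

If that is right, the 614400 uint8 values and MATLAB's 307200 pixels would be the same data, just grouped differently.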

Thank you for any assistance offered!

================================================================================

Update 1:
I went back over REP 118 -- Depth Images and understand that the distance in a depth image is measured along the camera's z-axis. Why, then, would MATLAB's `readXYZ(ptcloud)` function give me a 3-channel image when a depth image is only 1 channel?
Also: 16UC1 is stated here to support int16, but I believe ROS wants uint16. When I get 2x the data represented as uint8, does that mean two 8-bit bytes are concatenated into one 16-bit value? In other words, is each pixel represented as two uint8 data points in 16UC1?
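On the int16-vs-uint16 point, my understanding is that the raw bytes are identical either way; the interpretation only differs once the top bit is set, i.e., for depths beyond 32767 mm under a millimeter encoding. A sketch of that (little-endian byte order assumed):

```python
import numpy as np

# The two bytes of a single pixel whose top bit is set.
raw = np.array([0x00, 0x80], dtype=np.uint8)

# Same bytes, read as unsigned vs signed 16-bit (little-endian).
as_unsigned = raw.view('<u2')[0]  # 32768
as_signed = raw.view('<i2')[0]    # -32768

print(as_unsigned, as_signed)
```

So if my camera never reports depths past ~32.7 m, would the signed/unsigned distinction even matter in practice?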