Depth Scale -- Photogrammetry

An earlier post on photogrammetry models not having proper real-world scale includes a response from Apple that says:

Make sure you are actually reading these files in your code to create the PhotogrammetrySample. If you simply keep the *_depth.tiff files in the input folder, but do not read the depthDataMap while creating the PhotogrammetrySample, then these depth files will not be used.

The sample photogrammetry app does not demonstrate creating a PhotogrammetrySample, only supplying a folder of images with depth data included. How do we go about "reading the depthDataMap"? To be clear, my 3D models are not accurate to real-world scale, although I do have depth maps in the HEIC images and also in separate TIFFs.

Are you using the [CaptureSample](https://developer.apple.com/documentation/realitykit/taking_pictures_for_3d_object_capture) app to capture this HEIC data? The depth data is embedded in the *.HEIC files, and you do not need to read the depth data explicitly.

If HEIC files captured with the CaptureSample app are producing models with incorrect scale, can you share an example HEIC data bundle, along with the scale of the object?

If you're not using the CaptureSample app and are generating the .tiff files yourself, you can refer to the following documentation on how to use them in a PhotogrammetrySample: PhotogrammetrySample, depthDataMap.
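For reference, here is a minimal sketch of what "reading the depthDataMap" can look like when building samples by hand. This is an illustrative assumption, not code from the sample app: it uses Core Image to load each color image and its matching `*_depth.tiff` into `CVPixelBuffer`s (assuming the TIFF is a single-channel float image that can be rendered into a `DepthFloat32` buffer with metric values preserved), attaches the depth buffer to `PhotogrammetrySample.depthDataMap`, and hands the samples to a `PhotogrammetrySession` instead of a folder URL. The URL pairing and placeholder paths are hypothetical.

```swift
import RealityKit
import CoreImage
import CoreVideo
import Foundation

// Hypothetical helper: render an image file into a CVPixelBuffer of the
// requested pixel format using Core Image.
func pixelBuffer(from url: URL, format: OSType) -> CVPixelBuffer? {
    guard let ciImage = CIImage(contentsOf: url) else { return nil }
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(ciImage.extent.width),
                                     Int(ciImage.extent.height),
                                     format, nil, &buffer)
    guard status == kCVReturnSuccess, let result = buffer else { return nil }
    CIContext().render(ciImage, to: result)
    return result
}

// Build one PhotogrammetrySample per image, attaching the depth map explicitly.
// The imageURLs/depthURLs arrays are assumed to be paired by index.
func makeSamples(imageURLs: [URL], depthURLs: [URL]) -> [PhotogrammetrySample] {
    var samples: [PhotogrammetrySample] = []
    for (index, (imageURL, depthURL)) in zip(imageURLs, depthURLs).enumerated() {
        guard let color = pixelBuffer(from: imageURL, format: kCVPixelFormatType_32BGRA)
        else { continue }
        var sample = PhotogrammetrySample(id: index, image: color)
        // This is the step the response above refers to: if depthDataMap is
        // never set, the *_depth.tiff files are simply ignored and the
        // reconstruction has no real-world scale reference.
        sample.depthDataMap = pixelBuffer(from: depthURL,
                                          format: kCVPixelFormatType_DepthFloat32)
        samples.append(sample)
    }
    return samples
}

// Usage sketch with placeholder paths: feed the samples to the session
// directly instead of pointing it at an input folder.
let imageURLs = [URL(fileURLWithPath: "/captures/IMG_0001.HEIC")]
let depthURLs = [URL(fileURLWithPath: "/captures/IMG_0001_depth.TIF")]
do {
    let samples = makeSamples(imageURLs: imageURLs, depthURLs: depthURLs)
    let session = try PhotogrammetrySession(input: samples,
                                            configuration: PhotogrammetrySession.Configuration())
    // ... request a model file and consume session.outputs as usual ...
    _ = session
} catch {
    print("Failed to create session: \(error)")
}
```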

Yes, that is how I am capturing the images. I think I have figured out my issue. I am taking the photos under water, and I believe the depth data is somewhat off due to the refractive difference between water and air. I assume that in Portrait Mode two lenses are used to compute a depth map, and the math is a little off for underwater use.
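If refraction really is the culprit, a rough back-of-the-envelope check: water's refractive index is about 1.33, so a stereo system calibrated in air and looking through a flat port will see objects as roughly 25% closer than they are, and the depth values (and therefore the model scale) come out correspondingly short. A purely speculative, first-order correction (not anything Apple provides) would be to scale the depth values back up before building the samples; it ignores the exact port and housing geometry.

```swift
import CoreVideo

// Hypothetical first-order correction: multiply every depth value by the
// refractive index of water (~1.33) to undo the apparent shortening of
// distances measured under water. This is a rough guess, not a calibrated fix.
func rescaleDepth(_ depth: CVPixelBuffer, by factor: Float = 1.33) {
    CVPixelBufferLockBaseAddress(depth, [])
    defer { CVPixelBufferUnlockBaseAddress(depth, []) }

    guard let base = CVPixelBufferGetBaseAddress(depth) else { return }
    let width = CVPixelBufferGetWidth(depth)
    let height = CVPixelBufferGetHeight(depth)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(depth)

    for row in 0..<height {
        // Assumes a kCVPixelFormatType_DepthFloat32 buffer (one Float32 per pixel).
        let rowPointer = (base + row * bytesPerRow).assumingMemoryBound(to: Float32.self)
        for column in 0..<width {
            rowPointer[column] *= factor
        }
    }
}
```

Since the whole model simply comes out at the wrong scale, an easier alternative may be to leave the depth data alone and scale the finished mesh by the same factor in a 3D tool afterwards.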
