An earlier post about photogrammetry models lacking proper real-world scale includes this response from Apple:
Make sure you are actually reading these files in your code to create the PhotogrammetrySample. If you simply keep the *_depth.tiff files in the input folder, but do not read the depthDataMap while creating the PhotogrammetrySample, then these depth files will not be used.
The sample photogrammetry app does not demonstrate creating a PhotogrammetrySample; it only supplies a folder of images with depth data included. How do we go about "reading the depthDataMap"? To be clear, my 3D models are not accurate to real-world scale, even though I do have depth maps embedded in the HEIC images and also in separate TIFFs.
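For what it's worth, here is a rough sketch (untested, macOS 12+) of what I understand Apple's reply to mean: instead of passing a folder URL to PhotogrammetrySession, you construct each PhotogrammetrySample yourself, loading the *_depth.tiff into a CVPixelBuffer and assigning it to the sample's `depthDataMap` property. The file-pairing convention and the use of CIContext to convert the TIFF into a DepthFloat32 buffer are my assumptions, not anything from the sample app:

```swift
import CoreImage
import CoreVideo
import RealityKit

enum SampleError: Error {
    case cannotLoadImage(URL)
    case cannotCreateBuffer
}

/// Renders a CIImage into a freshly allocated CVPixelBuffer of the given format.
func pixelBuffer(from image: CIImage, format: OSType, context: CIContext) throws -> CVPixelBuffer {
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(nil,
                                     Int(image.extent.width),
                                     Int(image.extent.height),
                                     format, nil, &buffer)
    guard status == kCVReturnSuccess, let buffer else {
        throw SampleError.cannotCreateBuffer
    }
    context.render(image, to: buffer)
    return buffer
}

/// Builds one PhotogrammetrySample from a color image plus its *_depth.tiff.
/// `imageURL` and `depthURL` are placeholder names for the paired files.
func makeSample(id: Int, imageURL: URL, depthURL: URL) throws -> PhotogrammetrySample {
    let context = CIContext()

    guard let colorImage = CIImage(contentsOf: imageURL) else {
        throw SampleError.cannotLoadImage(imageURL)
    }
    guard let depthImage = CIImage(contentsOf: depthURL) else {
        throw SampleError.cannotLoadImage(depthURL)
    }

    let colorBuffer = try pixelBuffer(from: colorImage,
                                      format: kCVPixelFormatType_32BGRA,
                                      context: context)
    // Depth must be a float format; I assume DepthFloat32 here.
    let depthBuffer = try pixelBuffer(from: depthImage,
                                      format: kCVPixelFormatType_DepthFloat32,
                                      context: context)

    var sample = PhotogrammetrySample(id: id, image: colorBuffer)
    // This assignment appears to be the "reading the depthDataMap" step
    // Apple is referring to.
    sample.depthDataMap = depthBuffer
    return sample
}
```

The resulting samples would then be fed to `PhotogrammetrySession(input:configuration:)` as a sequence of PhotogrammetrySample values rather than as a folder URL. I'd welcome confirmation from Apple that this is the intended approach.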