Is it possible to access the raw wide angle camera images used in LiDAR depth maps?

I am trying to use ARKit to create a 3D .ply map of a room with LiDAR, similar to how apps like 3d Scanner App do. However, I would also like to access the wide angle image frames that are captured throughout the process. I've read on various forums that this is not possible because Apple has locked those images away from developers; they are only used internally by the algorithm that creates the depth map.

Is it possible to access these images, and if not, is there a reason?
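
For reference, here is roughly the kind of session I'm running today (a minimal sketch, error handling omitted). As far as I can tell, each ARFrame only exposes the wide angle color image (capturedImage) and the fused depth map, with no property for the ultra-wide frames:

import ARKit

// World-tracking session with scene depth enabled; each ARFrame delivers
// the wide angle color image plus the LiDAR-derived depth map.
final class ScanSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // The wide angle camera image used to build the depth map...
        let colorImage: CVPixelBuffer = frame.capturedImage
        // ...and the depth map itself; the ultra-wide frames are not exposed.
        let depthMap: CVPixelBuffer? = frame.sceneDepth?.depthMap
        _ = (colorImage, depthMap)
    }
}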

I'm assuming by "wide angle image frames" you are actually referring to the images from the ultra-wide lens.

If that is the case, you are correct that you cannot access that image in an ARSession, so I recommend that you file an enhancement request using Feedback Assistant.

That being said, the LiDAR Scanner was recently made available in AVCapture via the builtInLiDARDepthCamera device type. Through AVCapture, you can set up an AVCaptureMultiCamSession configured to give you access to both the builtInLiDARDepthCamera (which provides a wide angle image plus LiDAR depth) and the builtInUltraWideCamera. If you don't require any of the tracking features that ARKit provides, this is likely a better path to take anyway!
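
Here is a minimal sketch of that AVCapture setup, assuming iOS 15.4+ (where .builtInLiDARDepthCamera became available) and a device that supports multi-cam capture. The class and error names are placeholders, and format selection and delegate wiring are left out:

import AVFoundation

// Sketch: one AVCaptureMultiCamSession that wires up the LiDAR depth camera
// (wide angle video + LiDAR depth) alongside the ultra-wide camera.
final class LiDARUltraWideCapture {
    enum CaptureError: Error {
        case multiCamUnsupported, cameraUnavailable, portUnavailable
    }

    let session = AVCaptureMultiCamSession()
    let lidarVideoOutput = AVCaptureVideoDataOutput()
    let lidarDepthOutput = AVCaptureDepthDataOutput()
    let ultraWideVideoOutput = AVCaptureVideoDataOutput()

    func configure() throws {
        guard AVCaptureMultiCamSession.isMultiCamSupported else {
            throw CaptureError.multiCamUnsupported
        }

        // The LiDAR depth camera provides the wide angle RGB stream plus depth.
        guard let lidarDevice = AVCaptureDevice.default(.builtInLiDARDepthCamera,
                                                        for: .video, position: .back),
              let ultraWideDevice = AVCaptureDevice.default(.builtInUltraWideCamera,
                                                            for: .video, position: .back)
        else { throw CaptureError.cameraUnavailable }

        let lidarInput = try AVCaptureDeviceInput(device: lidarDevice)
        let ultraWideInput = try AVCaptureDeviceInput(device: ultraWideDevice)

        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // Multi-cam sessions want explicit wiring, so add inputs and outputs
        // without implicit connections, then connect the ports by hand.
        session.addInputWithNoConnections(lidarInput)
        session.addInputWithNoConnections(ultraWideInput)
        session.addOutputWithNoConnections(lidarVideoOutput)
        session.addOutputWithNoConnections(lidarDepthOutput)
        session.addOutputWithNoConnections(ultraWideVideoOutput)

        guard let lidarVideoPort = lidarInput.ports.first(where: { $0.mediaType == .video }),
              let lidarDepthPort = lidarInput.ports.first(where: { $0.mediaType == .depthData }),
              let ultraWidePort = ultraWideInput.ports.first(where: { $0.mediaType == .video })
        else { throw CaptureError.portUnavailable }

        session.addConnection(AVCaptureConnection(inputPorts: [lidarVideoPort],
                                                  output: lidarVideoOutput))
        session.addConnection(AVCaptureConnection(inputPorts: [lidarDepthPort],
                                                  output: lidarDepthOutput))
        session.addConnection(AVCaptureConnection(inputPorts: [ultraWidePort],
                                                  output: ultraWideVideoOutput))
        // Note: you may also need to select compatible activeFormat /
        // activeDepthDataFormat values on lidarDevice before starting the session.
    }
}

After configure() succeeds, set sample buffer and depth data delegates on the three outputs and call session.startRunning() on a background queue.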

What about the intrinsic distance limitations of the LiDAR used in the iPhone/iPad? It is not very useful for any outdoor capture at a distance. When will the LiDAR use tunable LEDs so we can use a green beam?
