Is it possible to access the AVDepthData from ARKit? The ARFrame contains the camera's image, but not its depth.
I tried to create a separate AVCaptureSession to receive the AVDepthData, but I am unable to run the AVCaptureSession at the same time as ARKit.
Assuming this is a world-tracking session: no.
ARKit only vends AVDepthData for face-tracking sessions (via the TrueDepth front camera on iPhone X). It's not available with world tracking.
So, if I start up a Face Tracking session (via front camera on iPhone X), can I get access to the raw AVDepthData for processing within my own code?
Yes (from the front camera). It's exposed as the `capturedDepthData` property on `ARFrame`.
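A minimal, untested sketch of reading the raw depth from each face-tracking frame (assumes iOS 11+ and a TrueDepth-equipped device; `DepthReceiver` and `processDepth` are names invented for this example):

```swift
import ARKit

final class DepthReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        // Face tracking requires the TrueDepth front camera (e.g. iPhone X).
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is nil on frames with no depth sample:
        // depth is captured at a lower rate than the color image.
        guard let depth = frame.capturedDepthData else { return }
        processDepth(depth.depthDataMap)  // CVPixelBuffer of depth values
    }

    func processDepth(_ buffer: CVPixelBuffer) {
        // Hypothetical hook for your own processing; here it just
        // inspects the buffer's resolution.
        let width = CVPixelBufferGetWidth(buffer)
        let height = CVPixelBufferGetHeight(buffer)
        print("depth buffer: \(width)x\(height)")
    }
}
```

Note that `capturedDepthData` will be `nil` on many frames, so guard for it rather than assuming depth arrives with every `ARFrame`.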
What's the resolution of this depth buffer?