Depth data from ARKit

Is it possible to access AVDepthData from ARKit? The ARFrame contains the camera's image, but not its depth data.


I tried creating a separate AVCaptureSession to receive the AVDepthData, but the AVCaptureSession cannot run at the same time as the ARKit session.

Replies

Assuming this is a world-tracking session -- no.

ARKit only vends AVDepthData for Face Tracking sessions (via the front camera on iPhone X). It's not available with world tracking.

So, if I start up a Face Tracking session (via front camera on iPhone X), can I get access to the raw AVDepthData for processing within my own code?

Yes (from the front camera). It's a property on ARFrame.
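The property the reply refers to is `ARFrame.capturedDepthData` (an optional `AVDepthData`, available on iOS 11+). A minimal sketch of reading it from a face-tracking session might look like this; the class and method names other than the ARKit API itself are placeholders:

```swift
import ARKit
import UIKit

class DepthViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking requires a TrueDepth front camera (iPhone X or later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is nil on frames where no depth sample was taken;
        // depth arrives at a lower rate than the color video frames.
        guard let depthData = frame.capturedDepthData else { return }
        let depthMap = depthData.depthDataMap  // a CVPixelBuffer
        // ... process depthMap here
        _ = depthMap
    }
}
```

Note that `capturedDepthData` will be nil on many frames, since the TrueDepth camera samples depth less frequently than it captures color images.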

What's the resolution of this depth buffer?

640x360
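Rather than hard-coding that figure, you can query the dimensions of the depth map's pixel buffer at runtime. A sketch, assuming `depthData` was obtained from `ARFrame.capturedDepthData`:

```swift
import AVFoundation
import CoreVideo

// Prints the actual dimensions and pixel format of an AVDepthData's buffer.
func logDepthBufferInfo(_ depthData: AVDepthData) {
    let map = depthData.depthDataMap
    let width = CVPixelBufferGetWidth(map)
    let height = CVPixelBufferGetHeight(map)
    // Format is e.g. kCVPixelFormatType_DepthFloat16 or DisparityFloat16,
    // depending on how the data was delivered.
    let format = CVPixelBufferGetPixelFormatType(map)
    print("depth map: \(width)x\(height), format \(format)")
}
```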

Can I also get depth data for arbitrary objects using the iPhone X front camera?

What is the maximum depth the iPhone X can measure? What is the error at that distance? And at how many fps (frames per second) does it capture depth?