Depth data with ARKit

Hi,


I have an application that creates spherical panoramas using ARKit. We want to improve the quality of the stitching by adding depth data ( https://developer.apple.com/documentation/avfoundation/avdepthdata ) per photo (the app takes 40 photos on an iPhone X for a full sphere).

The problem is that ARKit doesn't expose this depth data directly, so we would also need AVFoundation, but ARKit blocks AVFoundation's access to the camera while its session is running. Is there a workaround?

Replies

You can access the depth data through ARKit's ARFrame by reading its capturedDepthData property - https://developer.apple.com/documentation/arkit/arframe/2928208-captureddepthdata


This provides the same AVDepthData that AVFoundation would, and you can pair it with the ARFrame's capturedImage property to perform any depth-based processing you need.
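As a minimal sketch, the capturedDepthData approach looks like this in an ARSessionDelegate (the class and variable names are illustrative; note that, as the follow-up reply points out, this property is only populated by the front TrueDepth camera in face-tracking sessions):

```swift
import ARKit

final class DepthReceiver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is an AVDepthData?, present only on frames
        // captured by the front TrueDepth camera.
        guard let depth = frame.capturedDepthData else { return }

        let depthMap = depth.depthDataMap  // CVPixelBuffer of depth values
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth map: \(width)x\(height), type: \(depth.depthDataType)")

        // frame.capturedImage (a CVPixelBuffer) holds the matching color
        // image for depth-based processing.
    }
}
```

You would attach the delegate with `session.delegate = DepthReceiver()` while running an ARFaceTrackingConfiguration.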

Hi,


What you sent uses the front-facing TrueDepth camera; I need to get the depth data from the back camera, not the front.

  • var sceneDepth: ARDepthData? — Data on the distance between a device's rear camera and real-world objects in an AR experience.
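A minimal sketch of requesting rear-camera depth via sceneDepth, assuming iOS 14+ and a LiDAR-equipped device (sceneDepth is not available on an iPhone X; the function name is illustrative):

```swift
import ARKit

func startSessionWithSceneDepth(session: ARSession) {
    // sceneDepth requires LiDAR hardware, so check support first.
    guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
        print("sceneDepth is not supported on this device")
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.frameSemantics.insert(.sceneDepth)
    session.run(configuration)
}

// Then, in an ARSessionDelegate:
// func session(_ session: ARSession, didUpdate frame: ARFrame) {
//     guard let depth = frame.sceneDepth else { return }  // ARDepthData
//     let depthMap = depth.depthMap  // CVPixelBuffer of Float32 depths in meters
// }
```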

Have you managed to get depth data (distance) from ARFrame yet?

You mentioned that you are able to create spherical panoramas using ARKit. If you don't mind, can you please explain how you do that, and share some resources on the topic?

That would be a great help. Thanks.