
Post marked as solved
The iPad Pro 2020 includes a LiDAR sensor that promises precise distance measurements in the form of "depth points" [Apple Newsroom press release, 2020-03-18: "Apple unveils new iPad Pro with breakthrough LiDAR Scanner and brings trackpad support to iPadOS"]. I am looking for direct, real-time access to the depth data ("depth points") from the LiDAR sensor at full frame rate, in the same way that depth data is provided by the TrueDepth camera on the front of the iPad Pro and other iOS and iPadOS devices. However, I have not been able to find an efficient way to retrieve the LiDAR depth data, so I am asking for clarification: is the LiDAR depth data directly available, and if so, how can it be accessed?

Apple Developer Resources for LiDAR

The sample code "Visualizing and Interacting with a Reconstructed Scene" appears to be the only developer resource currently provided for working with the LiDAR sensor. In that VisualizingSceneSemantics example, the session delegate callback in the ViewController provides access to the current ARFrame; the ARFrame is also accessible through the currentFrame property of the ARSession (Sketch 1 at the end of this post).

An ARFrame exposes an AVDepthData property, capturedDepthData, but this depth data appears to always be nil, or to be limited to specific scenarios; it is not consistently available.

The raycast function on ARView can be used to find the world position for an x/y coordinate in the frame, but raycasting appears to be computationally expensive and is not practical for more than a few points per frame (Sketch 2).

In addition, the color frame from the ARFrame is not in the color format I need, so an expensive conversion is required on every frame (Sketch 3).

AVFoundation - AVCaptureDevice

TrueDepthCamera, DualWideCamera, TimeOfFlightCamera? AVFoundation exposes APIs for handling camera color and depth data. The sample code "Streaming Depth Data from the TrueDepth Camera" provides an example and demonstration of the depth data available from the TrueDepth camera and the dual wide camera, but the LiDAR sensor's depth data is notably missing from the AVFoundation capture APIs. Strangely, that sample works on the iPad Pro 2020 with the front camera, but when the AVCaptureDevice is switched to the back dual wide camera it no longer offers depth data; the same code does provide depth data from the dual wide camera on an iPhone 11 (Sketch 4).

Interestingly, the debug log of "Visualizing and Interacting with a Reconstructed Scene" reveals the following: "ARDepthSensor (AVCaptureDeviceTypeBuiltInTimeOfFlightCamera - Back)". The form of AVCaptureDeviceTypeBuiltInTimeOfFlightCamera matches that of the other AVCaptureDevice type names, AVCaptureDeviceTypeBuiltInDualWideCamera and AVCaptureDeviceTypeBuiltInTrueDepthCamera. However, AVCaptureDevice.DeviceType does not appear to contain a builtInTimeOfFlightCamera case as of iPadOS 13.5.

Conclusion

Ideally, AVCaptureDevice.DeviceType would expose a builtInTimeOfFlightCamera that provides access to the LiDAR depth data in the same way as the TrueDepth and dual wide cameras. Is this being considered, or is there a better method?
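
For reference, here are minimal sketches of the approaches described above.

Sketch 1: the ARSession delegate access pattern, reduced to the part that checks capturedDepthData. This is my own reduction of the VisualizingSceneSemantics-style setup, not the sample code verbatim:

```swift
import UIKit
import RealityKit
import ARKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.session.delegate = self

        // World-tracking configuration with scene reconstruction,
        // as in the VisualizingSceneSemantics sample (LiDAR devices only).
        let configuration = ARWorldTrackingConfiguration()
        configuration.sceneReconstruction = .mesh
        arView.session.run(configuration)
    }

    // Delegate callback: called once per frame with the latest ARFrame.
    // The same frame is also available from arView.session.currentFrame.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // capturedDepthData is declared as AVDepthData?, but in this
        // rear-camera configuration it has been nil on every frame.
        if let depth = frame.capturedDepthData {
            let map = depth.depthDataMap
            print("depth map: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
        } else {
            print("capturedDepthData is nil for this frame")
        }
    }
}
```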
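Sketch 2: the per-point raycast workaround. Only ARView.raycast(from:allowing:alignment:) is actual API; the helper name is mine:

```swift
import UIKit
import RealityKit
import ARKit

// Recover a world position for a single screen point. Doing this for
// more than a handful of points per frame is what proved too expensive.
func worldPosition(at screenPoint: CGPoint, in arView: ARView) -> SIMD3<Float>? {
    let results = arView.raycast(from: screenPoint,
                                 allowing: .estimatedPlane,
                                 alignment: .any)
    guard let hit = results.first else { return nil }

    // The translation column of the hit transform is the world position.
    let t = hit.worldTransform.columns.3
    return SIMD3<Float>(t.x, t.y, t.z)
}
```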
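Sketch 3: the per-frame color conversion, assuming a Core Image based approach. ARFrame.capturedImage arrives as a bi-planar YCbCr CVPixelBuffer, and producing the BGRA buffer my pipeline expects costs a full-resolution render on every frame. The function name and the reusable context are mine:

```swift
import CoreImage
import CoreVideo

// Reused across frames to avoid recreating GPU state on every conversion.
let ciContext = CIContext()

func convertToBGRA(_ capturedImage: CVPixelBuffer) -> CVPixelBuffer? {
    let width = CVPixelBufferGetWidth(capturedImage)
    let height = CVPixelBufferGetHeight(capturedImage)

    var output: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, nil, &output)
    guard let bgraBuffer = output else { return nil }

    // Core Image performs the YCbCr -> BGRA conversion, at the cost of
    // a full-resolution render per frame.
    ciContext.render(CIImage(cvPixelBuffer: capturedImage), to: bgraBuffer)
    return bgraBuffer
}
```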
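Sketch 4: a condensed, approximate version of the depth pipeline from the "Streaming Depth Data from the TrueDepth Camera" sample, adapted to request the rear dual wide camera. This is roughly what I would like to be able to point at a builtInTimeOfFlightCamera device type, if one existed:

```swift
import AVFoundation

// On an iPhone 11 this configuration yields depth data; on the 2020
// iPad Pro the rear cameras report no depth support in my testing,
// and there is no .builtInTimeOfFlightCamera device type to request
// instead (as of iPadOS 13.5).
func makeDepthCaptureSession() -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.beginConfiguration()

    // Use .builtInTrueDepthCamera / .front for the working front-camera case.
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else {
        return nil
    }
    session.addInput(input)

    let depthOutput = AVCaptureDepthDataOutput()
    guard session.canAddOutput(depthOutput) else { return nil }
    session.addOutput(depthOutput)
    depthOutput.isFilteringEnabled = true
    // A delegate would be attached here via setDelegate(_:callbackQueue:)
    // to receive AVDepthData for each frame.

    // Empty for the iPad Pro's rear cameras in my testing.
    let depthFormats = device.activeFormat.supportedDepthDataFormats
    print("supported depth formats: \(depthFormats)")

    session.commitConfiguration()
    return session
}
```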
Posted by ngingric.