AVCapturePhotoOutput delivers filtered depth data even though filtering is disabled

Hi All,


I am trying to retrieve unfiltered/raw AVDepthData from AVCapturePhotoOutput, although I explicitly set filtering to NO through an AVCapturePhotoSettings object, i.e. photoSettings.depthDataFiltered = NO;


The output after the capture session is a filtered depth map, i.e. one with no NaN values.


The code snippet below is from Apple's AVCam sample code:


I don't know why AVCapturePhotoOutput ignores the no-filter setting.


Appreciate any hint, thanks.


<code snippet>

photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.highResolutionPhotoEnabled = YES;

if ( self.depthDataDeliveryMode == AVCamDepthDataDeliveryModeOn && self.photoOutput.isDepthDataDeliverySupported ) {
    photoSettings.depthDataDeliveryEnabled = YES;
    photoSettings.depthDataFiltered = NO; // <-- filtering explicitly disabled
} else {
    photoSettings.depthDataDeliveryEnabled = NO;
}

:
:
:

self.inProgressPhotoCaptureDelegates[@(photoCaptureDelegate.requestedPhotoSettings.uniqueID)] = photoCaptureDelegate;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];

Replies

Hi.

The behavior you're observing is actually expected behavior for photos taken on the DualCamera. The explanation is long-winded, so get comfortable. 🙂


As explained in WWDC 2017 Session 507 (https://developer.apple.com/videos/play/wwdc2017/507/), the DualCamera registers RGB images from the wide and tele cameras and then calculates disparity between matching features (stereo vision). It takes a lot of horsepower to register wide and tele images and calculate disparities 30 times a second — thus, when using AVCaptureDepthDataOutput to generate streaming depth, the highest resolution available is 320x240. This depth may be generated from sparse key points, so depth maps may have holes.

For stills (AVCapturePhotoOutput), however, the DualCamera is able to take more time and produce a higher quality image (up to 768x576). It uses a different (higher quality, higher latency) algorithm for photos than for DepthDataOutput. The higher quality still disparity generation is an all-or-nothing proposition — it does not produce sparse depth the way the real-time DepthDataOutput algorithm does. Thus, it never has any holes, and there is no difference between filtered and unfiltered.

This behavior is in contrast to the TrueDepth camera, which always has the fast, efficient IR disparity-gathering method at its disposal. The TrueDepth camera uses the same algorithm for both DepthDataOutput and PhotoOutput, and thus _does_ produce noticeable differences when you ask for filtered or unfiltered.

Incidentally, the all-or-nothing aspect of the DualCamera's still disparity calculation is reflected in the API as AVDepthData's depthDataQuality property. If a certain threshold of matching keypoints cannot be detected between the wide and tele images, the depth is tagged as "Low" quality rather than "High". Images with low quality depth are not considered good candidates for Apple's portrait effects in the Photos app.
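From the delegate side, you can see both properties directly on the delivered AVDepthData. A minimal sketch, assuming the standard AVCapturePhotoCaptureDelegate callback (not taken from AVCam — the logging is illustrative):

```objc
#import <AVFoundation/AVFoundation.h>

- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error
{
    AVDepthData *depthData = photo.depthData;
    if (depthData == nil) { return; }

    // On the DualCamera the still algorithm is all-or-nothing, so even with
    // depthDataFiltered = NO the map has no holes; depthDataQuality tells you
    // whether enough keypoints matched between the wide and tele images.
    BOOL filtered = depthData.isDepthDataFiltered;
    AVDepthDataQuality quality = depthData.depthDataQuality;
    NSLog(@"filtered=%d quality=%@", filtered,
          quality == AVDepthDataQualityHigh ? @"High" : @"Low");
}
```

On the TrueDepth camera the same callback will show NaN holes in the map when you request unfiltered depth; on the DualCamera it will not.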