Dual delivery with empty calibration data

Hi


I'm experimenting with dual photo delivery on my iPhone 8 Plus.


I've watched the WWDC videos, and it seems that the iPhone 8 Plus can only provide two synchronized frames in photo mode, via AVCapturePhotoOutput.


I managed to get the two frames from the telephoto and wide cameras. For each frame, I'm interested in getting the intrinsic matrix.

I'm reading the intrinsic matrix in the delegate method captureOutput:didFinishProcessingPhoto:error:.


The AVCapturePhoto has a depthData property, which contains the cameraCalibrationData.
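Roughly, this is what I'm doing in the delegate (a simplified sketch; error handling omitted):

    // Sketch of my delegate callback (simplified).
    - (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
        // The calibration is attached to the photo's depth data.
        AVCameraCalibrationData *calibration = photo.depthData.cameraCalibrationData;
        if (calibration != nil) {
            matrix_float3x3 intrinsics = calibration.intrinsicMatrix;
            NSLog(@"fx=%f fy=%f", intrinsics.columns[0][0], intrinsics.columns[1][1]);
        } else {
            NSLog(@"No depthData / cameraCalibrationData on this photo");
        }
    }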

For the first frame, the depthData and the cameraCalibrationData are available.

For the second frame (the wide camera), there is no depthData available (it is nil).


Is this a bug?

How can I get the intrinsic matrix for the second frame?

Replies

Not a bug. Depth data is only available for the intersection of the pixels seen by the two cameras being used to create stereo depth; in your case, the wide and the tele. We only have depth information for the tele field of view, so we only deliver a depthData with the tele image.


You can get intrinsics for both images if you opt in to cameraCalibrationDataDeliveryEnabled, in which case the cameraCalibrationData is delivered as a top-level property of the AVCapturePhoto, and it is delivered for both photos in a virtualDeviceConstituentPhotoDelivery capture.
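For illustration, a rough sketch of that opt-in (assuming photoOutput is an AVCapturePhotoOutput already added to the session, device is the virtual dual camera, and photoDelegate is your capture delegate; error handling omitted):

    // Opt in on the output once, during session configuration.
    photoOutput.virtualDeviceConstituentPhotoDeliveryEnabled =
        photoOutput.virtualDeviceConstituentPhotoDeliverySupported;

    // Then request calibration per capture on the settings.
    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
    settings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = device.constituentDevices;
    settings.cameraCalibrationDataDeliveryEnabled =
        photoOutput.cameraCalibrationDataDeliverySupported;
    [photoOutput capturePhotoWithSettings:settings delegate:photoDelegate];

    // In captureOutput:didFinishProcessingPhoto:error:, each constituent photo
    // now carries its own calibration as a top-level property:
    //     AVCameraCalibrationData *calib = photo.cameraCalibrationData;
    //     // calib.intrinsicMatrix, calib.extrinsicMatrix, calib.lensDistortionLookupTable, ...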


Note that the properties named "dualXXXDelivery" have been deprecated, with replacements provided under the virtualDeviceConstituentPhotoDelivery naming. This reflects the reality that we now have virtual devices made up of more than just two physical cameras. The Triple Camera on the iPhone 11 Pro and iPhone 11 Pro Max consists of tele, wide, and ultra wide cameras.

Thanks for answering me.


I did what you advised, and it worked 🙂. I get the results I want directly in the cameraCalibrationData property of AVCapturePhoto for both pictures. But I noticed something strange.


What I'm trying to do here is simply to take synchronized pictures with the associated intrinsics, distortion, and extrinsics for each frame.

So I'm only interested in the cameraCalibrationData property of AVCapturePhoto.


If I don't set the depthDataDeliveryEnabled property of AVCapturePhotoOutput to YES, I don't get valid cameraCalibrationData on the AVCapturePhoto (it is nil). I assume that when this property is set to YES, the phone has to do some additional processing. I don't want to waste resources on data I don't need, since I have to do my own processing once I get the two frames. Or maybe it's my code.


Here is the code where I configure the AVCapturePhotoOutput:


    AVCapturePhotoOutput* photoOutput = [[AVCapturePhotoOutput alloc] init];
    if ([self.session canAddOutput:photoOutput]) {
        [self.session addOutput:photoOutput];
        self.photoOutput = photoOutput;

        self.photoOutput.highResolutionCaptureEnabled = YES;
        self.photoOutput.livePhotoCaptureEnabled = NO;

        // Opt in to constituent photo delivery when the device is a virtual
        // (multi-camera) device such as the dual camera.
        if ([videoDevice isVirtualDevice]) {
            self.photoOutput.virtualDeviceConstituentPhotoDeliveryEnabled = YES;
        }

        // Depth delivery has to be enabled on the output before it can be
        // requested in the per-capture settings.
        BOOL depthDataDeliverySupported = self.photoOutput.depthDataDeliverySupported;
        self.photoOutput.depthDataDeliveryEnabled = depthDataDeliverySupported;
        NSLog(@"depthDataDeliverySupported %d", depthDataDeliverySupported);
        NSLog(@"depthDataDeliveryEnabled %d", self.photoOutput.depthDataDeliveryEnabled);

        BOOL cameraCalibrationDataDeliverySupported = self.photoOutput.cameraCalibrationDataDeliverySupported;
        NSLog(@"cameraCalibrationDataDeliverySupported %d", cameraCalibrationDataDeliverySupported);
    }


and here is where I configure the AVCapturePhotoSettings:


    NSDictionary *dict = @{
        (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA],
    };
    AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettingsWithFormat:dict];
    AVCaptureDevice *device = self.videoDeviceInput.device;
    if ([device isVirtualDevice]) {
        // We have several constituent cameras: request one photo per constituent device.
        photoSettings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = [device constituentDevices];
        photoSettings.depthDataDeliveryEnabled = self.photoOutput.isDepthDataDeliveryEnabled;
        NSLog(@"Virtual %@ %d",
              photoSettings.virtualDeviceConstituentPhotoDeliveryEnabledDevices,
              photoSettings.depthDataDeliveryEnabled);
    }

    photoSettings.cameraCalibrationDataDeliveryEnabled = self.photoOutput.cameraCalibrationDataDeliverySupported;
    NSLog(@"cameraCalibrationDataDeliveryEnabled %d", photoSettings.cameraCalibrationDataDeliveryEnabled);

    if (self.videoDeviceInput.device.isFlashAvailable) {
        photoSettings.flashMode = AVCaptureFlashModeOff;
    }
    photoSettings.highResolutionPhotoEnabled = YES;
    if (photoSettings.availablePreviewPhotoPixelFormatTypes.count > 0) {
        photoSettings.previewPhotoFormat = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : photoSettings.availablePreviewPhotoPixelFormatTypes.firstObject };
    }

    //photoSettings.photoQualityPrioritization = AVCapturePhotoQualityPrioritizationSpeed;
    [self.photoOutput capturePhotoWithSettings:photoSettings delegate:self.capturePhotoDelegate];

Can you please tell me whether I've done something wrong, or whether the system requires depth delivery to be enabled?


I was planning to support all the iPhones supported by iOS 13, such as the iPhone SE (first generation), which has only one camera on the back. With this code, I currently cannot get the cameraCalibrationData on that device.


My program also uses AVCaptureVideoDataOutput. When I use it on an iPhone SE, I'm able to retrieve the camera matrix through the kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix attachment on the sample buffer. Is there a way to retrieve the distortion as well?
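For reference, this is roughly how I read the intrinsics from the sample buffer (a minimal sketch; videoDataOutput is my AVCaptureVideoDataOutput, assumed to be already added to the session):

    // During session configuration, after adding the AVCaptureVideoDataOutput:
    AVCaptureConnection *connection = [videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.isCameraIntrinsicMatrixDeliverySupported) {
        connection.cameraIntrinsicMatrixDeliveryEnabled = YES;
    }

    // In the AVCaptureVideoDataOutputSampleBufferDelegate callback:
    - (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        // The 3x3 intrinsic matrix is attached as a CFData wrapping a matrix_float3x3.
        CFTypeRef attachment = CMGetAttachment(sampleBuffer,
            kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, NULL);
        if (attachment != NULL) {
            matrix_float3x3 K;
            CFDataGetBytes((CFDataRef)attachment, CFRangeMake(0, sizeof(K)), (UInt8 *)&K);
            NSLog(@"fx=%f fy=%f cx=%f cy=%f",
                  K.columns[0][0], K.columns[1][1], K.columns[2][0], K.columns[2][1]);
        }
    }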


And finally, the last question: I was planning to do some 3D reconstruction from the synchronized pictures. Do you think the extrinsic matrix given by the phone is accurate enough? The two cameras are so close together (about 10 mm) that a small error in the translation vector could have a significant impact on the true scale of the measured object.
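To put a rough number on it: in a simple stereo model the depth is z ≈ f * B / d (focal length times baseline over disparity), so the reconstructed scale is proportional to the assumed baseline. With a ~10 mm baseline, an error of only 0.5 mm in the translation vector would already shift the scale of the whole reconstruction by about 5%.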

May I ask which device you are running on, and what the device.deviceType of the AVCaptureDeviceInput connected to your AVCapturePhotoOutput is?

I'm using an iPhone 8 Plus and self.videoDeviceInput.device.deviceType is AVCaptureDeviceTypeBuiltInDualCamera (made of tele and wide).


Today someone lent me an iPhone 11 Pro Max.


If I use AVCaptureDeviceTypeBuiltInTripleCamera, the property self.photoOutput.cameraCalibrationDataDeliverySupported returns NO.


If I use AVCaptureDeviceTypeBuiltInDualWideCamera, the property self.photoOutput.cameraCalibrationDataDeliverySupported returns NO.


If I use AVCaptureDeviceTypeBuiltInDualCamera, the property self.photoOutput.cameraCalibrationDataDeliverySupported returns YES.

In this last mode, whether I set depthDataDeliveryEnabled to NO or YES, I get valid cameraCalibrationData.


It seems that there are several problems:

1) With my iPhone 8 Plus (AVCaptureDeviceTypeBuiltInDualCamera), if I don't enable depthDataDelivery, no cameraCalibrationData is available.

2) With the iPhone 11 Pro, some configurations cannot deliver cameraCalibrationData.

Did you get time to check why no cameraCalibrationData is available in some cases?

For information, I'm using an AVCaptureSession, not an AVCaptureMultiCamSession.

It looks like there's an error being thrown when cameraCalibrationDataDeliveryEnabled is set to true while using AVCaptureDevice.DeviceType.builtInTripleCamera. Any idea why that might be happening? Is there any way to get the camera calibration data using the triple camera on the iPhone 11 Pro? Thank you.

When I try to use the ultra wide camera, nothing works.

Only the tele and wide cameras work on my side.

Have you disabled geometric distortion correction on the AVCaptureDevice?

I've set device.isGeometricDistortionCorrectionEnabled to false and photoSettings.isCameraCalibrationDataDeliveryEnabled to true using .builtInTripleCamera, and I'm getting the following error: 'settings.cameraCalibrationDataDeliveryEnabled may not be set to YES unless self.cameraCalibrationDataDeliverySupported is YES'.

You're right: geometric distortion correction is enabled by default on the ultra wide camera, which prevents the delivery of the calibration data.

How can we retrieve the camera calibration when we use only the AVCaptureDeviceTypeBuiltInWideAngleCamera?

The following code works for me with AVCaptureDeviceTypeBuiltInTripleCamera on a Pro Max. The problem was geometric distortion correction being enabled by default.


    // Geometric distortion correction must be disabled on the device, otherwise
    // calibration data delivery is not supported when the ultra wide camera is
    // part of the virtual device.
    NSError *error = nil;
    if ([videoDevice lockForConfiguration:&error]) {
        [videoDevice setGeometricDistortionCorrectionEnabled:NO];
        [videoDevice unlockForConfiguration];
    }

#if defined(USE_AV_CAPTURE_PHOTO_OUTPUT)
    AVCapturePhotoOutput* photoOutput = [[AVCapturePhotoOutput alloc] init];
    if ([self.session canAddOutput:photoOutput]) {
        [self.session addOutput:photoOutput];
        self.photoOutput = photoOutput;

        self.photoOutput.highResolutionCaptureEnabled = NO;
        self.photoOutput.livePhotoCaptureEnabled = NO;
        if ([videoDevice isVirtualDevice]) {
            BOOL supported = self.photoOutput.virtualDeviceConstituentPhotoDeliverySupported;
            NSLog(@"* virtualDeviceConstituentPhotoDeliverySupported %d", supported);
            self.photoOutput.virtualDeviceConstituentPhotoDeliveryEnabled = supported;
        }

        BOOL depthDataDeliverySupported = self.photoOutput.depthDataDeliverySupported;
        //self.photoOutput.depthDataDeliveryEnabled = depthDataDeliverySupported;
        BOOL cameraCalibrationDataDeliverySupported = self.photoOutput.cameraCalibrationDataDeliverySupported;

        NSLog(@"Constituent %@", [[self.videoDeviceInput device] constituentDevices]);
        NSLog(@"* depthDataDeliverySupported %d", depthDataDeliverySupported);
        NSLog(@"* depthDataDeliveryEnabled %d", self.photoOutput.depthDataDeliveryEnabled);
        NSLog(@"* cameraCalibrationDataDeliverySupported %d", cameraCalibrationDataDeliverySupported);

        // Note: cameraCalibrationDataDeliverySupported is read-only on the output;
        // the opt-in happens per capture on AVCapturePhotoSettings.
        // self.photoOutput.cameraCalibrationDataDeliverySupported = YES;
    }
#endif
    NSLog(@"Geom: %d %d", [videoDevice isGeometricDistortionCorrectionSupported], [videoDevice isGeometricDistortionCorrectionEnabled]);

Actually I was just disabling the geometric distortion correction in the wrong place. I'm getting all three camera calibration values now. Thanks!