Thanks for answering me.
I did what you advised and it worked 🙂. I get the results I want directly in the cameraCalibrationData property of AVCapturePhoto for both pictures. But I noticed something strange.
What I'm trying to do here is simply take synchronized pictures with the associated intrinsics, distortion, and extrinsics data for each frame.
So I'm only interested in the cameraCalibrationData property of AVCapturePhoto.
If I do not set the depthDataDeliveryEnabled property of AVCapturePhotoOutput to YES, I do not get valid cameraCalibrationData on the AVCapturePhoto (it is nil). I assume that when this property is set to YES, the phone has to do some additional processing. I don't want to waste resources on data I don't need, since I have to do my own processing once I get the two frames. Or maybe it is my code.
Here is the code where I configure the AVCapturePhotoOutput:
AVCapturePhotoOutput *photoOutput = [[AVCapturePhotoOutput alloc] init];
if ([self.session canAddOutput:photoOutput]) {
    [self.session addOutput:photoOutput];
    self.photoOutput = photoOutput;
    self.photoOutput.highResolutionCaptureEnabled = YES;
    self.photoOutput.livePhotoCaptureEnabled = NO;
    if ([videoDevice isVirtualDevice]) {
        self.photoOutput.virtualDeviceConstituentPhotoDeliveryEnabled = YES;
    }
    BOOL depthDataDeliverySupported = self.photoOutput.depthDataDeliverySupported;
    self.photoOutput.depthDataDeliveryEnabled = depthDataDeliverySupported;
    NSLog(@"depthDataDeliverySupported %d", depthDataDeliverySupported);
    NSLog(@"depthDataDeliveryEnabled %d", self.photoOutput.depthDataDeliveryEnabled);
    BOOL cameraCalibrationDataDeliverySupported = self.photoOutput.cameraCalibrationDataDeliverySupported;
    NSLog(@"cameraCalibrationDataDeliverySupported %d", cameraCalibrationDataDeliverySupported);
}
and here is where I configure the AVCapturePhotoSettings and trigger the capture:
NSDictionary *dict = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA),
};
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettingsWithFormat:dict];
AVCaptureDevice *device = self.videoDeviceInput.device;
if ([device isVirtualDevice]) {
    // We have several constituent cameras
    photoSettings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = [device constituentDevices];
    photoSettings.depthDataDeliveryEnabled = self.photoOutput.isDepthDataDeliveryEnabled;
    NSLog(@"Virtual %@ %d",
          photoSettings.virtualDeviceConstituentPhotoDeliveryEnabledDevices,
          photoSettings.depthDataDeliveryEnabled);
}
photoSettings.cameraCalibrationDataDeliveryEnabled = self.photoOutput.cameraCalibrationDataDeliverySupported;
NSLog(@"cameraCalibrationDataDeliveryEnabled %d", photoSettings.cameraCalibrationDataDeliveryEnabled);
if (self.videoDeviceInput.device.isFlashAvailable) {
    photoSettings.flashMode = AVCaptureFlashModeOff;
}
photoSettings.highResolutionPhotoEnabled = YES;
if (photoSettings.availablePreviewPhotoPixelFormatTypes.count > 0) {
    photoSettings.previewPhotoFormat = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : photoSettings.availablePreviewPhotoPixelFormatTypes.firstObject };
}
//photoSettings.photoQualityPrioritization = AVCapturePhotoQualityPrioritizationSpeed;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:self.capturePhotoDelegate];
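For completeness, this is (a simplified version of) how I read the calibration data in the delegate once the capture completes; variable names are just for illustration:

```objc
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    AVCameraCalibrationData *calibration = photo.cameraCalibrationData;
    if (calibration != nil) {
        // Intrinsics: fx, fy on the diagonal, principal point in the last column
        matrix_float3x3 K = calibration.intrinsicMatrix;
        // Extrinsics: [R|t] matrix relating this camera to the reference camera
        matrix_float4x3 E = calibration.extrinsicMatrix;
        // Radial distortion, given as a lookup table of magnification values
        NSData *distortionLUT = calibration.lensDistortionLookupTable;
        CGPoint distortionCenter = calibration.lensDistortionCenter;
        NSLog(@"fx=%f fy=%f", K.columns[0][0], K.columns[1][1]);
    } else {
        NSLog(@"cameraCalibrationData is nil");
    }
}
```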
Can you please tell me whether I've done something wrong, or whether the system really requires depth data delivery to be enabled in order to deliver the camera calibration data?
I was planning to support all iPhones supported by iOS 13, such as the iPhone SE (first generation), which has only one camera at the back. With this code I currently cannot get the cameraCalibrationData on that device.
My program also uses AVCaptureVideoDataOutput. When I use it on an iPhone SE I am able to retrieve the camera matrix through the kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix attachment on the sample buffer. Is there a way to retrieve the distortion as well?
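For reference, this is roughly how I read the intrinsic matrix attachment in my sample buffer callback (after enabling cameraIntrinsicMatrixDeliveryEnabled on the AVCaptureConnection); error handling is trimmed:

```objc
// In captureOutput:didOutputSampleBuffer:fromConnection:
CFTypeRef attachment = CMGetAttachment(sampleBuffer,
                                       kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
                                       NULL);
if (attachment != NULL) {
    // The attachment is a CFData wrapping a matrix_float3x3
    matrix_float3x3 cameraMatrix;
    CFDataGetBytes((CFDataRef)attachment,
                   CFRangeMake(0, sizeof(cameraMatrix)),
                   (UInt8 *)&cameraMatrix);
    // Column-major: fx = columns[0][0], fy = columns[1][1],
    //               cx = columns[2][0], cy = columns[2][1]
}
```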
And finally, the last question: I was planning to do some 3D reconstruction from the synchronized pictures. Do you think the extrinsic matrix given by the phone is accurate enough? The two cameras are so close together (about 10 mm) that a small error in the translation vector may have a significant impact on the true scale of the measured object.
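To illustrate my scale concern: I would recover the baseline from the translation part of the extrinsic matrix, roughly like this (a sketch, variable names are mine):

```objc
// calibration is the AVCameraCalibrationData of the non-reference camera
matrix_float4x3 extrinsics = calibration.extrinsicMatrix;
// Per the documentation, the last column is the translation vector in millimeters
simd_float3 t = extrinsics.columns[3];
float baselineMm = simd_length(t);
// With a ~10 mm baseline, a 0.5 mm error already means ~5% scale error
```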