Is it possible to get the camera intrinsic matrix for a single captured photo on iOS?
I know that one can get the `cameraCalibrationData` from an `AVCapturePhoto`, which also contains the `intrinsicMatrix`. However, this is only provided when using a virtual (multi-camera) capture device and setting `virtualDeviceConstituentPhotoDeliveryEnabledDevices` to multiple constituent devices (or enabling `isDualCameraDualPhotoDeliveryEnabled` on older iOS versions). `photoOutput(_:didFinishProcessingPhoto:error:)` is then called once per specified camera, and each of those photos contains the calibration data.
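
For context, this is roughly the setup I mean, as a sketch rather than tested code; the helper name `configureAndCapture` is mine, and the session/device wiring is assumed to exist elsewhere:

```swift
import AVFoundation

// Sketch: requesting calibration data via constituent photo delivery.
// Assumes a running AVCaptureSession whose input is a virtual device
// such as .builtInDualCamera.
final class CalibrationCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Called once per constituent camera; each delivered photo
        // carries its own calibration data.
        if let calibration = photo.cameraCalibrationData {
            let intrinsics = calibration.intrinsicMatrix // matrix_float3x3, column-major
            print("fx:", intrinsics.columns.0.x, "fy:", intrinsics.columns.1.y)
        }
    }
}

func configureAndCapture(device: AVCaptureDevice,
                         photoOutput: AVCapturePhotoOutput,
                         delegate: AVCapturePhotoCaptureDelegate) {
    // Constituent delivery must be enabled on the output before it can
    // be requested per capture.
    photoOutput.isVirtualDeviceConstituentPhotoDeliveryEnabled =
        photoOutput.isVirtualDeviceConstituentPhotoDeliverySupported

    let settings = AVCapturePhotoSettings()
    settings.virtualDeviceConstituentPhotoDeliveryEnabledDevices =
        device.constituentDevices
    settings.isCameraCalibrationDataDeliveryEnabled =
        photoOutput.isCameraCalibrationDataDeliverySupported
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```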
As far as I know, there is no way to get the calibration data for a normal, single-camera photo capture.
I also found that one can set `isCameraIntrinsicMatrixDeliveryEnabled` on a capture connection that leads to an `AVCaptureVideoDataOutput`. The sample buffers arriving at that output's delegate then carry the intrinsic matrix in their `kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix` attachment. However, this requires adding another output to the capture session, which feels quite wasteful just to obtain one piece of metadata. I would also somehow need to figure out which buffer was temporally closest to the moment the actual photo was taken.
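
Roughly like this, again only a sketch; `enableIntrinsicDelivery` is my own helper, and the session setup is assumed:

```swift
import AVFoundation
import CoreMedia
import simd

func enableIntrinsicDelivery(on videoOutput: AVCaptureVideoDataOutput) {
    // The connection only exists once the output has been added to the
    // session, so this must run after session configuration.
    if let connection = videoOutput.connection(with: .video),
       connection.isCameraIntrinsicMatrixDeliverySupported {
        connection.isCameraIntrinsicMatrixDeliveryEnabled = true
    }
}

final class IntrinsicsDelegate: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // The attachment is a CFData wrapping a matrix_float3x3.
        guard let data = CMGetAttachment(
            sampleBuffer,
            key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
            attachmentModeOut: nil) as? Data else { return }
        var intrinsics = matrix_float3x3()
        _ = withUnsafeMutableBytes(of: &intrinsics) { data.copyBytes(to: $0) }
        // To associate a buffer with the photo, I would have to compare
        // this against AVCapturePhoto.timestamp.
        let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        print(time.seconds, intrinsics)
    }
}
```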
Is there a better, simpler way to get the camera intrinsic matrix for a single photo capture?
If not, is there a way to calculate the matrix based on the image's metadata?
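
For reference, the kind of metadata-based estimate I have in mind would derive the focal length in pixels from the 35mm-equivalent focal length EXIF tag (`kCGImagePropertyExifFocalLenIn35mmFilm` in ImageIO), assuming square pixels, a principal point at the image center, and no distortion; I don't know how accurate this would actually be:

```swift
import simd

// Rough estimate of K from EXIF metadata. The 35mm-equivalent focal
// length is defined via the 43.27mm diagonal of a 36x24mm frame, so it
// is scaled here by the image diagonal in pixels.
func estimatedIntrinsics(focalLength35mm: Double,
                         imageWidth: Double,
                         imageHeight: Double) -> simd_double3x3 {
    let diagonal = (imageWidth * imageWidth + imageHeight * imageHeight).squareRoot()
    let focalPixels = focalLength35mm * diagonal / 43.27
    // K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]], built column by column.
    return simd_double3x3(columns: (
        SIMD3(focalPixels, 0, 0),
        SIMD3(0, focalPixels, 0),
        SIMD3(imageWidth / 2, imageHeight / 2, 1)
    ))
}
```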