I have built a camera application that uses an AVCaptureSession with the AVCaptureDevice set to .builtInDualWideCamera and isVirtualDeviceConstituentPhotoDeliveryEnabled = true, to enable delivery of "simultaneous" photos (AVCapturePhoto) from a single capture request.
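
For reference, the setup looks roughly like the sketch below. This is a simplified version, not the exact app code: the helper names (`makeDualWideSession`, `captureConstituentPhotos`) are just illustrative, and error handling, permissions, and threading are omitted.

```swift
import AVFoundation

// Simplified sketch: configure a session on the dual-wide virtual device
// and enable constituent photo delivery on the photo output.
func makeDualWideSession() -> (AVCaptureSession, AVCapturePhotoOutput, AVCaptureDevice)? {
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: device) else { return nil }

    let session = AVCaptureSession()
    session.beginConfiguration()
    session.sessionPreset = .hd1920x1080

    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let photoOutput = AVCapturePhotoOutput()
    guard session.canAddOutput(photoOutput) else { return nil }
    session.addOutput(photoOutput)

    // Constituent photo delivery has to be enabled on the output before it
    // can be requested per capture via AVCapturePhotoSettings.
    if photoOutput.isVirtualDeviceConstituentPhotoDeliverySupported {
        photoOutput.isVirtualDeviceConstituentPhotoDeliveryEnabled = true
    }

    session.commitConfiguration()
    return (session, photoOutput, device)
}

// Simplified sketch: one capture request that asks for a photo from each
// constituent camera (wide and ultra-wide on .builtInDualWideCamera).
func captureConstituentPhotos(photoOutput: AVCapturePhotoOutput,
                              device: AVCaptureDevice,
                              delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    settings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = device.constituentDevices
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```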
Ideally, the timestamp difference between the photos in a single capture request would be as small as possible, but we don't have a good sense of the theoretical or practical limits on this difference.
In my testing on an iPhone 12 Pro, with a frame rate of 33 Hz and the session preset set to hd1920x1080, the timestamp difference between the photos is approximately 0.3 ms, which is smaller than I would expect unless the frames are being synchronised incredibly well under the hood.
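
That ~0.3 ms figure comes from comparing the AVCapturePhoto.timestamp values delivered to the capture delegate, roughly like the sketch below (simplified; the `PhotoTimingDelegate` class name is just illustrative).

```swift
import AVFoundation

// Simplified sketch of how the timestamp spread is measured: collect the
// AVCapturePhoto.timestamp of each constituent photo, then compare them
// once the whole capture request has finished.
final class PhotoTimingDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    private var timestamps: [CMTime] = []

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Called once per constituent photo in the same capture request.
        timestamps.append(photo.timestamp)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings,
                     error: Error?) {
        guard let earliest = timestamps.min(by: { $0.seconds < $1.seconds }),
              let latest = timestamps.max(by: { $0.seconds < $1.seconds }) else { return }
        let deltaMs = (latest.seconds - earliest.seconds) * 1000
        print("Timestamp spread across constituent photos: \(deltaMs) ms")
        timestamps.removeAll()
    }
}
```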
This leaves the following unanswered questions:
- What range of values should we expect for this timestamp difference between photos?
- What factors influence this difference?
- Is there any way to control these values to keep them as small as possible? (Likely follows from the answer to the previous question.)