I am having trouble with a specialized camera application and would appreciate your advice.
I am implementing a 4K video capture pipeline with the AVFoundation framework. The app does not record video in the usual sense; instead, a film camera's shutter mechanism is attached in front of the iPhone camera, and still images are acquired when the film camera's shutter is released.
For details, please see the following website:
https://www.digi-swap.com/
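
For reference, the capture pipeline is set up roughly as follows. This is only a simplified sketch: the class name FrameGrabber is a placeholder, error handling is omitted, and the actual app differs in detail.

```swift
import AVFoundation

// Simplified sketch of the 4K capture pipeline (placeholder names, no error handling).
final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // The frame delivered while the film camera's shutter is open is the
        // one kept as the "still image".
    }
}

let session = AVCaptureSession()
session.sessionPreset = .hd4K3840x2160

if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)
}

let grabber = FrameGrabber()
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.setSampleBufferDelegate(grabber, queue: DispatchQueue(label: "camera.frames"))
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}

session.startRunning()
```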
Until the film camera's shutter is released, the iPhone camera sees a completely black image. When the film camera's shutter is released at, for example, 1/30 s, the light coming through the film camera's lens falls on the iPhone's image sensor for 1/30 s, and that exposure is acquired as a still image.
Naturally, there is not enough light, so the iPhone camera's aperture is opened to its maximum, a shutter speed of 1 s or longer is used, and focus is locked.
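
The camera settings are locked roughly like this (again a simplified sketch: `device` is the AVCaptureDevice added to the session, and the exact values in the app differ):

```swift
import AVFoundation

// Sketch of the manual settings described above: the longest exposure the device
// allows, a high ISO to compensate for the low light, and a locked focus.
func lockCameraSettings(on device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    let duration = device.activeFormat.maxExposureDuration   // about 1 s on recent iPhones
    let iso = device.activeFormat.maxISO
    device.setExposureModeCustom(duration: duration, iso: iso, completionHandler: nil)

    if device.isFocusModeSupported(.locked) {
        device.focusMode = .locked
    }
}
```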
When taking a picture under these conditions, the center of the still image is relatively bright, but the periphery is dark.
My guess is that the cause is the short ramp-up time the iPhone camera has when going from a completely black scene to receiving light for only 1/30 s. In other words, the camera may not have enough time to capture a clean image.
I would like to confirm with the engineers working on the camera stack at Apple whether my understanding is correct. Any feedback would be greatly appreciated.
Best regards,