An AVCaptureSession with an AVCaptureVideoDataOutput delivers video frames from the camera.
OpenGL textures are created from the CVPixelBuffers using a CVOpenGLESTextureCache.
Some OpenGL-based image processing (with many intermediate steps) is performed on the frames on a separate dispatch queue.
The final texture of the processing pipeline is rendered into a CAEAGLLayer on the main thread (with proper context and share group handling).
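For context, here is a minimal sketch of how such a pipeline is typically wired up. Everything in it (class name, queue label, BGRA pixel format, the back wide-angle camera) is illustrative rather than taken from my actual code, and error handling, the rendering pass, and context/share-group management are elided:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/EAGL.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

@interface CaptureProcessor : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@end

@implementation CaptureProcessor {
    AVCaptureSession *_session;
    dispatch_queue_t _processingQueue;
    CVOpenGLESTextureCacheRef _textureCache;
}

- (void)startCaptureWithContext:(EAGLContext *)glContext {
    _processingQueue = dispatch_queue_create("video.processing", DISPATCH_QUEUE_SERIAL);
    // The texture cache is tied to the GL context used by the processing queue.
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, glContext, NULL, &_textureCache);

    _session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
                                           mediaType:AVMediaTypeVideo
                                            position:AVCaptureDevicePositionBack];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];

    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self queue:_processingQueue];

    [_session addInput:input];
    [_session addOutput:output];
    [_session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Wrap the pixel buffer in a GL texture without copying the pixel data.
    CVOpenGLESTextureRef texture = NULL;
    CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, _textureCache, pixelBuffer, NULL,
        GL_TEXTURE_2D, GL_RGBA,
        (GLsizei)CVPixelBufferGetWidth(pixelBuffer),
        (GLsizei)CVPixelBufferGetHeight(pixelBuffer),
        GL_BGRA_EXT, GL_UNSIGNED_BYTE, 0, &texture);

    glBindTexture(CVOpenGLESTextureGetTarget(texture), CVOpenGLESTextureGetName(texture));
    // ... run the processing pipeline and hand the result to the render layer ...
    CFRelease(texture);
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}

@end
```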
Some observations:
The AVCaptureSession is still running (isRunning is true, isInterrupted is false).
All connections between the camera device and output are still there and active.
The capture indicator (green circle in the status bar, new in iOS 14) is still there.
The output's delegate does not report any frame drops.
When I perform an action that causes the session to be re-configured (like switching to the front camera, as sketched below), the output starts delivering frames again for about four seconds and then stops again.
When I don't process and display the frames, the output continues to deliver frames without interruption.
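For reference, the camera switch that triggers this is the standard begin/commit reconfiguration pattern. A sketch, assuming the session and its current input are kept in ivars (_session as in the sketch above; _currentInput is hypothetical):

```objc
// Hypothetical camera switch; assumes _session and _currentInput ivars.
- (void)switchToCamera:(AVCaptureDevice *)newCamera {
    AVCaptureDeviceInput *newInput =
        [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:nil];
    if (newInput == nil) { return; }

    [_session beginConfiguration];
    [_session removeInput:_currentInput];
    if ([_session canAddInput:newInput]) {
        [_session addInput:newInput];
        _currentInput = newInput;
    } else {
        // Roll back so the session keeps a valid input.
        [_session addInput:_currentInput];
    }
    [_session commitConfiguration];
}
```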
I was reading the attachments (metadata) of the incoming sample buffers and accidentally leaked the dictionary due to incorrect bridging.
So instead of this:

```objc
NSDictionary* attachments = (__bridge NSDictionary* _Nullable)CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
```

I should have done this:

```objc
NSDictionary* attachments = (__bridge_transfer NSDictionary* _Nullable)CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
```
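The reason: CMCopyDictionaryOfAttachments follows Core Foundation's Create Rule (the "Copy" in the name means the caller owns the returned +1 reference). A plain __bridge cast hides that ownership from ARC, so the dictionary is never released and leaks once per frame. __bridge_transfer hands the reference over to ARC; an equivalent and arguably more readable spelling is CFBridgingRelease:

```objc
// CFBridgingRelease transfers the +1 CF reference to ARC.
NSDictionary *attachments = CFBridgingRelease(
    CMCopyDictionaryOfAttachments(NULL, sampleBuffer, kCMAttachmentMode_ShouldPropagate));
```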
Interestingly, this leak caused the capture session to stop delivering new sample buffers after 126 frames—without any warning, error, or notification.