It seems like the generated video is not streamed to disk but rather kept in memory and written to disk all at once. How can I force it to stream the data to disk while the encoding is happening?
Btw. my app itself consistently needs around 300 MB of memory, so I don't think I have a memory leak here.
Here is the relevant code:
```swift
func analyse() {
    self.videoWritter = try! AVAssetWriter(outputURL: outputVideo, fileType: AVFileType.mp4)

    let writeSettings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: videoSize.width,
        AVVideoHeightKey: videoSize.height,
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: 10_000_000,
        ]
    ]

    self.videoWritter!.movieFragmentInterval = CMTimeMake(value: 60, timescale: 1)
    self.frameInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: writeSettings)
    self.frameInput?.expectsMediaDataInRealTime = true
    self.videoWritter!.add(self.frameInput!)

    if self.videoWritter!.startWriting() == false {
        print("Could not write file: \(self.videoWritter!.error!)")
        return
    }
}

func writeFrame(frame: Frame) {
    /* some more code to determine the correct frame presentation time stamp */

    let newSampleBuffer = self.setTimeStamp(frame.sampleBuffer, newTime: self.nextFrameStartTime!)
    self.frameInput!.append(newSampleBuffer)

    /* some more code */
}

func setTimeStamp(_ sample: CMSampleBuffer, newTime: CMTime) -> CMSampleBuffer {
    var timing: CMSampleTimingInfo = CMSampleTimingInfo(
        duration: CMTime.zero,
        presentationTimeStamp: newTime,
        decodeTimeStamp: CMTime.zero
    )
    var newSampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateCopyWithNewTiming(
        allocator: kCFAllocatorDefault,
        sampleBuffer: sample,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleBufferOut: &newSampleBuffer
    )
    return newSampleBuffer!
}
```
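One thing the excerpt above doesn't show is finalizing the writer: until `finishWriting` runs, the final fragment and the movie header have not been written out. A minimal sketch of such a finishing step, assuming the same `videoWritter` and `frameInput` properties as above (the function name and completion parameter are mine, not from the original code):

```swift
import AVFoundation

// Hypothetical finishing step for the pipeline above. Marking the input as
// finished signals that no further sample buffers will be appended; the
// writer then flushes any remaining data and completes the output file.
func finishAnalysis(completion: @escaping () -> Void) {
    self.frameInput?.markAsFinished()
    self.videoWritter?.finishWriting {
        if self.videoWritter?.status != .completed {
            print("Writing failed: \(String(describing: self.videoWritter?.error))")
        }
        completion()
    }
}
```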
My specs:
MacBook Pro 2018
32GB Memory
macOS Big Sur 11.1
1.) The kernel_task:
The following code contains the memory leak:
```swift
let croppedImage = frame.image.cropped(to: CGRect(
    x: Int(Int(videoSize.width) / 2 - (self.RECOGNITION_WIDTH / 2)),
    y: Int(self.videoSize.height - CGFloat(self.RECOGNITION_HEIGHT)),
    width: self.RECOGNITION_WIDTH,
    height: self.RECOGNITION_HEIGHT
))
```
Apparently CIImage.cropped(to:) leads to the original CIImage not being cleaned up. According to my Google search I should have called CGImageRelease(_:), but that function is unavailable in Swift 5.1, since Core Foundation objects are now memory managed. (At least that is what the Xcode error message says.)
So instead I render the CIImage into a CGImage of just the crop rect and turn that back into a CIImage. I know this is not efficient, but for me that's fine. Here is the new code:
```swift
import Foundation
import CoreImage

extension CIContext {
    func cropWithoutMemoryLeak(_ image: CIImage, to rect: CGRect) -> CIImage {
        let newImage = self.createCGImage(image, from: rect)
        return CIImage(cgImage: newImage!)
    }
}
```
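A hypothetical call site for that extension might look like the following. The sizes here are placeholders (in the original code they come from the video being analysed), and the image is a synthetic solid color; note that one CIContext is reused rather than created per frame, since contexts are expensive to create:

```swift
import CoreImage

// Placeholder values standing in for the real video/recognition sizes.
let videoSize = CGSize(width: 1920, height: 1080)
let RECOGNITION_WIDTH = 400
let RECOGNITION_HEIGHT = 200

// Reuse a single context for all frames instead of creating one per crop.
let sharedContext = CIContext()

let cropRect = CGRect(
    x: Int(videoSize.width) / 2 - RECOGNITION_WIDTH / 2,
    y: Int(videoSize.height) - RECOGNITION_HEIGHT,
    width: RECOGNITION_WIDTH,
    height: RECOGNITION_HEIGHT
)

// Synthetic stand-in for `frame.image`: a solid red image of the video size.
let sourceImage = CIImage(color: CIColor.red)
    .cropped(to: CGRect(origin: .zero, size: videoSize))
let croppedImage = sharedContext.cropWithoutMemoryLeak(sourceImage, to: cropRect)
```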
2.) The VTDecoderXPCService:
I was able to reliably remove the memory leak by commenting out this line, which obviously defeats the point of creating a new video:
```swift
self.frameInput!.append(newSampleBuffer)
```
After trying some more stuff I removed some of the debug print statements I had throughout my code, and all of a sudden the leak was gone...? 🤷♀️ So I don't really know what the problem was. Removing the debug lines was most likely a correlation, not the cause. I guess it triggered a recompile of previously cached files, but admittedly I have no idea what the problem was.