I have been allocating pixel buffers from a CVPixelBufferPool, with code adapted from various older Apple samples such as RosyWriter. I see that direct APIs such as CVPixelBufferCreate are highly performant and rarely cause frame drops, whereas allocating from the pixel buffer pool regularly causes frame drops. Is this a known issue, or am I using the API incorrectly?
Here is the code for creating pixel buffer pool:
    private func createPixelBufferPool(_ width: Int32, _ height: Int32, _ pixelFormat: FourCharCode, _ maxBufferCount: Int32) -> CVPixelBufferPool? {
        var outputPool: CVPixelBufferPool? = nil
        let sourcePixelBufferOptions: NSDictionary = [
            kCVPixelBufferPixelFormatTypeKey: pixelFormat,
            kCVPixelBufferWidthKey: width,
            kCVPixelBufferHeightKey: height,
            kCVPixelBufferOpenGLESCompatibilityKey: true, // buffer-level attribute key (not the pixel-format description key)
            kCVPixelBufferIOSurfacePropertiesKey: [:] as CFDictionary
        ]
        let pixelBufferPoolOptions: NSDictionary = [kCVPixelBufferPoolMinimumBufferCountKey: maxBufferCount]
        CVPixelBufferPoolCreate(kCFAllocatorDefault, pixelBufferPoolOptions, sourcePixelBufferOptions, &outputPool)
        return outputPool
    }
    private func createPixelBufferPoolAuxAttributes(_ maxBufferCount: size_t) -> NSDictionary {
        // CVPixelBufferPoolCreatePixelBufferWithAuxAttributes() will return kCVReturnWouldExceedAllocationThreshold
        // if we have already vended the max number of buffers
        return [kCVPixelBufferPoolAllocationThresholdKey: maxBufferCount]
    }
    private func preallocatePixelBuffersInPool(_ pool: CVPixelBufferPool, _ auxAttributes: NSDictionary) {
        // Preallocate buffers in the pool, since this is for real-time display/capture
        var pixelBuffers: [CVPixelBuffer] = []
        while true {
            var pixelBuffer: CVPixelBuffer? = nil
            let err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, pool, auxAttributes, &pixelBuffer)
            if err == kCVReturnWouldExceedAllocationThreshold {
                break
            }
            assert(err == noErr)
            pixelBuffers.append(pixelBuffer!)
        }
        pixelBuffers.removeAll()
    }
And here is the usage:
    bufferPool = createPixelBufferPool(outputDimensions.width, outputDimensions.height, outputPixelFormat, Int32(maxRetainedBufferCount))
    if bufferPool == nil {
        NSLog("Problem initializing a buffer pool.")
        success = false
        break bail
    }
    bufferPoolAuxAttributes = createPixelBufferPoolAuxAttributes(maxRetainedBufferCount)
    preallocatePixelBuffersInPool(bufferPool!, bufferPoolAuxAttributes!)
And here is how pixel buffers are created from the pool:
    err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, bufferPool!, bufferPoolAuxAttributes, &dstPixelBuffer)
    if err == kCVReturnWouldExceedAllocationThreshold {
        // Flush the texture cache to potentially release the retained buffers,
        // then try again to create a pixel buffer
        err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(kCFAllocatorDefault, bufferPool!, bufferPoolAuxAttributes, &dstPixelBuffer)
    }
    if err != 0 {
        if err == kCVReturnWouldExceedAllocationThreshold {
            NSLog("Pool is out of buffers, dropping frame")
        } else {
            NSLog("Error at CVPixelBufferPoolCreatePixelBuffer %d", err)
        }
        break bail
    }
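The retry comment above mentions flushing the texture cache, but the flush call itself is not shown. For context, it is based on the RosyWriter pattern and looks roughly like this sketch, assuming a CVMetalTextureCache stored in a property named textureCache (the name is illustrative; the OpenGL ES variant uses CVOpenGLESTextureCacheFlush instead):

```swift
import CoreVideo

// Sketch: release buffers retained by a texture cache before retrying
// allocation from the pool. Assumes a `textureCache` property of type
// CVMetalTextureCache? (illustrative name, not from the code above).
private func flushTextureCache() {
    if let cache = textureCache {
        // Drops the cache's internal references to recently vended textures,
        // which can return their backing pixel buffers to the pool.
        CVMetalTextureCacheFlush(cache, 0)
    }
}
```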
When used with AVAssetWriter, I see a lot of frame drops caused by the kCVReturnWouldExceedAllocationThreshold error. No frame drops occur when I allocate pixel buffers directly, without a pool:

    CVPixelBufferCreate(kCFAllocatorDefault, Int(dimensions.width), Int(dimensions.height), outputPixelFormat, sourcePixelBufferOptions, &dstPixelBuffer)

What could be the cause?