Core Media basics, understanding CMSampleBuffer, CMBlockBuffer and CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer

So, I'm trying to learn the basics of Core Media, since I need to process real-time audio samples in my app. So far I understand that I need to configure an AVCaptureSession with an AVCaptureDevice used to acquire samples and an AVCaptureAudioDataOutput that processes the input from the device and "notifies" an AVCaptureAudioDataOutputSampleBufferDelegate through a captureOutput(...) method.
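
For context, here's a minimal sketch of that pipeline as I understand it (the class and queue names are mine, and error handling is omitted):

import AVFoundation

// Sketch of a capture pipeline: device -> session -> audio data output -> delegate.
final class AudioCapture: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "audio.capture.queue")

    func start() throws {
        guard let device = AVCaptureDevice.default(for: .audio) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureAudioDataOutput()
        output.setSampleBufferDelegate(self, queue: queue) // callbacks arrive on `queue`
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Called once per batch of captured samples.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Process sampleBuffer here.
    }
}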

Now, this method gets passed the samples as a CMSampleBuffer object, which, according to Apple's Core Media documentation, will contain zero or more media (audio, in my case) samples and a CMBlockBuffer, which is described as

[...] a CFType object that represents a contiguous range of data offsets [...] across a possibly noncontiguous memory region.

OK, so this is where it gets confusing. I'm not a native speaker and I'm struggling to understand what this is supposed to mean. Why do I need this to access my samples? Aren't they stored as an array of raw binary data (and therefore homogeneous and contiguous)? I guess this is related to how the underlying memory is managed by Core Media, but I can't figure it out.

Also, the latest batch of samples is accessed through the CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer function, which expects an unsafe mutable pointer to an AudioBufferList and one to an optional CMBlockBuffer. The first is filled with pointers into the latter, and then I may (or may not) be able to access the samples through myAudioBufferList.mBuffers.mData, which might be nil.

Example code from Apple's developer code snippets:

public func captureOutput(_ output: AVCaptureOutput,
                          didOutput sampleBuffer: CMSampleBuffer,
                          from connection: AVCaptureConnection) {

    var audioBufferList = AudioBufferList()
    var blockBuffer: CMBlockBuffer?

    // Fill audioBufferList with pointers into blockBuffer; holding on to
    // blockBuffer is what keeps the underlying sample memory alive.
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        bufferListSizeNeededOut: nil,
        bufferListOut: &audioBufferList,
        bufferListSize: MemoryLayout.stride(ofValue: audioBufferList),
        blockBufferAllocator: nil,
        blockBufferMemoryAllocator: nil,
        flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        blockBufferOut: &blockBuffer)

    guard let data = audioBufferList.mBuffers.mData else {
        return
    }

    // `data` now points at the raw audio bytes; process them here.
}
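
To make the last step concrete, this is roughly how I'd expect to read the samples once the guard succeeds; the function name and call site are mine, and it assumes the stream is interleaved 16-bit signed integer Linear PCM, which really ought to be checked via CMSampleBufferGetFormatDescription first:

import CoreMedia

// Sketch only: assumes interleaved 16-bit signed integer Linear PCM.
// Verify the format with CMSampleBufferGetFormatDescription /
// CMAudioFormatDescriptionGetStreamBasicDescription before trusting this.
func process(_ data: UnsafeMutableRawPointer, byteCount: Int) {
    let sampleCount = byteCount / MemoryLayout<Int16>.size
    let samples = data.bindMemory(to: Int16.self, capacity: sampleCount)
    for i in 0..<sampleCount {
        let amplitude = Float(samples[i]) / Float(Int16.max) // normalize to [-1, 1]
        _ = amplitude // ... do something with the sample ...
    }
}

// Call site, inside the delegate method after the guard:
// process(data, byteCount: Int(audioBufferList.mBuffers.mDataByteSize))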

What's the memory model (or pipeline) behind all this? I'd truly appreciate any help.

I learned a lot about CMBlockBuffer and CMSampleBuffer from this presentation: https://developer.apple.com/videos/play/wwdc2014/513/

See page 37 of the PDF.

A CMSampleBuffer contains the CMBlockBuffer plus metadata such as timing info and a format description. The CMBlockBuffer contains the actual data, which you can access with functions like CMBlockBufferGetDataPointer, CMBlockBufferGetDataLength, CMBlockBufferCopyDataBytes, etc. The reason for the "possibly noncontiguous memory" wording is that a single block buffer may be backed by several separate memory blocks (that's how the pipeline can hand data around without copying it); the block buffer presents them as one contiguous range of offsets, but you only get a single flat pointer for the parts that happen to be contiguous.
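
For example, here's a minimal sketch of pulling the raw bytes out of a sample buffer's block buffer and checking contiguity along the way (the function name is mine):

import CoreMedia

// Sketch: get a pointer into the block buffer that backs a sample buffer.
func dumpRawBytes(of sampleBuffer: CMSampleBuffer) {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return }

    var lengthAtOffset = 0
    var totalLength = 0
    var dataPointer: UnsafeMutablePointer<Int8>?

    let status = CMBlockBufferGetDataPointer(blockBuffer,
                                             atOffset: 0,
                                             lengthAtOffsetOut: &lengthAtOffset,
                                             totalLengthOut: &totalLength,
                                             dataPointerOut: &dataPointer)
    guard status == kCMBlockBufferNoErr, let bytes = dataPointer else { return }

    // `bytes` points at `lengthAtOffset` contiguous bytes starting at offset 0.
    // If lengthAtOffset < totalLength, the buffer is noncontiguous and the rest
    // of the data lives elsewhere; CMBlockBufferCopyDataBytes can flatten it
    // into a buffer you own.
    print("contiguous: \(lengthAtOffset) of \(totalLength) bytes at \(bytes)")
}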
