Clarification on appropriate values for CMIOExtensionStreamProperties

My CMIO extension is mostly working; I'm trying to narrow down some edge cases that appear to be related to the buffering of CMSampleBuffers arriving via a sink stream and subsequently being vended to clients via a source stream. The sink stream may not be the issue; I cross-posted about the boilerplate CMIO extension having problems with multiple clients.

In particular, I have a case where two clients are running and the flow of sample buffers is smooth until I connect a third client (based loosely on RosyWriter, ported to Mac). I think this third client has a very different cadence of consuming the CMSampleBuffers because it uses AVAssetWriter to asynchronously write the buffers to disk. It seems that under this load the CVPixelBufferPool inside the extension runs out of CVPixelBuffers. For reference, my CFPlugIn DAL sample driver functions correctly (obviously not on Ventura) with this RosyWriter-esque client, so I suspect it's the buffering between the client and the extension.

I'm hoping for some hints about what kind of buffering sits between the client and the extension.

To that end, is there any clarification on:

  1. What values might be appropriate for CMIOExtensionPropertyStreamSinkBufferQueueSize? I have it randomly set to 3 in my complex extension.
  2. If the sink client disconnects, is it necessary to set CMIOExtensionPropertyStreamSinkEndOfData?
  3. Does the CMIO system read or interpret CMIOExtensionPropertyStreamSinkBufferUnderrunCount?

Is it even necessary to expose any of these properties to the CMIO subsystem? Or can I remove them and let CMIO infer default values?

Some os_log messages that seem to be related to presentation timestamps:

CMIO_Unit_Synchronizer_Video.cpp:1435:SyncUsingIntegralFrameTiming observing getting frames too quickly by a lot: 284681993 : 100000 < 284682999 : 100000 (-0.0101), prev time / dur: 284682180 : 100000 / 8189 : 100000 : dropping this frame

CMIO_Unit_Synchronizer_Video.cpp:1470:SyncUsingIntegralFrameTiming here's the normal bounding info: 284681993 : 100000 < 284686275 : 100000 (-0.0428), prev time / dur: 284682180 : 100000 / 8189 : 100000 : dropping this frame
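For what it's worth, the numbers in these messages appear to be rational timestamps (value : timescale), and the bracketed figure matches the difference of the two timestamps in seconds. A quick check of that reading in plain C (my interpretation of the log format, not anything documented):

```c
/* Convert a rational timestamp (value : timescale) to seconds. */
static double rational_seconds(long long value, long long timescale) {
    return (double)value / (double)timescale;
}

/* Difference, in seconds, between two timestamps sharing a timescale. */
static double delta_seconds(long long a, long long b, long long timescale) {
    return rational_seconds(a - b, timescale);
}
```

For the first message, (284681993 - 284682999) / 100000 is about -0.0101, and for the second, (284681993 - 284686275) / 100000 is about -0.0428, matching the bracketed values.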

Replies

os_log message after the sink stream was stopped and restarted, creating a gap in the CMSampleTimingInfo:

CMIO_Unit_Synchronizer_Video.cpp:1256:SyncUsingIntegralFrameTiming creating discontinuity because the timebase jumped

This would seem to imply that CMIO automagically adapts CMIOExtensionStreamDiscontinuityFlags if it detects that the flags passed in are incorrect (i.e. there is a discontinuity even if the flag passed was CMIOExtensionStreamDiscontinuityFlagNone - which is what my code is doing at present).
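The synchronizer's heuristic isn't documented, but the observed behavior is consistent with a simple check: if a buffer's presentation timestamp lands far from the previous timestamp plus the previous duration, treat it as a discontinuity regardless of the flags the extension passed. A minimal sketch of that idea in plain C (my assumption about the behavior, not the actual CMIO code; the type and function names are made up):

```c
#include <stdbool.h>

/* Rational timestamp, mirroring the "value : timescale" pairs in the logs.
 * The sketch assumes prev_pts and prev_duration share a timescale. */
typedef struct { long long value; long long timescale; } RationalTime;

/* Returns true if `pts` is more than `tolerance_secs` away from where the
 * previous frame (prev_pts + prev_duration) says the next frame should land. */
static bool looks_discontinuous(RationalTime prev_pts, RationalTime prev_duration,
                                RationalTime pts, double tolerance_secs) {
    double expected = (double)(prev_pts.value + prev_duration.value)
                      / (double)prev_pts.timescale;
    double actual = (double)pts.value / (double)pts.timescale;
    double gap = actual - expected;
    if (gap < 0) gap = -gap;
    return gap > tolerance_secs;
}
```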

I modified my code to update the CMSampleTimingInfo of the CMSampleBuffers as they pass through the extension as per this post:

https://developer.apple.com/forums/thread/725481

and I am no longer seeing this message:

CMIO_Unit_Synchronizer_Video.cpp:1435:SyncUsingIntegralFrameTiming observing getting frames too quickly by a lot

You might be sending or consuming sample buffers repeatedly.

In your device source (CMIOExtensionDeviceSource) implementation, you should have methods that start/stop the source stream and another pair that start/stop the sink stream.

@interface WhateverExtensionDeviceSource : NSObject<CMIOExtensionDeviceSource>
{
    ...
    CMSampleBufferRef _sampleBuffer;
}
...
- (void)startStreaming;
- (void)stopStreaming;
- (void)startStreamingSink:(CMIOExtensionClient *)client;
- (void)stopStreamingSink:(CMIOExtensionClient *)client;
@end

Notice the reference to a sample buffer (_sampleBuffer).

The sink stream's "start" method should keep a reference to the client (the application feeding samples to the extension via the queue):

@implementation WhateverExtensionStreamSink
...
- (BOOL)authorizedToStartStreamForClient:(CMIOExtensionClient *)client
{
    _client = client;
    return YES;
}
- (BOOL)startStreamAndReturnError:(NSError * _Nullable *)outError
{
    WhateverExtensionDeviceSource *deviceSource = (WhateverExtensionDeviceSource *)_device.source;
    [deviceSource startStreamingSink:_client];
    return YES;
}
@end

In that method you should have a block running repeatedly on a timer. Here, you consume a sample buffer from the sink stream (consumeSampleBufferFromClient), keep a reference to it (_sampleBuffer = CFRetain(sample_buffer)), and notify the stream that you got the sample buffer (notifyScheduledOutputChanged):

- (void)startStreamingSink:(CMIOExtensionClient *)client
{
    _streamingCounterSink++;
    _timerSink = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, DISPATCH_TIMER_STRICT, _timerQueueSink);
    dispatch_source_set_timer(_timerSink, DISPATCH_TIME_NOW, (uint64_t) (1e9 / (2 * kFrameRate)), 0);
    dispatch_source_set_event_handler(_timerSink, ^{
        [self->_streamSink.stream
            consumeSampleBufferFromClient:client
            completionHandler:^(
                CMSampleBufferRef sample_buffer,
                uint64_t sample_buffer_sequence_number,
                CMIOExtensionStreamDiscontinuityFlags discontinuity,
                BOOL has_more_sample_buffers,
                NSError *error
            ) {
                if (sample_buffer == nil)
                    return;
                if (self->_sampleBuffer == nil) {
                    self->_sampleBuffer = CFRetain(sample_buffer);
                }
                CMIOExtensionScheduledOutput *output = [
                    [CMIOExtensionScheduledOutput alloc]
                    initWithSequenceNumber:sample_buffer_sequence_number
                    hostTimeInNanoseconds:...
                ];
                [self->_streamSink.stream notifyScheduledOutputChanged:output];
            }
        ];
    });
    dispatch_source_set_cancel_handler(_timerSink, ^{
    });
    dispatch_resume(_timerSink);
}

You have a similar block on a timer in the source stream's "start" method. There, you wait for the sample buffer reference to change from nil to non-nil, pass it on to the source stream (sendSampleBuffer), and finally release _sampleBuffer and reset it to nil:

dispatch_source_set_event_handler(_timerSource, ^{
    if (self->_sampleBuffer != nil) {
        if (self->_streamingCounterSource > 0) {
            CMTime time = CMClockGetTime(CMClockGetHostTimeClock());
            uint64_t ns = (uint64_t)(CMTimeGetSeconds(time) * 1e9);
            CMSampleBufferRef sbuf = nil;
            // Re-stamp the buffer with the current host time; leave the
            // duration and decode timestamp unspecified.
            CMSampleTimingInfo timing_info = {
                .duration = kCMTimeInvalid,
                .presentationTimeStamp = time,
                .decodeTimeStamp = kCMTimeInvalid,
            };
            OSStatus err = CMSampleBufferCreateCopyWithNewTiming(
                kCFAllocatorDefault,
                self->_sampleBuffer,
                1,
                (const CMSampleTimingInfo []) {timing_info},
                &sbuf
            );
            if (err == noErr) {
                [self->_streamSource.stream
                    sendSampleBuffer:sbuf
                    discontinuity:CMIOExtensionStreamDiscontinuityFlagNone
                    hostTimeInNanoseconds:ns
                ];
                CFRelease(sbuf);
            }
        }
        CFRelease(self->_sampleBuffer);
        self->_sampleBuffer = nil;
    }
});

Notice:

  • _sampleBuffer is retained in the sink timer block and released in the source timer block;
  • the source and sink timers run at twice the frame rate, so you don't miss a frame.
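The twice-the-frame-rate figure is the usual polling rule of thumb: frames land in the single _sampleBuffer slot every T seconds, so a consumer polling every T/2 always sees a frame before the next one overwrites it, while polling slower than T drops frames. A toy C simulation of that claim (illustration only, not the extension code):

```c
/* Simulate a producer writing one frame every `period` ticks into a single
 * slot, and a consumer polling the slot every `poll_interval` ticks.
 * Returns how many of the `frames` produced the consumer picked up. */
static int frames_consumed(int period, int poll_interval, int frames) {
    int slot = -1;            /* -1 means empty, otherwise a frame id */
    int consumed = 0;
    int next_frame = 0;
    for (int t = 0; t < period * frames; t++) {
        if (t % period == 0 && next_frame < frames)
            slot = next_frame++;     /* producer overwrites the slot */
        if (t % poll_interval == 0 && slot >= 0) {
            consumed++;              /* consumer takes whatever is there */
            slot = -1;
        }
    }
    return consumed;
}
```

In this aligned toy model, polling at half the frame period consumes every frame, while polling at twice the period loses about half of them; in the real extension the two timers aren't phase-locked, which is exactly why the margin matters.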

Answering my own question, sorta kinda:

Deep in the heart of AVFoundation.framework in AVCaptureVideoDataOutput.h we find the following comment:

@method captureOutput:didOutputSampleBuffer:fromConnection:
@abstract
Called whenever an AVCaptureVideoDataOutput instance outputs a new video frame.
Note that to maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped.

It is my belief that Photo Booth (not unlike my own test client) is passing the underlying CVPixelBufferRef (obtained from CMSampleBufferGetImageBuffer) directly into a [CIImage imageWithCVPixelBuffer:] (or similar) and then passing that CIImage into a CIFilter graph for rendering to a CIContext. Or possibly doing this indirectly via an inputKey on QCRenderer.

If Photo Booth is using CMSampleBufferRefs in this way (i.e. not necessarily following the above advice) then I think it is safe to assume there may be a wide variety of other camera apps which also hold on to CMSampleBuffers obtained from a CMIO extension for arbitrary periods of time.

Contrary to the CMIO extension sample code and the advice in the AVCaptureVideoDataOutput.h header, it may be best to allocate your camera frame buffers individually rather than from a pool. Or, if your pool runs out of buffers, have a failsafe path that vends individually allocated buffers until the pool is safe for diving again.
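A generic sketch of that failsafe idea in plain C (a fixed free list standing in for a real CVPixelBufferPool; POOL_CAPACITY and BUFFER_BYTES are arbitrary):

```c
#include <stdlib.h>
#include <stdbool.h>

#define POOL_CAPACITY 4
#define BUFFER_BYTES  (64 * 64 * 4)   /* stand-in for a small pixel buffer */

typedef struct {
    void *free_list[POOL_CAPACITY];
    int   free_count;
} BufferPool;

static void pool_init(BufferPool *pool) {
    pool->free_count = 0;
    for (int i = 0; i < POOL_CAPACITY; i++)
        pool->free_list[pool->free_count++] = malloc(BUFFER_BYTES);
}

/* Acquire from the pool if it has buffers left; otherwise fall back to an
 * individual allocation so a slow client can't starve the camera.
 * `from_pool` reports which path was taken. */
static void *pool_acquire(BufferPool *pool, bool *from_pool) {
    if (pool->free_count > 0) {
        *from_pool = true;
        return pool->free_list[--pool->free_count];
    }
    *from_pool = false;
    return malloc(BUFFER_BYTES);   /* failsafe path */
}

/* Pooled buffers go back on the free list; one-off buffers are freed. */
static void pool_release(BufferPool *pool, void *buf, bool from_pool) {
    if (from_pool && pool->free_count < POOL_CAPACITY)
        pool->free_list[pool->free_count++] = buf;
    else
        free(buf);
}
```

With a real CVPixelBufferPool, a similar effect can be had by creating buffers via CVPixelBufferPoolCreatePixelBufferWithAuxAttributes with an allocation threshold (kCVPixelBufferPoolAllocationThresholdKey) and falling back to CVPixelBufferCreate when the threshold would be exceeded, but verify the details against the Core Video documentation.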