I have found that setting the contents property of a CALayer is an effective drawing technique, but when used with an IOSurfaceRef or CVPixelBufferRef it is necessary to double buffer the incoming surfaces: if you set CALayer.contents twice in a row with the same CVPixelBufferRef, the layer does not display the updated contents.
On earlier versions of macOS the color matrix attachments of the CVPixelBuffer seem to be ignored, so colors may be off, but on recent versions of macOS those attachments are applied when rendering CALayer.contents. That makes this a very powerful technique, and frankly it eliminates huge amounts of drawing code in Metal or OpenGL.
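As a concrete illustration of the double-buffering point, here is a minimal sketch. The class and ivar names are mine, and it assumes a layer-backed view and single-plane buffers of identical dimensions, format, and bytes-per-row:

```objc
// Hypothetical sketch: alternate between two CVPixelBufferRefs so the
// CALayer never sees the same buffer pointer twice in a row.
@interface FrameView : NSView {
    CVPixelBufferRef _buffers[2];   // allocated elsewhere; same size/format
    int _nextIndex;                 // which buffer receives the next frame
}
@end

@implementation FrameView

- (void)displayFrame:(CVPixelBufferRef)incoming {
    CVPixelBufferRef target = _buffers[_nextIndex];
    _nextIndex = 1 - _nextIndex;    // flip for the next frame

    // Copy the incoming pixels into the buffer not currently on screen.
    // (Planar formats would need per-plane copies; this assumes one plane
    // and identical bytes-per-row on both buffers.)
    CVPixelBufferLockBaseAddress(incoming, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferLockBaseAddress(target, 0);
    memcpy(CVPixelBufferGetBaseAddress(target),
           CVPixelBufferGetBaseAddress(incoming),
           CVPixelBufferGetDataSize(incoming));
    CVPixelBufferUnlockBaseAddress(target, 0);
    CVPixelBufferUnlockBaseAddress(incoming, kCVPixelBufferLock_ReadOnly);

    // Assigning a different buffer each frame forces the layer to update.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.layer.contents = (__bridge id)target;
    });
}

@end
```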
I just wanted to note that the CoreAudio team has done a superb job of dealing with these same basic problems. I develop both coreaudiod drivers and CoreMediaIO DAL drivers and note the following:
Under Apple Silicon (Big Sur, Monterey), CoreAudio loads code-signed Intel binary audio output drivers just fine; it appears to run them in a special Intel service helper under Rosetta 2. AVFoundation, by contrast, only vends CMIO DAL drivers if they contain the appropriate fat binary code, even in an app that has not adopted the hardened runtime. The CoreAudio support is seamless and invisible to the user, and ideally we'd have the same situation for CMIO sometime in the near future.
So QuickTime Player is unable to access even Apple Silicon–signed CMIO DAL drivers, while it exposes Intel versions of the analogous audio drivers. Currently AVFoundation loads CMIO DAL drivers into the process using the driver; I believe this code should run in a service process. QTKit used that technique and it was very fast, so it's not a performance issue AFAIK.
I believe that Apple, by virtue of the fact that Apple code signed libraries are approved by the Hardened runtime, could create a CMIO DAL driver that would seamlessly "remote" all the 3rd party (code signed) CMIO DAL drivers into an XPC process. I believe this technique was used to allow QTKit to vend 32-bit VDIG drivers to 64-bit applications.
I think it's a lot to expect that end users understand these types of issues: if they install an app that includes a virtual camera driver (or in the case of Blackmagic, a hardware device), I think it's clear their intent is to make use of those features.
ObjC is what I haz. I believe you must first expose the key via the availableProperties method on your CMIOExtensionStreamSource.
From the header:
typedef NSString *CMIOExtensionProperty NS_TYPED_ENUM API_AVAILABLE(macos(12.3));
From the sample/template code:
- (NSSet<CMIOExtensionProperty> *)availableProperties {
    // return [NSSet setWithObjects:CMIOExtensionPropertyStreamFrameDuration, nil];
    CMIOExtensionProperty myCustomPropertyKey = @"4cc_cust_glob_0000";
    return [NSSet setWithObjects:
            CMIOExtensionPropertyStreamActiveFormatIndex,
            myCustomPropertyKey,
            nil];
}
In the client application you use the CoreMedia C API to set/get the property value:
CMIOObjectPropertyAddress myCustomPropertyAddress = {
    .mSelector = FOUR_CHAR_CODE('cust'),
    .mScope    = kCMIOObjectPropertyScopeGlobal,
    .mElement  = kCMIOObjectPropertyElementMain };
// CMIOObjectHasProperty(object, &myCustomPropertyAddress);
// CMIOObjectGetPropertyData(object, &myCustomPropertyAddress, 0, NULL, propertyDataSize, &propertyDataUsed, propertyValue);
To see how to query the device list, open streams and retrieve property values, this sample may be useful:
https://github.com/bangnoise/cmiotest
When I attempt to access my custom property (on a stream), I get this error:
CMIO_DAL_CMIOExtension_Stream.mm:1165:GetPropertyData 50 wrong 4cc format for key 4cc_cust_glob_0000
CMIO_DAL_CMIOExtension_Stream.mm:1171:GetPropertyData unknown property error cust
CMIOHardware.cpp:328:CMIOObjectGetPropertyData Error: 2003332927, failed
This message is only triggered if I attempt to access the property from the client side.
However, the os_log output shows the correct value for the property.
Error code 2003332927 is FOUR_CHAR_CODE('who?') which maps to kCMIOHardwareUnknownPropertyError
I have determined experimentally that under Objective-C (at least), the format of the key for a custom property is as follows:
CMIOExtensionProperty myCustomPropertyKey = @"cust_glob_0000";
The 4cc_ prefix is either Swift-specific or superfluous. I'm on 12.3.
Still no data returned, the value I get is always 0.
I must recant my last comment. I was successful in retrieving custom properties on both Device and Stream using the following code:
const CMIOExtensionProperty CMIOExtensionPropertyCustomPropertyData_just = @"4cc_just_glob_0000";
const CMIOExtensionProperty CMIOExtensionPropertyCustomPropertyData_dust = @"4cc_dust_glob_0000";
const CMIOExtensionProperty CMIOExtensionPropertyCustomPropertyData_rust = @"4cc_rust_glob_0000";

+ (NSMutableDictionary *)sampleCustomPropertiesWithPropertySet:(NSSet<CMIOExtensionProperty> *)properties {
    NSMutableDictionary *dictionary = [NSMutableDictionary dictionary];
    if ([properties containsObject:CMIOExtensionPropertyCustomPropertyData_just]) {
        CMIOExtensionPropertyState *propertyState = [CMIOExtensionPropertyState propertyStateWithValue:@"Property value for 'just'"];
        [dictionary setValue:propertyState forKey:CMIOExtensionPropertyCustomPropertyData_just];
    }
    if ([properties containsObject:CMIOExtensionPropertyCustomPropertyData_dust]) {
        const size_t sData_length = 12;
        static const unsigned char sData[12] = { 0xFE,0xED,0xFE,0xED, 0xFE,0xED,0xFE,0xED, 0xFE,0xED,0xFE,0xED };
        CMIOExtensionPropertyState *propertyState = [CMIOExtensionPropertyState propertyStateWithValue:[NSData dataWithBytes:sData length:sData_length]];
        [dictionary setValue:propertyState forKey:CMIOExtensionPropertyCustomPropertyData_dust];
    }
    if ([properties containsObject:CMIOExtensionPropertyCustomPropertyData_rust]) {
        NSString *propertyValue = [NSString stringWithFormat:@"Custom property value for property '%@'.", CMIOExtensionPropertyCustomPropertyData_rust];
        CMIOExtensionPropertyState *propertyState = [CMIOExtensionPropertyState propertyStateWithValue:propertyValue];
        [dictionary setValue:propertyState forKey:CMIOExtensionPropertyCustomPropertyData_rust];
    }
    return dictionary;
}
Then for CMIOExtensionDeviceSource, I passed the custom property values as a dictionary to the initializer:
- (NSSet<CMIOExtensionProperty> *)availableProperties {
    return [NSSet setWithObjects:
            CMIOExtensionPropertyDeviceTransportType,
            CMIOExtensionPropertyDeviceModel,
            CMIOExtensionPropertyCustomPropertyData_just,
            CMIOExtensionPropertyCustomPropertyData_dust,
            CMIOExtensionPropertyCustomPropertyData_rust,
            nil];
}
- (nullable CMIOExtensionDeviceProperties *)devicePropertiesForProperties:(NSSet<CMIOExtensionProperty> *)properties
                                                                    error:(NSError * _Nullable *)outError {
    NSMutableDictionary *dictionary = [self.class sampleCustomPropertiesWithPropertySet:properties];
    CMIOExtensionDeviceProperties *deviceProperties = [CMIOExtensionDeviceProperties devicePropertiesWithDictionary:dictionary];
    // ... populate the standard device properties, then:
    return deviceProperties;
}
And for CMIOExtensionStreamSource:
- (NSSet<CMIOExtensionProperty> *)availableProperties {
    return [NSSet setWithObjects:
            CMIOExtensionPropertyStreamActiveFormatIndex,
            CMIOExtensionPropertyStreamFrameDuration,
            CMIOExtensionPropertyCustomPropertyData_just,
            CMIOExtensionPropertyCustomPropertyData_dust,
            CMIOExtensionPropertyCustomPropertyData_rust,
            nil];
}
- (nullable CMIOExtensionStreamProperties *)streamPropertiesForProperties:(NSSet<CMIOExtensionProperty> *)properties
                                                                    error:(NSError * _Nullable *)outError {
    NSMutableDictionary *dictionary = [self.class sampleCustomPropertiesWithPropertySet:properties];
    CMIOExtensionStreamProperties *streamProperties = [CMIOExtensionStreamProperties streamPropertiesWithDictionary:dictionary];
    // ... populate the standard stream properties, then:
    return streamProperties;
}
The error message below is not related to the actual format of the key; it concerns the format of the propertyState (i.e. the data associated with the key).
CMIO_DAL_CMIOExtension_Stream.mm:1165:GetPropertyData 50 wrong 4cc format for key 4cc_cust_glob_0000
I haven't tried on the Ventura beta but on 12.4 I am in the practice of rebooting after every update to my extension. If you're having problems it might be due to the fact that an old instance of your extension is being used by the system.
You can keep a counter of the client connection and disconnection events.
I haven't specifically tried it, but you should be able to serialize any NSCoding object into your NSData and then deserialize it on the client side.
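As a sketch of that NSCoding round trip (assuming a deployment target of macOS 10.13 or later for the secure-coding APIs; the payload keys are invented):

```objc
// Serialize an NSCoding-conforming object graph into NSData on one side
// of the custom property connection, and deserialize it on the other.
NSDictionary *payload = @{ @"gain" : @0.75, @"label" : @"camera A" };
NSError *error = nil;
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:payload
                                     requiringSecureCoding:YES
                                                     error:&error];

// ... pass `data` through the custom property ...

NSDictionary *decoded =
    [NSKeyedUnarchiver unarchivedObjectOfClasses:
         [NSSet setWithObjects:[NSDictionary class],
                               [NSString class],
                               [NSNumber class], nil]
                                        fromData:data
                                           error:&error];
```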
Thanks for this tip - after reading this I looked at my code and I was propagating the sampleTimingInfo that originated in the client app all the way through because, well, that seemed like the smart thing to do. I, too, sometimes see strange behaviour and I'm hoping this will make it more robust.
os_log message after sink stream stopped and started creating a gap in the CMSampleTimingInfo:
CMIO_Unit_Synchronizer_Video.cpp:1256:SyncUsingIntegralFrameTiming creating discontinuity because the timebase jumped
This would seem to imply that CMIO automagically adapts CMIOExtensionStreamDiscontinuityFlags if it detects that the flags passed in are incorrect (i.e. there is a discontinuity even if the flag passed was CMIOExtensionStreamDiscontinuityFlagNone - which is what my code is doing at present).
I modified my code to update the CMSampleTimingInfo of the CMSampleBuffers as they pass through the extension as per this post:
https://developer.apple.com/forums/thread/725481
and no longer seeing this message:
CMIO_Unit_Synchronizer_Video.cpp:1435:SyncUsingIntegralFrameTiming observing getting frames too quickly by a lot
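For reference, the retiming step can be sketched like this. This is my guess at the approach from the linked thread, simply re-stamping the presentation time from the host clock as each buffer passes through:

```objc
// Hypothetical sketch: copy the sample buffer with fresh timing so the
// downstream synchronizer sees a monotonic host-clock timeline.
CMSampleBufferRef RetimedCopy(CMSampleBufferRef inBuffer) {
    CMSampleTimingInfo timing = {
        .duration              = CMSampleBufferGetDuration(inBuffer),
        .presentationTimeStamp = CMClockGetTime(CMClockGetHostTimeClock()),
        .decodeTimeStamp       = kCMTimeInvalid,
    };
    CMSampleBufferRef outBuffer = NULL;
    OSStatus status = CMSampleBufferCreateCopyWithNewTiming(
        kCFAllocatorDefault, inBuffer, 1, &timing, &outBuffer);
    return (status == noErr) ? outBuffer : NULL;  // caller releases
}
```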
Getting this os_log message sporadically while running QuickTime Player only (record ready mode) and switching back and forth between two different CMIO Extensions.
CMIO_Unit_Input_HAL.cpp:1724:DoInputRenderCallback Dropping 512 frames, because we're out of buffers
Annotating this thread subsequent to the transition to Apple Silicon, which is basically complete at the time of this writing. I think the methodology proposed at the top of this discussion is a workable and effective strategy for dealing with this problem, which is going to become more and more pervasive. Many new SDKs from Apple will be thread-safe and capable of generating KVO notifications on any thread or queue. However, I think it unlikely that AppKit and UIKit will ever be thread-safe. And there's the challenge of supporting the widest array of Mac hardware.
The Objective-C runtime already has solutions for this problem which date back to a technology called Distributed Objects. Not much has been said about this tech for a long while because security concerns promoted XPC to the foreground but XPC doesn't really provide the same functionality.
The point here is that the NSProxy NSInvocation classes and patterns can be used to "remote" almost any Objective-C method call, including over thread boundaries, process boundaries and to remote machines on the LAN or interweb. Check out NSDistantObject for some perspective.
You can build a controller layer whose sole purpose is to proxy all KVO notifications onto the main thread to AppKit.
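A minimal sketch of such a controller might look like this (the class name is mine; production code must also call removeObserver: in dealloc and retain the target/key path to do so):

```objc
// Hypothetical relay: receives KVO notifications on whatever thread the
// model emits them, and republishes the new value on the main thread,
// where AppKit bindings can safely consume it.
@interface MainThreadKVORelay : NSObject
@property (nonatomic, strong) id latestValue;  // bind UI to this key path
@end

@implementation MainThreadKVORelay

- (instancetype)initWithTarget:(id)target keyPath:(NSString *)keyPath {
    if ((self = [super init])) {
        [target addObserver:self
                 forKeyPath:keyPath
                    options:NSKeyValueObservingOptionNew
                    context:NULL];
    }
    return self;
}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context {
    id newValue = change[NSKeyValueChangeNewKey];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.latestValue = newValue;  // KVO fires again, now on main thread
    });
}

@end
```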
Take this sample code from the archive for example:
https://developer.apple.com/library/archive/samplecode/AVRecorder/Introduction/Intro.html#//apple_ref/doc/uid/DTS40011004
I have refactored and referred to this sample several times; it's an excellent sample. But as of 2023, the KVO bindings from the UI no longer work properly: exceptions are thrown and KVO notifications are lost, leaving the UI in indeterminate states. Maybe these are bugs in AppKit that will be remedied sometime in the future.
However, I was easily able to solve this sample's problems by building a controller layer between AppKit and AVCaptureDevice et al. This was before I found NSInvocation, so I am basically dispatching to the main thread. My solution is a simple proxy object that forwards all the valueForKeyPath-type methods to the target objects (I have one controller bound to all the various AVCaptureDevice key paths). It's a very simple class and has restored this sample code to its original lustre and glory. But it could be even simpler next time I revisit the code.
For my next Cocoa nightmare I dove deeper into NSInvocation and learned that you can completely remote an entire class with just four Objective-C methods. Check out the docs for methodSignatureForSelector: and go down the rabbit hole:
from NSInvocation and the NSObject forwarding machinery:
+ (NSInvocation *)invocationWithMethodSignature:(NSMethodSignature *)sig;
- (void)invokeWithTarget:(id)target;
- (void)forwardInvocation:(NSInvocation *)invocation;
- (id)forwardingTargetForSelector:(SEL)methodSelector;
You'll get warnings from a modern Objective-C compiler so declare the exposed keypaths/properties as usual and mark them as @dynamic so the compiler doesn't synthesize methods for them. Once you get to googling NSInvocation and any of the four methods listed above, I think you'll find much has been written on this subject going back to Panther and even OpenStep.
https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/DistrObjects/DistrObjects.html
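To make the forwarding pattern above concrete, here is a minimal sketch (the class name Trampoline is mine; a real remoting layer needs more care around retained arguments and return values). Any message the proxy does not implement is packaged into an NSInvocation and replayed on the target, here hopping to the main thread:

```objc
// Hypothetical forwarding proxy built on the NSObject message-forwarding
// machinery: unrecognized selectors are captured as NSInvocations and
// replayed on the wrapped target on the main thread.
@interface Trampoline : NSObject
@property (nonatomic, strong) id target;
@end

@implementation Trampoline

// Asked for a signature we don't have? Borrow the target's.
- (NSMethodSignature *)methodSignatureForSelector:(SEL)sel {
    NSMethodSignature *sig = [super methodSignatureForSelector:sel];
    return sig ?: [self.target methodSignatureForSelector:sel];
}

// Replay the captured call on the real object, on the main thread.
- (void)forwardInvocation:(NSInvocation *)invocation {
    id target = self.target;
    if ([NSThread isMainThread]) {
        [invocation invokeWithTarget:target];
    } else {
        dispatch_sync(dispatch_get_main_queue(), ^{
            [invocation invokeWithTarget:target];
        });
    }
}

@end
```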
I am quite familiar with the older DAL architecture, having built a large number of camera plug-ins dating back to 2011. And before that also VDIG components which were the predecessor to DAL. I am not aware of any API in the C-language CoreMediaIO framework that restricts scalar property data to a specific value range. It's possible that this is part of some kind of validation layer new to CMIO Extensions aimed at preventing invalid values from being sent to your custom properties.
The available property get/set functions (such as you're using) are declared in CoreMediaIO.framework's CMIOHardwareObject.h.
I believe these are the only functions for accessing properties from the client side. Try sending an out-of-range value to your property declared with max and min and see if something pops up in the os_log.
If you have a more complex property data type (such as your ranged values), my suggestion is to serialize it into a NSDictionary and then serialize the NSDictionary into an NSData object. Pass the NSData over the custom property connection. You could also use NSKeyedArchiver to flatten (into NSData) an arbitrary NSObject-derived class conforming to NSCoding. Or JSON data serialized in an NSString.
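For instance, the NSDictionary route might look like this (a sketch; the key names are invented, and NSJSONSerialization would work equally well if you prefer a string payload):

```objc
// Flatten a ranged value into a property-list dictionary, then into
// NSData suitable for passing over a custom property.
NSDictionary *ranged = @{ @"value" : @50, @"min" : @0, @"max" : @100 };
NSError *error = nil;
NSData *payload =
    [NSPropertyListSerialization dataWithPropertyList:ranged
                                               format:NSPropertyListBinaryFormat_v1_0
                                              options:0
                                                error:&error];

// Client side: reverse the transformation.
NSDictionary *decoded =
    [NSPropertyListSerialization propertyListWithData:payload
                                              options:NSPropertyListImmutable
                                               format:NULL
                                                error:&error];
```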
In my testing, older DAL plug-ins are already non-functional on macOS Ventura (the WWDC presentation by Brad Ford says Ventura will be the last OS to support DAL). But it can be helpful to have a plug-in built with the legacy DAL system so you can see exactly how the property data is communicated in the old system before trying to migrate to the new extension system. Monterey appears to be the last version of macOS to fully support DAL CFPlugIn components and, as such, is probably the preferred OS for developing CMIO solutions.
I would recommend against using this sample code (below) for production because there are some serious race conditions in it (i.e. multiple AVCaptureSession instances access the 'object property store' without any locks or queues). But for getting the gist of property data flow, it will give you both sides of the equation within a single process that Xcode can easily debug:
https://github.com/johnboiles/coremediaio-dal-minimal-example
Once you have it completely grokked, then migrate to the new CMIO Extension property system.
There's a Swift adaptation of johnboiles sample code that's even more heinous because it pulls a Swift runtime into the host process - and that's an exciting party if the host was built with a different version of Swift. But if you're just using it for scaffolding, it may serve your needs.
Answering my own question, sorta kinda:
Deep in the heart of AVFoundation.framework in AVCaptureVideoDataOutput.h we find the following comment:
@method captureOutput:didOutputSampleBuffer:fromConnection:
@abstract
Called whenever an AVCaptureVideoDataOutput instance outputs a new video frame.
Note that to maintain optimal performance, some sample buffers directly reference pools of memory that may need to be reused by the device system and other capture inputs. This is frequently the case for uncompressed device native capture where memory blocks are copied as little as possible. If multiple sample buffers reference such pools of memory for too long, inputs will no longer be able to copy new samples into memory and those samples will be dropped.
It is my belief that Photo Booth (not unlike my own test client) is passing the underlying CVPixelBufferRef (obtained from CMSampleBufferGetImageBuffer) directly into a [CIImage imageWithCVPixelBuffer:] (or similar) and then passing that CIImage into a CIFilter graph for rendering to a CIContext. Or possibly doing this indirectly via an inputKey on QCRenderer.
If Photo Booth is using CMSampleBufferRefs in this way (i.e. not necessarily following the above advice) then I think it is safe to assume there may be a wide variety of other camera apps which also hold on to CMSampleBuffers obtained from a CMIO extension for arbitrary periods of time.
Contrary to the CMIO extension sample code and the advice in the AVCaptureVideoDataOutput.h header, it may be best to allocate your camera frame buffers individually rather than from a pool. Or if your pool runs out of buffers, have a failsafe path that vends individual buffers until the pool is safe for diving.
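A failsafe along those lines might be sketched as follows (my own function; assumes BGRA frames, ignores attachment propagation, and uses a hypothetical threshold of 8 in-flight buffers so the pool allocation can actually fail rather than grow without bound):

```objc
// Vend frames from the CVPixelBufferPool while it has capacity; if a
// client is hoarding buffers and the pool runs dry, fall back to a
// one-off allocation so the stream keeps flowing instead of dropping.
CVPixelBufferRef NextFrameBuffer(CVPixelBufferPoolRef pool,
                                 size_t width, size_t height) {
    // Cap the pool; beyond the threshold the call fails with
    // kCVReturnWouldExceedAllocationThreshold instead of growing.
    NSDictionary *aux = @{ (id)kCVPixelBufferPoolAllocationThresholdKey : @8 };
    CVPixelBufferRef buffer = NULL;
    CVReturn err = CVPixelBufferPoolCreatePixelBufferWithAuxAttributes(
        kCFAllocatorDefault, pool, (__bridge CFDictionaryRef)aux, &buffer);
    if (err == kCVReturnSuccess && buffer != NULL) {
        return buffer;  // normal, recycled path
    }
    // Failsafe path: individually allocated buffer, released by the caller.
    NSDictionary *attrs = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
    err = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA,
                              (__bridge CFDictionaryRef)attrs, &buffer);
    return (err == kCVReturnSuccess) ? buffer : NULL;
}
```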