Comment on CoreMediaIO Camera Extension: custom properties?
Sorry for the delay. I'm back here because I just noticed that global IOSurfaceRefs no longer work under Ventura. My suggestion for a performance tip is to create your CVPixelBufferRef with a Metal texture backing (essentially, an IOSurface) and hope that the CMIO subsystem does the right thing and marshals the underlying IOSurface or texture across the process boundary (zero-copy, if you will) without the data ever leaving the GPU. If anyone from CoreMedia is reading this, please consider this for macOS 14.
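
In case it helps anyone trying the same approach, here is a minimal sketch of requesting IOSurface backing when the pixel buffer is created. The function name is just illustrative; the relevant part is passing kCVPixelBufferIOSurfacePropertiesKey (an empty dictionary) and kCVPixelBufferMetalCompatibilityKey in the attributes.

```swift
import CoreVideo

// Sketch: ask CoreVideo for an IOSurface-backed, Metal-compatible pixel buffer,
// so the CMIO subsystem has a chance to marshal the surface across the process
// boundary without copying the pixel data.
func makeIOSurfaceBackedPixelBuffer(width: Int, height: Int) -> CVPixelBuffer? {
    let attrs: [String: Any] = [
        // An empty dictionary here means "please back this buffer with an IOSurface".
        kCVPixelBufferIOSurfacePropertiesKey as String: [String: Any](),
        kCVPixelBufferMetalCompatibilityKey as String: true
    ]
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     width,
                                     height,
                                     kCVPixelFormatType_32BGRA,
                                     attrs as CFDictionary,
                                     &pixelBuffer)
    return status == kCVReturnSuccess ? pixelBuffer : nil
}
```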
Jan ’23
Comment on CoreMediaIO Camera Extension: custom properties?
The new Apple Silicon machines include hardware encoder(s) on the SoC for compressing H.264 frames, so another option is to compress the frame buffer to H.264 (a compressed sample buffer), wrap it in NSData for the custom property, and then decode it back to a CMSampleBuffer in the camera extension. Compressed H.264 frames are really small in terms of actual bytes, so they should skedaddle quite quickly over whatever that connection is (the XPC transport the system provides for custom properties).
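
For anyone curious what that looks like in practice, here is a minimal VideoToolbox sketch. The class and parameter choices are mine, and the serialization of the resulting CMSampleBuffer into NSData for the custom property (and the decode on the extension side) is left out.

```swift
import CoreMedia
import VideoToolbox

// Sketch: feed CVPixelBuffers to the hardware H.264 encoder and hand the
// compressed CMSampleBuffers to a completion handler.
final class FrameEncoder {
    private let session: VTCompressionSession

    init?(width: Int32, height: Int32) {
        var session: VTCompressionSession?
        let status = VTCompressionSessionCreate(allocator: kCFAllocatorDefault,
                                                width: width,
                                                height: height,
                                                codecType: kCMVideoCodecType_H264,
                                                encoderSpecification: nil,
                                                imageBufferAttributes: nil,
                                                compressedDataAllocator: nil,
                                                outputCallback: nil, // use the block-based encode call below
                                                refcon: nil,
                                                compressionSessionOut: &session)
        guard status == noErr, let created = session else { return nil }
        // Real-time mode keeps latency low on the hardware encoder.
        _ = VTSessionSetProperty(created, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
        self.session = created
    }

    func encode(_ pixelBuffer: CVPixelBuffer, pts: CMTime,
                handler: @escaping (CMSampleBuffer?) -> Void) {
        // The output handler receives the compressed H.264 sample buffer, which can
        // then be packaged (e.g. as NSData) for the custom property.
        VTCompressionSessionEncodeFrame(session,
                                        imageBuffer: pixelBuffer,
                                        presentationTimeStamp: pts,
                                        duration: .invalid,
                                        frameProperties: nil,
                                        infoFlagsOut: nil) { _, _, sampleBuffer in
            handler(sampleBuffer)
        }
    }
}
```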
Jan ’23
Comment on Discover Metal for immersive apps - Example Code
I discovered this thread after porting WarrenM's C++ sample myself. In your code you have declared your uniforms both in ShaderTypes.h and in Swift. I believe the correct approach is to have a single declaration in ShaderTypes.h and import it into Swift via the bridging header. Apparently the Swift compiler uses different padding and alignment rules for structs, and C/C++ have compiler options for altering struct padding and alignment, so layout mismatches like this are a common issue. Nevertheless, thanks for your sample.
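
To illustrate what I mean (the Uniforms struct and buffer index below are just placeholders for whatever your ShaderTypes.h actually declares): with a single C declaration imported through the bridging header, the Swift side and the Metal shader agree on layout, and MemoryLayout reports the C struct's size rather than whatever Swift would have chosen for a second, independently declared struct.

```swift
import Metal
import simd

// Assumes ShaderTypes.h (listed in the bridging header) declares something like:
//   typedef struct { matrix_float4x4 modelViewProjectionMatrix; } Uniforms;
// Swift then imports that exact C layout instead of defining its own struct.
func bindUniforms(_ encoder: MTLRenderCommandEncoder, mvp: matrix_float4x4) {
    var uniforms = Uniforms(modelViewProjectionMatrix: mvp)
    // MemoryLayout<Uniforms>.stride matches sizeof(Uniforms) on the C/Metal side,
    // so the bytes land where the shader expects them.
    encoder.setVertexBytes(&uniforms, length: MemoryLayout<Uniforms>.stride, index: 1)
}
```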
Dec ’23