How to set up an AVFoundation-based view to display Core Image filtered live video from a camera?

Hi. I watched the short "Optimize the Core Image pipeline for your video app" session several times, all excited, because for at least five years the Core Image code samples for macOS have been nonfunctional, and I have failed to make them work.

My application (for microscopy) needs to apply (mostly built-in) CI filters to live video coming from an external IIDC/DCam camera. My current code is AVFoundation based, but I have never managed to apply filters to the video, and as I said, the documentation and code samples have deteriorated to the point of being unusable.
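For reference, the per-frame filtering I have in mind is roughly this kind of thing (the filter choice and values here are only an example):

```swift
import CoreImage

// Example only: apply a built-in filter to a single frame.
func filteredFrame(from input: CIImage) -> CIImage {
    guard let sharpen = CIFilter(name: "CISharpenLuminance") else { return input }
    sharpen.setValue(input, forKey: kCIInputImageKey)
    sharpen.setValue(0.6, forKey: kCIInputSharpnessKey)
    return sharpen.outputImage ?? input
}
```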

This beautiful session shows how to create an AVPlayerView and set it up to display CI-filtered video, but only for an "asset" (meaning dead video from disk). How do I do it for live "preview" video?
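For comparison, the asset case from the session is (roughly, as I understand it) along these lines; the filter here is just a placeholder:

```swift
import AVKit
import CoreImage

// Sketch of the file-based asset case: filter frames via a video composition.
func makePlayerItem(for asset: AVAsset) -> AVPlayerItem {
    let item = AVPlayerItem(asset: asset)
    item.videoComposition = AVMutableVideoComposition(asset: asset) { request in
        let filter = CIFilter(name: "CIColorControls")!
        filter.setValue(request.sourceImage, forKey: kCIInputImageKey)
        filter.setValue(1.2, forKey: kCIInputContrastKey)
        if let output = filter.outputImage {
            request.finish(with: output, context: nil)
        } else {
            request.finish(with: NSError(domain: "Filtering", code: -1))
        }
    }
    return item
}

// playerView.player = AVPlayer(playerItem: makePlayerItem(for: asset))
```

But that only works for an AVPlayerItem, not for a live camera feed.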

Also, is it too much to ask for a simple, working, modern macOS sample of Core Image use?

Thanks!

(BTW, I would be glad to move to MTKView instead, had I known how to tie the camera input to it. Again, the docs are unusable, and no reasonable sample exists for macOS, not even for the simplest of tasks.)
Hi suMac,

I just uploaded a sample project that covers this use case: https://github.com/frankschlegel/core-image-by-example
It's still very new and I haven't tested it on macOS yet, but on iOS it's working so far.

Any feedback is welcome! 🙂
First off, many thanks!!! The sample clearly does what I need (but failed) to do.
The sample is completely modern (SwiftUI, Swift 5, macOS SDK 10.16, etc.), whereas my app's code is all Obj-C and must run on macOS versions earlier than Catalina, so it seems I will have lots of work "decompiling" all the modern stuff back to the basics I need to use.

I will be back with questions or "hooray!"-style reports soon.

BTW, if you use SwiftUI and programmed this on the Mac, how come you can't simply select the Mac target and test-run it? I'll do it myself of course (once I get to my office desktop running Catalina; my Mac is on 10.14 + Xcode 11.5, on which the sample can't be built).

The labs using my microscopy app have older equipment and must retain compatibility with old cameras, hardware connections, etc., so I can't always use the "latest and greatest".

Thanks again.
You are right. I decided to use the newest SDKs and SwiftUI in order to learn how to best integrate Core Image workflows with them.
And yes, it should work on a Mac, but the Mac needs to run Big Sur and I haven't tested that yet. I tested on my iPad with iOS 14 and will check macOS soon.

However, all the relevant APIs (especially MTKView) have been there for a while and should work the same way on older OS versions and in Objective-C. The important part is the setup of the MTKView and the draw method. If you follow this path, you should be good:

AVCaptureDeviceInput → AVCaptureVideoDataOutput → AVCaptureVideoDataOutputSampleBufferDelegate → CVPixelBuffer → CIImage → applying CIFilters → CIImage → render into MTKView using a CIContext.
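A minimal sketch of that chain (my own class names and a placeholder filter; error handling, camera permissions, and pixel-format configuration are left out) could look like this:

```swift
import AVFoundation
import CoreImage
import Metal
import MetalKit

// Rough sketch: camera frames → CIImage → CIFilter → MTKView via CIContext.
final class FilteredCameraRenderer: NSObject,
                                    AVCaptureVideoDataOutputSampleBufferDelegate,
                                    MTKViewDelegate {

    private let session = AVCaptureSession()
    private let device = MTLCreateSystemDefaultDevice()!
    private lazy var commandQueue = device.makeCommandQueue()!
    private lazy var ciContext = CIContext(mtlDevice: device)
    private var currentImage: CIImage?
    private weak var mtkView: MTKView?

    // MTKView setup: Core Image renders into the drawable; we trigger draws per frame.
    func attach(to view: MTKView) {
        view.device = device
        view.delegate = self
        view.framebufferOnly = false      // CIContext needs write access to the drawable
        view.isPaused = true              // we call draw() ourselves when a frame arrives
        view.enableSetNeedsDisplay = false
        mtkView = view

        // AVCaptureDeviceInput → AVCaptureVideoDataOutput → sample buffer delegate
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        session.addOutput(output)
        session.startRunning()
    }

    // CMSampleBuffer → CVPixelBuffer → CIImage → CIFilter(s)
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let image = CIImage(cvPixelBuffer: pixelBuffer)

        let filter = CIFilter(name: "CISepiaTone")!   // placeholder for the real filter chain
        filter.setValue(image, forKey: kCIInputImageKey)
        currentImage = filter.outputImage

        DispatchQueue.main.async { self.mtkView?.draw() }
    }

    // Render the filtered CIImage into the view's drawable using the CIContext.
    func draw(in view: MTKView) {
        guard let image = currentImage,
              let drawable = view.currentDrawable,
              let buffer = commandQueue.makeCommandBuffer() else { return }

        ciContext.render(image,
                         to: drawable.texture,
                         commandBuffer: buffer,
                         bounds: CGRect(origin: .zero, size: view.drawableSize),
                         colorSpace: CGColorSpaceCreateDeviceRGB())
        buffer.present(drawable)
        buffer.commit()
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}
}
```

In a real app you would also want to scale and center the filtered image to the drawable size before rendering, but the structure above is the important part.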

Looking forward to your report! 🙂