Hi. I watched the short "Optimize the Core Image pipeline for your video app" session several times, all excited, because the Core Image code samples for macOS have been nonfunctional for at least five years, and I have still failed to make them work.
My application (for microscopy) needs to apply (mostly built-in) Core Image filters to live video coming from an external IIDC/DCam camera. My current code is AVFoundation-based, but I have never managed to apply filters to the video, and as I said, the documentation and code samples have deteriorated to the point of being unusable.
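For context, the pipeline I imagine is something like the sketch below. This is my best guess, not working code from my project: it assumes the camera shows up as an ordinary AVCaptureDevice (an external device may need a DiscoverySession), and uses a sepia filter as a stand-in for my real filters.

```swift
import AVFoundation
import CoreImage
import CoreImage.CIFilterBuiltins

// Sketch of a capture pipeline: frames arrive as CVPixelBuffers in the
// sample-buffer delegate, where a built-in CIFilter is applied per frame.
final class FilteredCaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let sampleQueue = DispatchQueue(label: "camera.sample.queue")
    private let sepia = CIFilter.sepiaTone()        // placeholder for the real filters

    // Called with each filtered frame; displaying it is the open question.
    var frameHandler: ((CIImage) -> Void)?

    func start() throws {
        // Assumes the IIDC/DCam camera is visible as a regular video device.
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        videoOutput.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
        session.commitConfiguration()
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        sepia.inputImage = CIImage(cvPixelBuffer: pixelBuffer)
        sepia.intensity = 0.8
        if let filtered = sepia.outputImage {
            frameHandler?(filtered)
        }
    }
}
```

The part I cannot find documented is what to do with that filtered CIImage to actually show it as a live preview.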
This beautiful session shows how to create an AVPlayerView and set it up to display CI-filtered video, but only for an "asset" (meaning dead video from disk). How do I do the same for live "preview" video?
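As far as I understand it, the asset-based setup the session demonstrates boils down to something like this rough sketch (the blur filter and the file URL are just placeholders):

```swift
import AVFoundation
import CoreImage.CIFilterBuiltins

// Rough sketch of the asset-based approach: a video composition applies a
// Core Image filter to every frame of a movie file, and an AVPlayerView
// simply plays the resulting player item.
func makeFilteredPlayerItem(for url: URL) -> AVPlayerItem {
    let asset = AVURLAsset(url: url)
    let blur = CIFilter.gaussianBlur()
    let composition = AVMutableVideoComposition(asset: asset) { request in
        blur.inputImage = request.sourceImage.clampedToExtent()
        blur.radius = 5
        let output = (blur.outputImage ?? request.sourceImage)
            .cropped(to: request.sourceImage.extent)
        request.finish(with: output, context: nil)
    }
    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    return item   // then: playerView.player = AVPlayer(playerItem: item)
}
```

But there is no equivalent of a video composition hook for the live capture preview, which is exactly where I am stuck.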
Also, is it too much to ask for a simple, working, modern macOS sample of Core Image use?
Thanks!
(BTW, I would be glad to move to MTKView instead, had I known how to tie the camera input to it. Again, the docs are unusable, and no reasonable sample exists for macOS, not even for the simplest of tasks.)
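If MTKView is the intended route, I assume the drawing side would look roughly like the sketch below (again my guess: the capture delegate from the first sketch hands each filtered CIImage to this renderer, and scaling is ignored):

```swift
import Metal
import MetalKit
import CoreImage

// Sketch of a CIImage renderer for an MTKView: the capture delegate updates
// `image` with each new filtered frame, and the view draws it.
final class CIRenderer: NSObject, MTKViewDelegate {
    private let commandQueue: MTLCommandQueue
    private let ciContext: CIContext
    var image: CIImage?

    init(view: MTKView) {
        let device = MTLCreateSystemDefaultDevice()!
        commandQueue = device.makeCommandQueue()!
        ciContext = CIContext(mtlDevice: device)
        super.init()
        view.device = device
        view.framebufferOnly = false   // Core Image needs to write into the drawable texture
        view.delegate = self
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let image = image,
              let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }
        // Scaling/letterboxing omitted; the frame is rendered at the view's origin.
        ciContext.render(image,
                         to: drawable.texture,
                         commandBuffer: commandBuffer,
                         bounds: CGRect(origin: .zero, size: view.drawableSize),
                         colorSpace: CGColorSpaceCreateDeviceRGB())
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
```

If that is the intended pattern, a confirmation (or a pointer to an official macOS sample) would already help a lot.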