As a general rule it’s best to use the highest-level API that meets your needs. In this case Core Image is the highest-level API of the set you mentioned and, while I’m hardly a graphics expert, I believe it can meet your requirements.
Right, I'm just not entirely sure what that is in this particular case, which is processing (computing the mean, variance, histogram, etc.) perhaps thousands of images in certain situations. The closest Apple ever seems to come to guidance on this is "CIContext is expensive; if you have multiple images, reuse it instead of tearing it down and recreating it every time."
So I don't really know whether I need MPS to take better advantage of systems with lots of GPU resources, or whether CIContext/Core Image will do that just fine. I know each route fairly well -- Core Image in particular -- it's just that nearly all my experience is with the handful-of-images case, not the mass-production case.
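To make the "reuse the CIContext" advice concrete, here is a minimal sketch of the Core Image route for one of the statistics mentioned (the mean), using the built-in CIAreaAverage reduction filter. The function name and the idea of passing a single long-lived context in are my own framing, not anything from Apple's docs:

```swift
import CoreImage

// Create the context ONCE and reuse it for every image in the batch;
// creating a CIContext per image is the expensive anti-pattern.
let sharedContext = CIContext()

/// Reduces an image to its per-channel mean using CIAreaAverage,
/// then reads the 1x1 result back as normalized RGBA values.
func meanColor(of image: CIImage, using context: CIContext) -> [Float]? {
    let filter = CIFilter(name: "CIAreaAverage", parameters: [
        kCIInputImageKey: image,
        kCIInputExtentKey: CIVector(cgRect: image.extent)
    ])
    guard let output = filter?.outputImage else { return nil }

    // Render the single output pixel into a small RGBA8 buffer.
    var pixel = [UInt8](repeating: 0, count: 4)
    context.render(output,
                   toBitmap: &pixel,
                   rowBytes: 4,
                   bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                   format: .RGBA8,
                   colorSpace: CGColorSpace(name: CGColorSpace.sRGB))
    return pixel.map { Float($0) / 255.0 }
}
```

The same pattern (one context, many images) applies to the other reductions such as CIAreaHistogram; only the filter name and output readback change.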
I'm getting faster performance (MacBook Pro, macOS Monterey Beta 3), but it's hard to tell because it's all Swift symbols. It would have been really nice if Apple had provided source that actually matched what was shown in the video. Short of exporting all the symbol names to a text file via the SF Symbols app, then reading them in and building an array of symbols that way (yuck), there doesn't seem to be a way to easily enumerate them.
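For what it's worth, the "yuck" workaround above is workable. Here is a rough sketch, assuming you've exported the names from the SF Symbols app into a one-name-per-line text file (the file name, the loader function, and the grid view are all hypothetical, not anything Apple ships):

```swift
import SwiftUI

// Loads SF Symbol names from a plain text file exported from the
// SF Symbols app, one name per line, skipping blanks.
func loadSymbolNames(from url: URL) -> [String] {
    guard let text = try? String(contentsOf: url, encoding: .utf8) else {
        return []
    }
    return text
        .split(whereSeparator: \.isNewline)
        .map { $0.trimmingCharacters(in: .whitespaces) }
        .filter { !$0.isEmpty }
}

// Renders every loaded symbol in a lazy adaptive grid.
struct SymbolGrid: View {
    let names: [String]
    var body: some View {
        ScrollView {
            LazyVGrid(columns: [GridItem(.adaptive(minimum: 44))]) {
                ForEach(names, id: \.self) { name in
                    Image(systemName: name)
                }
            }
        }
    }
}
```

Bundle the exported file as a resource and pass `Bundle.main.url(forResource:withExtension:)` to the loader; it's ugly, but it beats hand-maintaining an array of thousands of names.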
I have no idea why everything feels so cagey when they talk about SwiftUI. There always seems to be something in the way.