I'm playing with CIImage and CIFilter to create some custom image processing. I'm on macOS 10.13.6 and using Xcode 10.1.
I can't work out this problem... the CIAreaMinimum and CIAreaMaximum filters (which return a single-pixel image containing the min/max color of a CIImage) don't always produce sensible output.
It seems that for any PNG image with transparency (first read into a CGImage, then used to create a CIImage), the max is always black and the min is always white. For any other image (including other PNGs), the filters produce the expected results.
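For reference, here's roughly what I'm doing (file path and variable names are just illustrative):

```swift
import CoreImage
import ImageIO

// Load the PNG into a CGImage first, then wrap it in a CIImage.
let url = URL(fileURLWithPath: "/path/to/image.png") // hypothetical path
guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
      let cgImage = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
    fatalError("couldn't load image")
}
let ciImage = CIImage(cgImage: cgImage)

// Ask for the maximum color over the whole image extent.
let filter = CIFilter(name: "CIAreaMaximum",
                      parameters: [kCIInputImageKey: ciImage,
                                   kCIInputExtentKey: CIVector(cgRect: ciImage.extent)])

// 1×1 output image; for transparent PNGs this comes out black.
let onePixel = filter?.outputImage
```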
If I can't get this working, an alternative is for me to write a custom CIFilter (since I need some other custom stuff anyway), in which case how do I, in the first instance, go about making my own max/min filter?
I'm guessing I need a way to store the running min/max values, then for each pixel sample compare the min/max against the pixel value and update them when appropriate. But how do I create such a persistent value and pass it in and out of a custom CIFilter using the Core Image Kernel Language (not Metal) and Swift?
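Since kernels seem to be purely per-pixel, my rough (untested) idea is to fake the "persistent value" with multiple passes: each pass halves the image, taking the min over each 2×2 block, until a single pixel remains. Something like this sketch (kernel and wrapper names are mine, not from any API) — is this the right track?

```swift
import CoreImage

// Hypothetical CIKL kernel: each output pixel is the component-wise
// minimum of a 2x2 block of the source image.
let minKernelSource = """
kernel vec4 areaMin2x2(sampler image)
{
    vec2 s = 2.0 * floor(destCoord());  // top-left of the 2x2 source block
    vec4 p0 = sample(image, samplerTransform(image, s + vec2(0.5, 0.5)));
    vec4 p1 = sample(image, samplerTransform(image, s + vec2(1.5, 0.5)));
    vec4 p2 = sample(image, samplerTransform(image, s + vec2(0.5, 1.5)));
    vec4 p3 = sample(image, samplerTransform(image, s + vec2(1.5, 1.5)));
    return min(min(p0, p1), min(p2, p3));
}
"""

func areaMinimum(of input: CIImage) -> CIImage? {
    guard let kernel = CIKernel(source: minKernelSource) else { return nil }
    var image = input
    // Halve the image each pass until a single pixel remains.
    while image.extent.width > 1 || image.extent.height > 1 {
        let extent = CGRect(x: 0, y: 0,
                            width: max(1, (image.extent.width / 2).rounded(.up)),
                            height: max(1, (image.extent.height / 2).rounded(.up)))
        guard let next = kernel.apply(
            extent: extent,
            roiCallback: { _, rect in
                // Each output pixel reads a 2x2 block of the input.
                CGRect(x: rect.origin.x * 2, y: rect.origin.y * 2,
                       width: rect.width * 2, height: rect.height * 2)
            },
            arguments: [image]) else { return nil }
        image = next
    }
    return image
}
```

(And presumably the same with `max` for the maximum.) I'm not sure this is how people normally do reductions in CIKL, or whether the coordinate mapping above is even correct, so pointers welcome.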