I have a customer with a 2010 Mac Pro running macOS 10.14.6 who is trying to use my application with RAW images from their Nikon camera (8256 x 5504).
The customer gets an image at the correct size, but with only one row of pixels; the rest is black. When I test the same images on a 2012 MBPr or a 2015 MacBook, they work fine.
The customer also tried an image that is 8166 x 5302, and that works for them. When my app does its processing, it logs as much information about the CIContext as possible, including [CIContext inputImageMaximumSize] and [CIContext outputImageMaximumSize], which both report 16384 x 16384, yet the context doesn't return the results I'd expect when an image exceeds 8192 x 8192 on this customer's machine.
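For reference, the size logging amounts to something like this (a simplified sketch, not the exact code; the real context is the one the app creates for its processing):

CIContext *context = [CIContext context];       // same kind of context the app uses
CGSize inMax = context.inputImageMaximumSize;   // instance properties of CIContext
CGSize outMax = context.outputImageMaximumSize;
NSLog(@"max sizes, in{%.0f x %.0f} out{%.0f x %.0f}",
      inMax.width, inMax.height, outMax.width, outMax.height);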
I am leaning towards the idea that macOS is returning incorrect information and that the graphics card in their Mac Pro doesn't actually support anything larger than 8192 x 8192.
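One thing I'm considering is having the customer run a small OpenGL check to see what the GPU driver itself reports as its maximum texture size. This is just an untested diagnostic sketch (QueryGLMaxTextureSize is not something my app does today):

#import <Foundation/Foundation.h>
#import <OpenGL/OpenGL.h>
#import <OpenGL/gl.h>

// Creates a throwaway accelerated GL context and asks the driver for its texture limit.
static GLint QueryGLMaxTextureSize(void)
{
    CGLPixelFormatAttribute attrs[] = { kCGLPFAAccelerated, (CGLPixelFormatAttribute)0 };
    CGLPixelFormatObj pixelFormat = NULL;
    GLint virtualScreenCount = 0;
    CGLChoosePixelFormat(attrs, &pixelFormat, &virtualScreenCount);

    CGLContextObj glContext = NULL;
    CGLCreateContext(pixelFormat, NULL, &glContext);
    CGLSetCurrentContext(glContext);

    GLint maxTextureSize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTextureSize);
    NSLog(@"GL_MAX_TEXTURE_SIZE: %d", maxTextureSize);

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(glContext);
    CGLDestroyPixelFormat(pixelFormat);
    return maxTextureSize;
}

If that comes back as 8192 on the Mac Pro, it would line up with the behavior I'm seeing.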
If it helps, here is the log from the context:
priority: default
workingSpace: (kCGColorSpaceICCBased; kCGColorSpaceModelRGB; Generic HDR Profile)
workingFormat: RGBAh
downsampleQuality: High
max sizes, in{16384 x 16384} out{16384 x 16384}
Any ideas or suggestions?
To create the context, it calls [CIContext context].
To produce the bitmap data, it calls [CIContext render:toBitmap:rowBytes:bounds:format:colorSpace:].
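For completeness, the relevant path boils down to something like this (a trimmed-down sketch, not the exact production code; the file path, the RGBA8 format, and the sRGB color space are just for illustration):

#import <CoreImage/CoreImage.h>

NSURL *rawURL = [NSURL fileURLWithPath:@"/path/to/image.NEF"];  // placeholder path
CIImage *image = [CIImage imageWithContentsOfURL:rawURL];       // RAW decoding details omitted
CIContext *context = [CIContext context];

CGRect bounds = image.extent;                                   // e.g. 8256 x 5504
size_t width = (size_t)CGRectGetWidth(bounds);
size_t height = (size_t)CGRectGetHeight(bounds);
size_t rowBytes = width * 4;                                    // 4 bytes per pixel for RGBA8
void *bitmap = calloc(height, rowBytes);

CGColorSpaceRef colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
[context render:image
       toBitmap:bitmap
       rowBytes:rowBytes
         bounds:bounds
         format:kCIFormatRGBA8
     colorSpace:colorSpace];
CGColorSpaceRelease(colorSpace);

// On the customer's Mac Pro only the first row of bitmap ends up with pixel data;
// on the 2012 MBPr and the 2015 MacBook the whole buffer is filled as expected.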