CIContext reports that it supports images in and out up to 16384 x 16384, but I don't think that it does.

I have a customer with a 2010 Mac Pro running macOS 10.14.6 who is trying to use my application with RAW images from their Nikon camera (8256 x 5504).


The customer gets an image at the correct size, but with only one row of pixels; the rest is black. When I test the same images on a 2012 MacBook Pro with Retina display or a 2015 MacBook, everything works fine.


The customer also tried an image that is 8166 x 5302, and that works for them. When my app does its processing, it logs as much information about the CIContext as possible, including [CIContext inputImageMaximumSize] and [CIContext outputImageMaximumSize], which both report 16384 x 16384, yet the context doesn't return the results I'd expect when the images exceed 8192 x 8192 on this customer's machine.


I am leaning towards the idea that macOS is returning incorrect information and that the graphics card in their Mac Pro doesn't actually support anything larger than 8192 x 8192.


If it helps, here is the log from the context.

              priority: default
              workingSpace:  (kCGColorSpaceICCBased; kCGColorSpaceModelRGB; Generic HDR Profile)
              workingFormat: RGBAh
              downsampleQuality: High max sizes, in{16384 x 16384} out{16384 x 16384}


Any ideas or suggestions?


To create the context it calls [CIContext context].

To create the data it calls [CIContext render:toBitmap:rowBytes:bounds:format:colorSpace:].
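In outline, the two calls amount to something like this (a simplified sketch, not my exact code; variable names are illustrative and error handling is omitted):

```
#import <CoreImage/CoreImage.h>

// Sketch of the context creation and pixel readback described above.
// Assumes `ciImage` is the CIImage to be rendered.
CIContext *context = [CIContext contextWithOptions:nil];

CGRect bounds   = ciImage.extent;
size_t width    = (size_t)CGRectGetWidth(bounds);
size_t height   = (size_t)CGRectGetHeight(bounds);
size_t rowBytes = width * 4;   // 4 bytes per pixel for RGBA8

void *bitmap = calloc(height, rowBytes);
CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();  // illustrative choice

[context render:ciImage
       toBitmap:bitmap
       rowBytes:rowBytes
         bounds:bounds
         format:kCIFormatRGBA8
     colorSpace:cs];

CGColorSpaceRelease(cs);
// ... use bitmap, then free(bitmap)
```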

Replies

Stupid forum software cut off the first line from the log report.


<CIContext: 0x600003c57100 (metal-DG 738) MTLDevice=0x10d019000>

priority: default

workingSpace: <CGColorSpace 0x600001a44720> (kCGColorSpaceICCBased; kCGColorSpaceModelRGB; Generic HDR Profile)

workingFormat: RGBAh

downsampleQuality: High max sizes, in{16384 x 16384} out{16384 x 16384}

Hi Sam,


I can only guess, but is this maybe an issue that is caused by tiling?

If an image exceeds a certain size, Core Image will process the image in tiles and stitch them back together in the end. Maybe some filter in the chain doesn't handle this case very well (e.g., by supplying the wrong region of interest)?


Have you tried processing a very large (close to 16384 pixels) image on your machine? That should trigger the tiling even on newer hardware.

Hi Frank,

Your point on tiling might well be spot on. While I am not doing any tiling myself for this test, I wonder if Core Image's own tiling system is incompatible with the 2010 Mac Pro. I have no idea at what resolution Core Image automatically tiles, or what size the tiles are. I wonder if manually tiling the image via a CIImageAccumulator (like I did for my complex filters) would help.


I am not doing anything fancy with Core Image for this test: simply creating a CIImage from a CGImage, applying the CILinearToSRGBToneCurve filter, and then getting the pixel data back via [CIContext render:toBitmap:rowBytes:bounds:format:colorSpace:]. I have also tried creating a CIContext from a CGBitmapContext and drawing the CIImage into it (so I can get at the raw pixels), but that yields the same result: one row of pixels and the rest black.
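For reference, the filter step is nothing more than this (a sketch; `cgImage` is assumed to exist):

```
#import <CoreImage/CoreImage.h>

// Sketch: CGImage -> CIImage -> built-in linear-to-sRGB tone curve.
CIImage *input = [CIImage imageWithCGImage:cgImage];

CIFilter *toneCurve = [CIFilter filterWithName:@"CILinearToSRGBToneCurve"];
[toneCurve setValue:input forKey:kCIInputImageKey];
CIImage *output = [toneCurve valueForKey:kCIOutputImageKey];

// `output` is then read back via
// render:toBitmap:rowBytes:bounds:format:colorSpace:
```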


I am amazed that this customer has the patience to persist with me while I have been trying a great many things, with no luck in solving it for them (it doesn't help that I am not able to reproduce it here).


These kinds of issues with Core Image are simply driving me nuts.

The strangest thing is that one row of pixels is there...


Are you sure rowBytes and the other properties are computed correctly?


Also, if you just want to achieve linear -> sRGB, you don't need any CIFilter. Just use the sRGB color space when rendering the image. Core Image will perform the conversion automatically.
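Concretely, that would mean skipping the filter entirely and passing an sRGB color space to the render call instead (a sketch; assumes the source CIImage is tagged with a linear color space):

```
// Sketch: no CIFilter in the chain; Core Image converts to the
// destination color space during rendering.
CGColorSpaceRef srgb = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);

[context render:linearImage        // the untouched CIImage
       toBitmap:bitmap
       rowBytes:rowBytes
         bounds:linearImage.extent
         format:kCIFormatRGBA8
     colorSpace:srgb];             // conversion to sRGB happens here

CGColorSpaceRelease(srgb);
```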

> The strangest thing is that one row of pixels is there...

Agreed


> Are you sure rowBytes and the other properties are computed correctly?

I am pretty certain (although never 100%); the numbers seem right, and it works with the customer's images on the two machines I have to hand that run the same OS version as the customer. It also works for the customer when using images just shy of 8192 pixels in width.


> Also, if you just want to achieve linear -> sRGB, you don't need any CIFilter. Just use the sRGB color space when rendering the image. Core Image will perform the conversion automatically.

It's my understanding that this only changes the color profile; for this step I'm doing a tone curve adjustment. I can get close by using pow(), but I don't know the exact ratios, and I want to avoid changing the look of images unless I am left with zero choice.


If I knew what the exact tone curve adjustments were, I would drop Core Image for this process in a heartbeat. I have everything else working in C++ using GCD, and there is very little perceived performance penalty (at the moment).

> It's my understanding that this only changes the color profile

Not only; it should also convert the pixel values to the target color space. Maybe you can try omitting the filter?

Thanks Frank,

I'll have a play with it as soon as I can get back into working on this project.


Any ideas (apart from tiling) for what I can try if this doesn't solve the problem? I am thinking that there's potentially something wrong with the install of macOS 10.14.6 on this customer's machine and that the GPU driver is borked.


However, I am very hesitant to ask the customer to re-install the OS, in case that's not the cause or it creates more problems for them. Which is why, for this function, I've been gradually replacing Core Image with C++ via GCD; the conversion to the sRGB tone curve is the last function I have to convert.

While it might be possible, it seems unlikely that a driver error is causing this problem. But I also don't know how to solve it.


I did some quick research and found that you can do gamma correction in vImage. Check out the functions starting with vImageGamma; they even have the sRGB gamma function (kvImageGamma_sRGB_forward_half_precision). This should be the most efficient alternative to Core Image.
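As a rough sketch of what that could look like (illustrative only; the buffer setup is an assumption, error handling is omitted, and note that treating interleaved RGBAf as planar also runs the curve over the alpha channel, which is fine for opaque images):

```
#include <Accelerate/Accelerate.h>

// Sketch: linear -> sRGB over a float RGBA buffer via vImage.
void linearToSRGB(float *pixels, size_t width, size_t height, size_t rowBytes)
{
    GammaFunction gamma = vImageCreateGammaFunction(
        0.0f,  // exponent argument is unused by the preset sRGB curve
        kvImageGamma_sRGB_forward_half_precision,
        kvImageNoFlags);

    vImage_Buffer buf = {
        .data     = pixels,
        .height   = height,
        .width    = width * 4,   // 4 channels per pixel, treated as planar
        .rowBytes = rowBytes
    };

    // The gamma operation is pointwise, so in-place operation with
    // src == dest should be safe (worth verifying against the headers).
    vImageGamma_PlanarF(&buf, &buf, gamma, kvImageNoFlags);
    vImageDestroyGammaFunction(gamma);
}
```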