5 Replies
      Latest reply on Feb 6, 2020 11:06 AM by FrankSchlegel
      FxFactory Level 1 (0 points)

        I would like to be able to write CIKernels that output a lookup table or other blob of data, to be used by subsequent CIKernels.

         

        The data generated by such a kernel is fed to subsequent kernels whose corresponding sampler input has been tagged with the "__table" attribute to disable colormatching for that input. This scenario already works if one allocates the CIImage oneself, since a nil colorspace can be passed. But when asking CIKernel for an output image, it is not possible to request an image without an associated colorspace. I'm referring to methods like this:

         

        - applyWithExtent:roiCallback:arguments:

         

        There are also no APIs in Core Image that would allow you to strip colorspace information from an existing image, AFAIK. (As in "hey I know this recipe is tagged with the working/output colorspace, but in reality it contains values that do not encode RGB at all")
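        For reference, this is the working case I mean, where one allocates the image oneself (a minimal sketch; the table dimensions here are hypothetical):

```swift
import CoreImage

// A hypothetical 256×1 RGBA8 lookup table. Passing colorSpace: nil tells
// Core Image that this data is not color-managed, so it will not be
// matched to the working space when sampled by a kernel.
let tableBytes = Data(count: 256 * 4)
let lut = CIImage(bitmapData: tableBytes,
                  bytesPerRow: 256 * 4,
                  size: CGSize(width: 256, height: 1),
                  format: .RGBA8,
                  colorSpace: nil)
```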

         

        If I feed the output image from applyWithExtent:... to another kernel whose sampler has the __table attribute, it still appears to be colormatched, as far as I can observe. I can see three possibilities:

        1) I am clearly missing something.

        2) The __table attribute no longer has the desired effect, perhaps a regression.

        3) A new API is needed to cover this usage scenario.

         

        Any help is greatly appreciated!

         

        Best,

        Gabe

        • Re: Any way to let CI know output of CIKernel should not be colormatched?
          FrankSchlegel Level 3 (120 points)

          It's true, Core Image assumes that all intermediate results are in the working color space of the CIContext.

           

          You could try to wrap your lookup table in a CISampler with color space set to Null before you pass it to the kernel. This should tell Core Image to not do color-matching on that image:

          let sampler = CISampler(image: image, options: [kCISamplerColorSpace: NSNull()])
          kernel.apply(extent: domainOfDefinition, roiCallback: roiCallback, arguments: [sampler, ...])

           

          Please let me know if this works!

            • Re: Any way to let CI know output of CIKernel should not be colormatched?
              FxFactory Level 1 (0 points)

              Hi Frank,

               

              Unfortunately it does not work. It seems that the NSNull value in the dictionary is simply ignored. The resulting sampler still thinks it should color match (my input image is one of the desktop pictures that ships with Catalina, hence the Display P3 profile):

               

              <CISampler: 0x60000000c980>

                samplemode linear extent=[0 0 6016 6016][-1 -1 6018 6018]

                  affine [1 0 0 -1 0 6016] extent=[0 0 6016 6016] opaque

                    colormatch DisplayP3_to_workingspace extent=[0 0 6016 6016] opaque

                      CGImageRef 0x104e14460(717) RGBX8 6016x6016  alpha_one extent=[0 0 6016 6016] opaque

               

              Any other ideas?

                • Re: Any way to let CI know output of CIKernel should not be colormatched?
                  FrankSchlegel Level 3 (120 points)

                  Oh, what you see in the debug output isn't necessarily what actually happens at render time. At the moment you inspect/print the image, it doesn't "know" yet which CIContext is going to render it, so Core Image always inserts a generic "convert to working space" step, assuming it might be needed. But if your context doesn't do color matching (workingColorSpace set to NSNull) or the input is already in the working space, that step isn't actually performed.
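                  For completeness, a context without color management (which is what makes that generic conversion step a no-op) would be set up like this:

```swift
import CoreImage

// A CIContext that performs no color management at all: intermediate
// results are passed through untouched rather than matched to a
// working color space.
let context = CIContext(options: [
    .workingColorSpace: NSNull(),
    .outputColorSpace: NSNull()
])
```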

                   

                  I recommend you change your final render call to use the newish CIRenderDestination API (check out CIRenderDestination and CIContext.startTask(toRender:...)). This will give you access to a CIRenderTask object that you can call waitUntilCompleted on to get a CIRenderInfo object. You can Quick Look both objects in the debugger to get a nice graph showing you what is actually done during rendering.
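                  A minimal sketch of that flow, assuming `context` and `image` already exist (the bitmap size here is arbitrary):

```swift
import CoreImage

// Render into a plain bitmap via CIRenderDestination. The returned
// CIRenderTask and CIRenderInfo objects can be Quick Looked in Xcode
// to inspect the actual render graph, including color-matching steps.
let width = 256, height = 256
var pixels = [UInt8](repeating: 0, count: width * height * 4)
let info = try pixels.withUnsafeMutableBytes { buffer -> CIRenderInfo in
    let destination = CIRenderDestination(bitmapData: buffer.baseAddress!,
                                          width: width,
                                          height: height,
                                          bytesPerRow: width * 4,
                                          format: .RGBA8)
    let task = try context.startTask(toRender: image, to: destination)
    return try task.waitUntilCompleted()
}
```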

                   

                  Maybe you can post them here and we'll figure out what we need to change.

                    • Re: Any way to let CI know output of CIKernel should not be colormatched?
                      FxFactory Level 1 (0 points)

                      Hi Frank,

                       

                      Thanks for all the help. Unfortunately, disabling color correction at the CIContext level doesn't quite solve the problem I set out to solve: our overall image computation does need color management; it is only the lookup tables storing intermediate results that shouldn't go through the color conversion process. Someone did point out to me that the __table qualifier hasn't had any meaning to Core Image for a very long time: it is parsed without errors, but it has been ignored since the Core Image implementation that shipped with OS X 10.11. This is noted in the Core Image Kernel Language docs.

                       

                      It is also possible to sort-of disable color correction for a specific input to a kernel if you happen to know which color space its values were encoded in: simply pass that same color space via kCISamplerColorSpace, turning the color management pass into a no-op.
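                      A sketch of that trick, assuming (purely for illustration) that the context's working space is extended linear sRGB and that `image` is the lookup-table image:

```swift
import CoreImage

// If the sampler claims its data is already in the context's working
// color space, the "convert to working space" pass becomes an identity
// transform. The working space is assumed here to be extended linear sRGB.
let working = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)!
let sampler = CISampler(image: image,
                        options: [kCISamplerColorSpace: working])
```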

                       

                      And though this does not match my usage scenario, you can also evaluate (render) a CIImage to a surface (IOSurface, Metal texture, etc.), which gives you a chance to re-create another CIImage from that data and omit the colorspace. We’ve used render destinations and the latter technique before, but AFAIK the API really does lack a clean way to accomplish what many of Apple’s own built-in kernels are doing, that is: use kernels to create intermediate results that do not encode RGB values, to be consumed by later kernels.
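                      The surface round-trip can be sketched like this, assuming `context`, `image`, and a suitably sized `surface: IOSurface` already exist:

```swift
import CoreImage
import IOSurface

// Render with output color matching disabled, then re-wrap the surface
// as a new CIImage that carries no color space, so that later kernels
// sample the raw values untouched.
context.render(image, to: surface, bounds: image.extent, colorSpace: nil)
let raw = CIImage(ioSurface: surface, options: [.colorSpace: NSNull()])
```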

                        • Re: Any way to let CI know output of CIKernel should not be colormatched?
                          FrankSchlegel Level 3 (120 points)

                          Yeah, I also couldn't find any equivalent for the __table attribute in the "new" Metal-based CIKernel language, which is favored by Apple, by the way, since the old CI Kernel Language is deprecated now.

                           

                          What I was suggesting is that you use the destination API for your final rendering step so you can inspect the render graph and see what CI is doing under the hood with different color management settings. It would also be interesting to see the render graph when you use one of their non-image-producing kernels, to check whether color management happens there.

                           

                          By the way, in the iOS 13 release notes I found the following sentence:

                          Metal CIKernel instances support arguments with arbitrarily structured data.

                          However, I was not able to find any documentation or examples for this. Maybe that's something you can leverage.

                          There is, for instance, the CIColorCube kernel that gets passed an NSData containing the lookup table. Maybe that's the way.
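                          For instance, a sketch with a hypothetical identity cube of dimension 2:

```swift
import CoreImage

// CIColorCube takes its lookup table as raw Data — the cube values are
// never color-matched, which is exactly the pattern in question here.
let dim = 2
var cube = [Float]()
for b in 0..<dim { for g in 0..<dim { for r in 0..<dim {
    // Identity LUT: each entry maps to itself, values in [0, 1].
    cube += [Float(r) / Float(dim - 1),
             Float(g) / Float(dim - 1),
             Float(b) / Float(dim - 1),
             1]
}}}
let cubeData = cube.withUnsafeBufferPointer { Data(buffer: $0) }
let filter = CIFilter(name: "CIColorCube")!
filter.setValue(dim, forKey: "inputCubeDimension")
filter.setValue(cubeData, forKey: "inputCubeData")
filter.setValue(inputImage, forKey: kCIInputImageKey) // inputImage assumed
let output = filter.outputImage
```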

                           

                          Please let me know if you find a way!