I can't get Core Image to render an HDR image file with correct colors to a CAMetalLayer on macOS 14. I'm comparing the result with NSImageView and with the SupportingHDRImagesInYourApp 'HDRDemo23' sample code, which use CVPixelBuffer. With CAMetalLayer, the images are displayed as HDR (there are definitely more highlights), but they're darker, with some kind of saturation increase and color shift.
Files I've tested include the sample ISO HDR files in the SupportingHDRImagesInYourApp sample code. Methods I've tried to render to CAMetalLayer include:
- Modifying the GeneratingAnAnimationWithACoreImageRenderDestination sample code's ContentView so it uses HDRDemo23/example-ISO-HDR-images/image_01.heic, loaded using CIImage(contentsOf:)
- Creating a test AppKit app that uses MTKView and CIRenderDestination the same way (a simplified sketch of this draw path is below the list). I have an NSImageView and the MTKView in the same window for comparison.
- Using CIRAWFilter > CIRenderDestination > IOSurface > MTKView/CAMetalLayer
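For reference, here's roughly what the MTKView draw path in my test app looks like. This is a simplified sketch; the class name, `hdrImage`, and the error handling are just placeholders from my test code, and the image is the one loaded with CIImage(contentsOf:) from image_01.heic:

```swift
import AppKit
import CoreImage
import MetalKit

// Minimal HDR render path: CIImage -> CIRenderDestination -> MTKView drawable.
final class HDRMetalView: MTKView {
    private let ciContext: CIContext
    private let commandQueue: MTLCommandQueue
    private let hdrImage: CIImage   // loaded via CIImage(contentsOf:)

    init(frame: CGRect, image: CIImage) {
        let device = MTLCreateSystemDefaultDevice()!
        ciContext = CIContext(mtlDevice: device,
                              options: [.workingFormat: CIFormat.RGBAh])
        commandQueue = device.makeCommandQueue()!
        hdrImage = image
        super.init(frame: frame, device: device)
        framebufferOnly = false                 // Core Image writes into the drawable
        colorPixelFormat = .rgba16Float         // float format so EDR values > 1.0 survive
        (layer as? CAMetalLayer)?.wantsExtendedDynamicRangeContent = true
    }

    required init(coder: NSCoder) { fatalError("not used") }

    override func draw(_ dirtyRect: NSRect) {
        guard let drawable = currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }

        let destination = CIRenderDestination(
            width: Int(drawableSize.width),
            height: Int(drawableSize.height),
            pixelFormat: colorPixelFormat,
            commandBuffer: commandBuffer,
            mtlTextureProvider: { drawable.texture })

        try? ciContext.startTask(toRender: hdrImage, to: destination)
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}
```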
All these methods produce an image with exactly the same appearance: a dark HDR image with a saturation/color shift.
The only clue I've found is in the Metal Debugger on the test AppKit app: the CAMetalLayer's 'Present' call shows an 'input' thumbnail that is HDR without the color shift, while the 'output' thumbnail looks like what I actually see on screen. I tried setting the layer's colorspace to various values, but nothing looked more correct.
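These are the kinds of layer settings I experimented with, set one at a time (`mtkView` here is just the test view from my app; none of these removed the shift):

```swift
let metalLayer = mtkView.layer as! CAMetalLayer
metalLayer.wantsExtendedDynamicRangeContent = true
metalLayer.pixelFormat = .rgba16Float

// Colorspace variants I tried (one at a time):
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.itur_2100_PQ)
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.displayP3)
```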
I've tried this on two Macs: an M1 Mac Studio with an LG display, and an M2 MacBook Air. The MacBook Air shows the same color shift, but since its display has less dynamic range overall, there isn't as much difference between NSImageView and MTKView.