If my pixels are linear, should I tag them as sRGB? Does the transfer function need to be gamma 1.0 or gamma 2.2?
I’m not sure which of two possible things you’re referring to here, so I’ll explain both and how they interact.
There are a number of MTLPixelFormats which end in _sRGB. These formats implicitly degamma when sampled, and engamma when written to. So if you set the pixelFormat of your MTKView/CAMetalLayer to one of these _sRGB formats, that means that the GPU will implicitly engamma your shader’s output when it writes it into the memory that backs the drawable’s texture.
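As a concrete illustration, here is a minimal sketch of opting into this behavior with MetalKit (the view setup is an assumption for illustration, not code from the discussion):

```swift
import MetalKit

// A minimal sketch, assuming an MTKView-based setup. With an _sRGB
// drawable format, the GPU engammas the shader's linear output on write.
let device = MTLCreateSystemDefaultDevice()!
let view = MTKView(frame: .zero, device: device)

// bgra8Unorm_srgb stores sRGB-encoded bytes; your fragment shader can
// return linear values and the hardware applies the transfer function.
view.colorPixelFormat = .bgra8Unorm_srgb
```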
The compositor reads the drawable’s memory directly; it doesn’t sample it like a texture. Thus, if you use an _sRGB pixel format, your shader can work in linear space as long as you set your CAMetalLayer/MTKView’s colorspace to an sRGB colorspace. The compositor will read the engamma’d pixel values directly, and then color-match them from sRGB space to the display’s colorspace.
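A sketch of tagging the layer, assuming you are driving a CAMetalLayer directly:

```swift
import QuartzCore
import Metal

// A sketch, assuming direct access to the CAMetalLayer. Tagging the
// layer as sRGB tells the compositor how to interpret the engamma'd
// bytes it reads out of the drawable.
let layer = CAMetalLayer()
layer.pixelFormat = .bgra8Unorm_srgb
layer.colorspace = CGColorSpace(name: CGColorSpace.sRGB)
```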
What is kCGColorSpaceSRGB vs. kCGColorSpaceLinearSRGB? Both use sRGB colorimetry, but am I stating that my content has the sRGB gamma curve in the former, and is linear in the latter?
Correct.
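For illustration, both spaces can be created by name with CoreGraphics:

```swift
import CoreGraphics

// Both spaces share sRGB primaries and white point; only the transfer
// function differs: the sRGB gamma curve vs. identity (gamma 1.0).
let gammaEncoded = CGColorSpace(name: CGColorSpace.sRGB)!
let linear = CGColorSpace(name: CGColorSpace.linearSRGB)!
```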
And how do I get the cheapest pass-through? I can set my display color space to sRGB (709) or P3, but how do I get the cheapest, fastest, and highest-quality pixel output to the screen?
If by “highest quality” you mean “accurately color-matched”, this is in direct opposition to “cheapest” and “fastest”. The fastest approach is to opt out of color matching entirely by setting the colorspace property to nil. If you do want color matching, however, you probably want to minimize the amount of work in your shaders. If you use an _sRGB pixel format, you can avoid an explicit engamma/degamma in your shader, as discussed above. If you’re working with HDR, you can either implement tone-mapping yourself, or you can use CAEDRMetadata to have the compositor tone-map for you. Which will be faster depends on your specific use case.
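A sketch of the two extremes, assuming a CAMetalLayer; the CAEDRMetadata luminance values below are placeholders, not recommendations:

```swift
import QuartzCore

let layer = CAMetalLayer()

// Fastest: a nil colorspace opts out of color matching entirely;
// your pixel values are sent toward the display as-is.
layer.colorspace = nil

// For HDR content, you can instead attach EDR metadata and let the
// compositor tone-map for you (values here are illustrative only).
if #available(macOS 10.15, *) {
    layer.edrMetadata = CAEDRMetadata.hdr10(minLuminance: 0.005,
                                            maxLuminance: 1000,
                                            opticalOutputScale: 100)
}
```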
There is also True Tone, and the user adjusting the maximum brightness of the display, so those also need to be responded to. And what should we do when NSApplicationDidChangeScreenParametersNotification fires and says screen.maximumExtendedDynamicRangeColorComponentValue has changed?
The effects of True Tone are invisible to your application. Normally, the brightness of the display is also invisible to your application, but this can change when you’re using an Apple display with EDR capabilities, like the Pro Display XDR.
The maximumExtendedDynamicRangeColorComponentValue is known in UIKit as currentEDRHeadroom. “Headroom” is the ratio of the maximum brightness the screen can display relative to “SDR white”, which is the brightness of SRGB(1.0, 1.0, 1.0). On a non-EDR display, SRGB(1.0, 1.0, 1.0) always maps to the display’s “maximum brightness”, and therefore currentEDRHeadroom is equal to 1.0. But on an EDR display (including third-party displays with HDR support), pixels can of course be brighter. If the potentialEDRHeadroom value is equal to 4.0, that means the brightest white that the display can show is 4 times brighter than SRGB(1.0, 1.0, 1.0). One way to encode this particular “brightest white” is ExtendedLinearSRGB(4.0, 4.0, 4.0). If you implement tone-mapping yourself, you can provide this information to your tone-mapping shader, and update it whenever you receive the NSApplicationDidChangeScreenParametersNotification.
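A sketch of responding to that notification in AppKit; updateToneMapping is a hypothetical hook standing in for however you feed uniforms to your shader:

```swift
import AppKit

// A sketch for an AppKit app. `updateToneMapping` is a hypothetical
// hook for pushing the headroom value to a tone-mapping shader.
final class HeadroomObserver {
    private var token: NSObjectProtocol?

    func start() {
        token = NotificationCenter.default.addObserver(
            forName: NSApplication.didChangeScreenParametersNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            // Re-read the headroom; it varies with display brightness.
            let headroom = NSScreen.main?
                .maximumExtendedDynamicRangeColorComponentValue ?? 1.0
            self?.updateToneMapping(headroom: Float(headroom))
        }
    }

    private func updateToneMapping(headroom: Float) {
        // Hypothetical: write `headroom` into a uniform/constant buffer
        // consumed by the tone-mapping shader.
    }
}
```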