(Context: macOS Ventura 13.3)
What is the simplest, least resource-intensive way to enable the display of HDR content in an `NSImageView` in an AppKit application?
I've found that, as the documentation for `wantsExtendedDynamicRangeContent` suggests, `NSImageView` faithfully presents HDR images (loaded from HDR formats such as 10-bit HEIC) whenever any on-screen `CAMetalLayer` has that property set to `true`. This holds even if it is another app on the same screen that has set `wantsExtendedDynamicRangeContent` to `true` on a `CAMetalLayer`. Conversely, if no such layer exists on-screen, the `NSImageView` clips HDR highlights.
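For context, the `NSImageView` in question is set up in the most ordinary way; a minimal sketch (the file path and view sizes here are hypothetical, and no HDR-specific configuration is applied):

```swift
import AppKit

// Hypothetical path; any 10-bit HEIC exported from an HDR source will do.
let url = URL(fileURLWithPath: "/path/to/hdr-photo.heic")

if let image = NSImage(contentsOf: url) {
    // A plain NSImageView with default settings -- nothing EDR-related is set.
    let imageView = NSImageView(image: image)
    imageView.frame = NSRect(x: 0, y: 0, width: 800, height: 600)
    // Add imageView to a window's contentView as usual. Whether its highlights
    // clip or extend into EDR depends only on the screen-wide state described above.
}
```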
For example, suppose your app has an `NSImageView` displaying a 10-bit HEIC image, and suppose Affinity Photo 2 is running with the "Enable EDR by default in 32bit RGB views" option enabled in its preferences. If you load an image in Affinity Photo 2 (apparently any image, even an 8-bit JPEG), you will find that the image displayed by your `NSImageView` renders HDR highlights on an EDR-capable display. You don't need to change any setting on the `NSImageView` for this to work.
It seems rather awkward and inefficient to add something like a 1×1-pixel (or even zero-sized) `CAMetalLayer` to my app solely to enable EDR display. Since EDR content display is screen-based, it seems there should be some property on `NSScreen` for enabling EDR content. And clearly some magic is happening behind the scenes of `NSImageView` to display EDR content once EDR display is enabled for the screen.
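For reference, the workaround I'm describing looks roughly like this (a sketch only; the 1×1 size, the hosting-view approach, and the pixel format / color space choices are my own assumptions, not anything Apple documents for this purpose):

```swift
import AppKit
import Metal
import QuartzCore

/// A tiny layer-backed view whose only job is to put an EDR-enabled
/// CAMetalLayer on screen, which (as observed above) causes NSImageView
/// elsewhere on the same screen to render HDR highlights instead of clipping.
final class EDREnablerView: NSView {
    override func makeBackingLayer() -> CALayer {
        let metalLayer = CAMetalLayer()
        // Request extended dynamic range; honored only on EDR-capable displays.
        metalLayer.wantsExtendedDynamicRangeContent = true
        // A float pixel format and extended-range color space so the
        // EDR request is meaningful for this layer.
        metalLayer.pixelFormat = .rgba16Float
        metalLayer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)
        return metalLayer
    }

    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        wantsLayer = true
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        wantsLayer = true
    }
}

// Usage: park a 1x1 instance somewhere in the window's view hierarchy,
// e.g. behind the NSImageView:
// let enabler = EDREnablerView(frame: NSRect(x: 0, y: 0, width: 1, height: 1))
// window.contentView?.addSubview(enabler)
```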
Any advice?