I notice that when I open the Photos app on my iPhone 12 Pro, photos or videos shot in HDR appear brighter than the rest of the UI at the current display brightness level.
On macOS, there are APIs like EDRMetadata on CAMetalLayer and maximumExtendedDynamicRangeColorComponentValue on NSScreen.
I did see CAMetalLayer.wantsExtendedDynamicRangeContent, but I'm not sure whether it does what I'm looking for.
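For reference, here's roughly how I understand the macOS path. This is just a sketch of my reading of those APIs; the pixel format, color space, and HDR10 luminance values are placeholders I picked, not anything the docs prescribe:

```swift
import AppKit
import Metal
import QuartzCore

/// Sketch: enable EDR on a CAMetalLayer when the hosting screen currently has headroom.
func enableEDRIfAvailable(on layer: CAMetalLayer, screen: NSScreen) {
    // How many times brighter than SDR "white" (1.0) the screen can currently display.
    let headroom = screen.maximumExtendedDynamicRangeColorComponentValue
    guard headroom > 1.0 else { return }

    // Float pixel format so component values above 1.0 aren't clamped.
    layer.pixelFormat = .rgba16Float
    layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
    layer.wantsExtendedDynamicRangeContent = true

    // Static HDR metadata; these luminance values are placeholders.
    layer.edrMetadata = CAEDRMetadata.hdr10(minLuminance: 0.005,
                                            maxLuminance: 1_000,
                                            opticalOutputScale: 100)
}
```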
The Using Color Spaces to Display HDR Content documentation page describes setting .colorspace on the CAMetalLayer for BT.2020 PQ content, but it's not clear whether that applies only to macOS or to iOS as well. Is setting the color space the right way to get colors "brighter" than 1.0 on the "XDR" displays of recent iPhones?
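To make sure I'm reading that page correctly, this is roughly what I'm picturing on the iOS side; the 10-bit pixel format is my own assumption for carrying PQ-encoded values:

```swift
import Metal
import QuartzCore

/// Sketch: tag a CAMetalLayer's contents as BT.2020 / PQ, as the doc page seems to suggest.
func configureHDRLayer(_ layer: CAMetalLayer) {
    // 10-bit format so the PQ-encoded values survive to the display.
    layer.pixelFormat = .bgr10a2Unorm
    // BT.2020 primaries with the PQ transfer function, so PQ values above SDR white
    // should map to brightness beyond the layer's normal 1.0 level.
    layer.colorspace = CGColorSpace(name: CGColorSpace.itur_2020_PQ)
}
```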