How do I enable HDR on Metal on iOS?

I noticed that if I play a white clip and compare it to setting RGB to (1,1,1) in a fragment shader in Metal, the clip has higher luminance (nits). How do I get the same nits in a Metal view?
Metal supports rendering to a drawable using the RGBA16Float and BGR10_XR_sRGB formats. These allow you to render outside the typical 0.0-1.0 range of the sRGB color space and take advantage of the higher brightness of the iOS device's P3 display (up to about 1.66894).
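For concreteness, a minimal sketch of that layer setup, assuming the view is backed by a CAMetalLayer; the pairing of each pixel format with an extended-range color space is our assumption about the intended usage:

```swift
import QuartzCore
import Metal
import CoreGraphics

// Sketch: configure a CAMetalLayer for extended-range output so fragment
// shader values above 1.0 are preserved instead of clamped at the drawable.
func configureExtendedRangeLayer(_ layer: CAMetalLayer, device: MTLDevice) {
    layer.device = device

    // Half-float drawable; components can exceed 1.0.
    layer.pixelFormat = .rgba16Float
    // Extended *linear* sRGB, so out-of-range values mean brighter-than-SDR.
    layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)

    // Alternative 10-bit extended-range setup:
    // layer.pixelFormat = .bgr10_xr_srgb
    // layer.colorspace = CGColorSpace(name: CGColorSpace.extendedSRGB)
}
```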
I'd appreciate some clearer documentation on how these formats map to the display. If my input data is BT2020 PQ, should I map that into RGBA16Float? In PQ, the maximum code value represents 10,000 nits; what should that become in a Metal texture? Should BT2020 colors simply be matrix-converted to sRGB without clipping, or should I be restricting content to P3 on my end? If MTLPixelFormatRGBA16Float is interpreted as linear light, it's not clear that it provides enough precision to distinguish all the values a 10-bit integer PQ transfer curve can express. We've largely had to guess at these things.
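For concreteness, the matrix-conversion step we've been guessing at looks roughly like this, a sketch assuming the standard BT.2020-to-BT.709/sRGB primary conversion (D65) applied in linear light, with out-of-gamut values left unclipped:

```swift
import simd

// Sketch: linear BT.2020 RGB -> linear sRGB/BT.709 RGB, no clipping.
// The matrix is the commonly published BT.2020 -> BT.709 primary
// conversion; components may come out negative or above 1.0, which an
// extended-range float target (e.g. RGBA16Float) can preserve.
let bt2020ToSRGB = simd_float3x3(rows: [
    SIMD3<Float>( 1.6605, -0.5876, -0.0728),
    SIMD3<Float>(-0.1246,  1.1329, -0.0083),
    SIMD3<Float>(-0.0182, -0.1006,  1.1187)
])

func toLinearSRGB(_ bt2020Linear: SIMD3<Float>) -> SIMD3<Float> {
    bt2020ToSRGB * bt2020Linear
}
```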
With Metal, you cannot render beyond the range provided by the P3 color space. Only video and photo content can be rendered as HDR outside that range on iOS.
So, how should content that may be beyond the P3 color space, but that also doesn't come from AVPlayer or related classes, be displayed? And if content uses the P3 gamut with the PQ transfer function, how should that be displayed? It seems like AVSampleBufferDisplayLayer is meant to handle these cases, but the documentation remains unclear.
We'd like to get a better idea of what you're trying to implement. AVSampleBufferDisplayLayer does not have any interoperability with Metal, so there is no way to use it to achieve the full brightness of the display while rendering with shaders. Only video by itself can achieve the full brightness.
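For reference, the video-only path being described is roughly the following, a sketch assuming you already have CMSampleBuffers in hand (the wrapper class name is hypothetical):

```swift
import AVFoundation

// Sketch: hand CMSampleBuffers directly to AVSampleBufferDisplayLayer and
// let the system composite them. No Metal shader touches these frames;
// that is the trade-off described above, in exchange for full HDR
// brightness on suitable content.
final class SampleBufferPresenter { // hypothetical helper
    let layer = AVSampleBufferDisplayLayer()

    func enqueue(_ sampleBuffer: CMSampleBuffer) {
        if layer.isReadyForMoreMediaData {
            layer.enqueue(sampleBuffer)
        }
    }
}
```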

A correction to a previous reply: the XR sRGB formats can express close to all the values of the BT2020 gamut, but the display cannot render all the values those formats support; the display clamps values at around 1.25.
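As an aside, on recent iOS versions (16 and later) that clamp no longer has to be assumed; UIScreen exposes the current and potential EDR headroom directly:

```swift
import UIKit

// Sketch, assuming iOS 16+: query how far above SDR white (1.0) the
// display can currently go. Values written beyond currentEDRHeadroom
// are clamped by the display pipeline.
func logEDRHeadroom(for screen: UIScreen) {
    print("potential EDR headroom:", screen.potentialEDRHeadroom)
    print("current EDR headroom:", screen.currentEDRHeadroom)
}
```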
I have the same need to enable HDR on iOS.

Are the output RGB values of the fragment shader linear, or something else?

In the fragment shader, I've tried to implement the PQ EOTF and do the EDR scaling according to the WWDC 2019 session, which said that SDR should lie in the range (0, 1) and HDR should lie in the range (1, 2) or (1, 5). However, neither the RGBA16Float nor the BGRA10_XR setting on the CAMetalLayer achieves the full brightness of the screen. The RGB values seem to be clamped to (0, 1), because a brightness of 1.x looks the same as 1.0.
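The PQ decode and EDR scaling I implemented follow this shape, shown here as a CPU-side Swift sketch of the shader math; the 100-nit reference white is my assumption:

```swift
import Foundation

// Sketch of the ST 2084 (PQ) EOTF plus EDR-style scaling: decode the PQ
// signal to absolute nits, then divide by a reference white so SDR lands
// in 0...1 and HDR highlights land above 1.0.
func pqToExtendedLinear(_ encoded: Float, referenceWhiteNits: Float = 100) -> Float {
    // ST 2084 constants.
    let m1: Float = 2610.0 / 16384.0
    let m2: Float = 2523.0 / 4096.0 * 128.0
    let c1: Float = 3424.0 / 4096.0
    let c2: Float = 2413.0 / 4096.0 * 32.0
    let c3: Float = 2392.0 / 4096.0 * 32.0

    let p = pow(encoded, 1.0 / m2)
    let normalized = pow(max(p - c1, 0) / (c2 - c3 * p), 1.0 / m1) // 0...1 maps to 0...10,000 nits
    let nits = normalized * 10_000.0
    return nits / referenceWhiteNits // SDR white maps to 1.0, HDR above it
}
```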

It was said that wantsExtendedDynamicRangeContent can be set on the layer to enable HDR mode, but that property does not exist on iOS.

So how should I enable HDR mode on iOS?
To add more context, I am trying to implement a simple video player using Metal: take an SDR AVC stream, decode it using VideoToolbox, convert the decoded frame to a Metal texture, and render it on an MTKView. The problem is that a white clip looks brighter when played in the native player compared to my player. The root cause is that even setting the pixels to (1,1,1) in the fragment shader leaves the MTKView dimmer than the native player's playback of the white clip. I am assuming that internally the native player is enabling HDR mode. Since the wantsExtendedDynamicRangeContent property is not available on iOS, I am trying to see if there is any workaround.
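For what it's worth, the frame-to-texture step in my pipeline looks roughly like this, a sketch assuming BGRA decoder output (the class name is hypothetical):

```swift
import CoreVideo
import Metal

// Sketch: wrap a decoded CVPixelBuffer as an MTLTexture through a
// CVMetalTextureCache so the frame can be sampled in a fragment shader.
final class FrameTextureConverter { // hypothetical helper
    private var textureCache: CVMetalTextureCache?

    init?(device: MTLDevice) {
        guard CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil,
                                        &textureCache) == kCVReturnSuccess else {
            return nil
        }
    }

    func makeTexture(from pixelBuffer: CVPixelBuffer) -> MTLTexture? {
        guard let cache = textureCache else { return nil }
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)
        var cvTexture: CVMetalTexture?
        let status = CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault, cache, pixelBuffer, nil,
            .bgra8Unorm, width, height, 0, &cvTexture)
        guard status == kCVReturnSuccess, let cvTexture = cvTexture else {
            return nil
        }
        return CVMetalTextureGetTexture(cvTexture)
    }
}
```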