I use AVSampleBufferDisplayLayer (iOS 14+) / CAMetalLayer (iOS 16) to show HDR videos in my app. The video content is displayed well, and it is easy to see that its brightness is higher than SDR video.
But I found some bugs:
1. When playing HDR video, especially when the content is white, the UI is darker than the video content, which makes it hard for users to see the UI clearly. I haven't found a way to increase the UI's brightness to match HDR video content the way it already matches SDR video content.
2. When playing HDR video, screen recording makes the HDR content darker, which I think is abnormal.
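For context, this is roughly how the CAMetalLayer path mentioned above is configured for extended dynamic range on iOS 16. This is only a minimal sketch: the class name HDRPlayerView and the HDR10 metadata values are illustrative, and the actual frame encoding is omitted.

import UIKit
import Metal
import QuartzCore

// Illustrative sketch of a CAMetalLayer configured for extended dynamic range (iOS 16+).
final class HDRPlayerView: UIView {
    override class var layerClass: AnyClass { CAMetalLayer.self }
    private var metalLayer: CAMetalLayer { layer as! CAMetalLayer }

    func configureForHDR() {
        metalLayer.device = MTLCreateSystemDefaultDevice()
        // Half-float framebuffer so component values above 1.0 (EDR) survive.
        metalLayer.pixelFormat = .rgba16Float
        metalLayer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
        // Opt the layer into extended dynamic range rendering.
        metalLayer.wantsExtendedDynamicRangeContent = true
        // Optional HDR10 mastering metadata so the system can tone map; values are placeholders.
        metalLayer.edrMetadata = CAEDRMetadata.hdr10(minLuminance: 0.005,
                                                     maxLuminance: 1000,
                                                     opticalOutputScale: 100)
    }
}

With this kind of setup the video frames render noticeably brighter than SDR, which is exactly when the two problems above show up.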
Problem: When playing HDR video, the screen gets brighter, but the UI does not, so the UI looks comparatively dark. It makes the UI hard to read and hurts the user experience. The system Photos app shows the same behavior. I took photos to show what it looks like.
SDR: (screenshot)
HDR: (screenshot)
Expected: The UI's brightness should fit the HDR video, just as it does when playing SDR video.
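One thing I tried while investigating is reading the screen's EDR headroom (iOS 16+) to confirm what the photos show. This is only a diagnostic sketch; the EDRHeadroomMonitor class and the 0.5 s polling interval are my own choices, not anything the system requires.

import UIKit

// Diagnostic sketch: poll the display's current EDR headroom during playback.
// A value above 1.0 means the display is driving content brighter than SDR white,
// which is exactly when standard (SDR-white) UI starts to look dim by comparison.
final class EDRHeadroomMonitor {
    private var timer: Timer?

    func start(on screen: UIScreen, onChange: @escaping (CGFloat) -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { _ in
            onChange(screen.currentEDRHeadroom)   // what the panel is doing right now
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}

If the reported headroom is above 1.0 during playback, the video peaks really are brighter than SDR white, and I still haven't found a supported way to raise normal UIKit elements to that level.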
CVPixelBuffer.h defines:
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420', /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
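For reference, the capture output is configured roughly like this to request the 10-bit format (a sketch; session setup, inputs, and the sample buffer delegate are omitted):

import AVFoundation

// Sketch: ask an AVCaptureVideoDataOutput for 10-bit video-range YCbCr frames ('x420').
// On a real device, check output.availableVideoPixelFormatTypes first.
func configurePixelFormat(on output: AVCaptureVideoDataOutput) {
    output.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
    ]
}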
But when I set the above formats for the camera output, I find that the output pixel buffer's values exceed those ranges: I can see values spanning [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange.
Is this a bug, or is something wrong with my output? If not, how can I choose the correct matrix to convert the YUV data to RGB?
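My current understanding (please correct me if this is wrong) is that the header values are the nominal video-range levels the conversion matrix assumes, not a hard clamp on the stored samples, so individual pixels are allowed to undershoot or overshoot them. Based on that, the sketch below picks the matrix from the buffer's attachments instead of from the FourCC; the YCbCrMatrix enum and the BT.709 fallback are my own illustrative choices.

import CoreVideo

// Illustrative matrix choices; real conversion code would carry the full coefficients.
enum YCbCrMatrix {
    case bt601, bt709, bt2020
}

// Read the matrix the producer attached to the buffer rather than guessing from the format.
// (CVBufferCopyAttachment is iOS 15+.)
func matrix(for pixelBuffer: CVPixelBuffer) -> YCbCrMatrix {
    let value = CVBufferCopyAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, nil) as? String
    if value == (kCVImageBufferYCbCrMatrix_ITU_R_2020 as String) {
        return .bt2020                  // typical for 10-bit HDR capture
    } else if value == (kCVImageBufferYCbCrMatrix_ITU_R_601_4 as String) {
        return .bt601
    } else {
        return .bt709                   // illustrative fallback when the attachment is missing
    }
}

// Video-range normalization for a 10-bit luma sample: code 64 maps to 0.0 and 940 to 1.0.
// Samples outside [64, 940] are legal in the buffer and simply land outside [0, 1]
// until they are clamped after conversion.
func normalizedLuma10(_ y: UInt16) -> Float {
    (Float(y) - 64.0) / (940.0 - 64.0)
}

The related attachment keys (kCVImageBufferColorPrimariesKey, kCVImageBufferTransferFunctionKey) seem to be the way to pick the primaries and transfer function as well, but I'd appreciate confirmation that this is the intended approach.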