On a Vision Pro I load an HDR video served over HLS using AVPlayer.
Per FFmpeg, the video has:
- pixel format: yuv420p10le
- color space / ycbcr matrix: bt2020nc
- color primaries: bt2020
- transfer function: smpte2084
I wanted to try out letting AVFoundation do all of the color conversion instead of making my own YUV -> RGB shader.
To display a 10-bit texture in a drawable queue, the destination Metal texture format must be MTLPixelFormat.rgba16Float (no other format above 8 bits is supported). So I am capturing in kCVPixelFormatType_64RGBAHalf, since that's the closest match.
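For reference, this is roughly how my drawable queue is set up (a minimal sketch; the dimensions are placeholders and the usage flags are just what I happen to use):

import Metal
import RealityKit

func makeDrawableQueue() throws -> TextureResource.DrawableQueue {
    let descriptor = TextureResource.DrawableQueue.Descriptor(
        pixelFormat: .rgba16Float,      // the only >8-bit format that works here
        width: 3840,                    // placeholder dimensions
        height: 2160,
        usage: [.shaderRead, .renderTarget],
        mipmapsMode: .none)
    return try TextureResource.DrawableQueue(descriptor)
}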
It's worth noting that the AVAsset shows no track information, presumably because it's HLS?
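For what it's worth, this is roughly how I checked for track info (a sketch; asset and playerItem are the objects I already have, and my understanding is that the player item's tracks only fill in once the HLS item starts loading):

import AVFoundation

func logTrackInfo(asset: AVAsset, playerItem: AVPlayerItem) async throws {
    let videoTracks = try await asset.loadTracks(withMediaType: .video)
    print("asset video tracks:", videoTracks)        // empty for this HLS stream
    print("player item tracks:", playerItem.tracks)  // populates during playback
}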
I am using AVPlayerItemVideoOutput to get pixel buffers:
let videoOutput = AVPlayerItemVideoOutput(outputSettings: [
    AVVideoColorPropertiesKey: [
        AVVideoColorPrimariesKey: AVVideoColorPrimaries_ITU_R_2020,
        AVVideoTransferFunctionKey: AVVideoTransferFunction_SMPTE_ST_2084_PQ,
        AVVideoYCbCrMatrixKey: AVVideoYCbCrMatrix_ITU_R_2020
    ],
    kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_64RGBAHalf),
    kCVPixelBufferMetalCompatibilityKey as String: true
])
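And this is roughly how I pull frames each render tick (videoOutput is the output above; hostTime is the host time of the frame I'm about to draw):

import AVFoundation
import CoreVideo

func copyLatestPixelBuffer(from videoOutput: AVPlayerItemVideoOutput,
                           hostTime: CFTimeInterval) -> CVPixelBuffer? {
    let itemTime = videoOutput.itemTime(forHostTime: hostTime)
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime) else { return nil }
    // The returned buffer should be kCVPixelFormatType_64RGBAHalf per the settings above.
    return videoOutput.copyPixelBuffer(forItemTime: itemTime, itemTimeForDisplay: nil)
}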
I can change these settings in real time and see that they affect what appears in my drawable queue.
With the BT.2020 primaries the output does not look correct to me: it's very bright and washed out. When I switch to BT.709, it looks closer to AVPlayer's own output. AVPlayer by itself doesn't look terrible, just a little dark, maybe.
When I leave out the outputSettings and let AVPlayerItemVideoOutput choose its own color settings, it appears to choose BT.2020 as well.
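One way I can double-check what was actually applied is to read the color attachments off the returned buffer (a sketch, where pixelBuffer comes from copyPixelBuffer above):

import CoreVideo

func logColorAttachments(of pixelBuffer: CVPixelBuffer) {
    // For an RGB buffer I'd expect at least primaries and a transfer function to be attached.
    let primaries = CVBufferCopyAttachment(pixelBuffer, kCVImageBufferColorPrimariesKey, nil)
    let transfer = CVBufferCopyAttachment(pixelBuffer, kCVImageBufferTransferFunctionKey, nil)
    print("primaries:", primaries as Any)
    print("transfer:", transfer as Any)
}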
-
Is it enough to pass in these outputSettings and expect an RGB pixel buffer that exactly matches them? Or do I have to capture in YUV and do all of the conversion manually?
-
Am I misunderstanding something related to color settings here? I am definitely not an expert.
Thanks