The documentation says the pixel format of the environmentTexture in an AREnvironmentProbeAnchor is bgra8Unorm_srgb:
https://developer.apple.com/documentation/arkit/arenvironmentprobeanchor/2977511-environmenttexture
However, when I inspect the pixelFormat property of the MTLTexture, it says it's rgba16Float.
I'm trying to read the texture out as a PNG, and because it's a 16-bit float image I'm assuming its color space is CGColorSpace.displayP3, but I'm not 100% sure. The texture looks darker than I expected. Could it be that the color space is sRGB, but the texture is 16-bit because it's actually an HDR texture stored as linear RGB?
(Tested on iPhone 12, iOS 15)
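For context, this is roughly the kind of readout I mean (a sketch only: the helper name, the face handling via a texture view, the Core Image route, and the Display P3 assumption are mine, not from any ARKit sample):

```swift
import Metal
import CoreImage
import CoreGraphics
import ImageIO

// Hypothetical: write one face of the environment cube map out as a PNG.
func writeFace(of cubeTexture: MTLTexture, faceIndex: Int, to url: URL) throws {
    guard
        // View a single face of the rgba16Float cube map as a plain 2D texture.
        let face = cubeTexture.makeTextureView(pixelFormat: cubeTexture.pixelFormat,
                                               textureType: .type2D,
                                               levels: 0..<1,
                                               slices: faceIndex..<(faceIndex + 1)),
        // My assumption: interpret the pixels as Display P3.
        let sourceSpace = CGColorSpace(name: CGColorSpace.displayP3),
        let outputSpace = CGColorSpace(name: CGColorSpace.sRGB),
        let image = CIImage(mtlTexture: face, options: [.colorSpace: sourceSpace])
    else { throw NSError(domain: "EnvProbeDump", code: -1) }

    // Metal and Core Image disagree on the vertical origin, so the face may
    // need mirroring before it is written out.
    let oriented = image.oriented(.downMirrored)
    try CIContext().writePNGRepresentation(of: oriented,
                                           to: url,
                                           format: .RGBA8,
                                           colorSpace: outputSpace)
}
```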
It looks like the color space is CGColorSpace.linearSRGB. It's an HDR image, according to the paper we found:
https://machinelearning.apple.com/research/hdr-environment-map-estimation
"outputs a completed environment map that is higher dynamic range (HDR, 16 bit channel)"
I've updated the color space, saved it as PNG, and the output looks fine.
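For completeness, the only change relative to the readout sketch in the question was the source color space (extendedLinearSRGB is my own guess for when values above 1.0 matter, not something the docs state):

```swift
import CoreGraphics

// Interpret the half-float pixels as linear sRGB instead of Display P3
// before converting down to sRGB for the PNG.
let sourceSpace = CGColorSpace(name: CGColorSpace.linearSRGB)!
// If the HDR values above 1.0 need to survive further processing,
// CGColorSpace.extendedLinearSRGB may be the closer match.
```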
I'm assuming that when ARKit 2 first shipped the environment textures were 8-bit, but they've been upgraded since.