I'm currently trying to extract the pure linear data from a decoded RAW, without any tone curve applied. I'm using the `linearSpaceFilter` key in CIRAWFilter, and I'm finding that in scenes with significant dynamic range, the brighter values (e.g., clouds) come through greater than 1.5 inside the linear-space kernel.
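For reference, my setup is roughly the sketch below (the file path is a placeholder, and the `CIExposureAdjust` at EV 0 is just a pass-through stand-in for my custom kernel; I'm on the newer `CIRAWFilter` class API from macOS 12 / iOS 15, where `linearSpaceFilter` is a property rather than an options-dictionary key):

```swift
import CoreImage
import CoreGraphics

// Placeholder path; any DNG / camera RAW that Core Image can decode.
let url = URL(fileURLWithPath: "/path/to/photo.dng")

guard let raw = CIRAWFilter(imageURL: url) else {
    fatalError("Core Image could not read the RAW file")
}

// Disable the default boost so the downstream pipeline stays as flat as possible.
raw.boostAmount = 0.0

// Whatever CIFilter is attached here runs on the demosaiced image while it is
// still scene-linear, before the tone curve. CIExposureAdjust at EV 0 stands in
// for my custom kernel, which is where I sample the linear values.
raw.linearSpaceFilter = CIFilter(name: "CIExposureAdjust",
                                 parameters: [kCIInputEVKey: 0.0])

guard let output = raw.outputImage else {
    fatalError("no output image")
}

// Render to half-float in an extended linear color space so values above 1.0
// survive the readback instead of being clipped.
let context = CIContext(options: [.workingFormat: CIFormat.RGBAh])
let space = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)!
let cgImage = context.createCGImage(output,
                                    from: output.extent,
                                    format: .RGBAh,
                                    colorSpace: space)
print(cgImage == nil ? "render failed" : "rendered \(output.extent)")
```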
Is there an explanation for this behavior? My current theory is that CIRAWFilter picks a default exposure ramp that produces pleasing results on an SDR display, similar to how it applies a default boost curve.
What does this exposure ramp look like?