CIRAWFilter And Values Greater Than 1.0

I'm currently trying to extract the pure linear data from a decoded RAW, without any tone curve applied. I'm using the `linearSpaceFilter` key in CIRAWFilter, and finding that in scenes with significant dynamic range, the brighter values (e.g. clouds) exceed 1.5 when sampled inside the linear-space kernel.
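
Roughly, the setup looks like this (a minimal sketch assuming the modern CIRAWFilter class, which exposes `linearSpaceFilter` as a property rather than an options key; the wrapper function and its parameters are placeholders):

```swift
import CoreImage

// Sketch of the setup described above, assuming the modern CIRAWFilter class
// (macOS 12 / iOS 15+). The function and its parameters are placeholders.
func linearOutput(rawURL: URL, inspectionFilter: CIFilter) -> CIImage? {
    guard let rawFilter = CIRAWFilter(imageURL: rawURL) else { return nil }

    // The filter assigned here receives the decoded, scene-linear pixels as its
    // input, before the tone curve / boost is applied.
    rawFilter.linearSpaceFilter = inspectionFilter

    return rawFilter.outputImage
}
```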


Is there an explanation for this behavior? My current theory is that CIRAWFilter picks a default exposure ramp that produces pleasing results on an SDR display, for the same reason it applies a default boost curve.


What does this exposure ramp look like?

Replies

What does "exposure ramp" mean outside the context of time lapses? Are you referring to the gamma curve?

When you load your RAW file with CIRAWFilter, try setting boostAmount to 0 (it's 1 by default). In my experience, this gives a linear response when changing exposure.

You can still get values greater than 1 if, for example, you increase the exposure enough. Try setting baselineExposure and exposure to 0.
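
For example (a minimal sketch assuming the modern CIRAWFilter class; the property names are real, the wrapper function is a placeholder):

```swift
import CoreImage

// Minimal sketch, assuming the modern CIRAWFilter class (macOS 12 / iOS 15+).
func linearImage(fromRAWAt rawURL: URL) -> CIImage? {
    guard let rawFilter = CIRAWFilter(imageURL: rawURL) else { return nil }

    rawFilter.boostAmount = 0      // disable the default boost tone curve (1 by default)
    rawFilter.baselineExposure = 0 // ignore the baseline exposure from the file's metadata
    rawFilter.exposure = 0         // no additional exposure adjustment

    return rawFilter.outputImage
}
```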

In this era of EDR and HDR, I wouldn't consider values greater than 1 to be anything abnormal. If you don't want values greater than 1, you can clamp or otherwise map them into [0, 1] when rendering. I suggest you avoid doing any kind of color value clipping inside your Core Image processing pipeline itself (i.e., in CIFilters and custom kernels).
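
For example, something like this applied only at the very end, just before rendering (a sketch; CIColorClamp is a built-in filter, the wrapper function is a placeholder):

```swift
import CoreImage

// Sketch: clamp to [0, 1] only right before rendering, so the rest of the
// pipeline keeps the extended-range values.
func clampedForDisplay(_ image: CIImage) -> CIImage {
    image.applyingFilter("CIColorClamp", parameters: [
        "inputMinComponents": CIVector(x: 0, y: 0, z: 0, w: 0),
        "inputMaxComponents": CIVector(x: 1, y: 1, z: 1, w: 1)
    ])
}
```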

Does this answer your question?