According to the Core Image documentation, the following API should return the RAW image's native output size.
let filter = CIFilter(imageURL: url, options: nil)
let value = filter.value(forKey: CIRAWFilterOption.outputNativeSize.rawValue) as? CIVector
However, this seems to always return the camera sensor's native resolution rather than the actual image size contained in the RAW file.
For example, if I use this API on a RAW file that was shot at a 16:9 aspect ratio on a Sony 19, the image size should be 6000 x 3376, but this API call returns 6000 x 4000.
Is this a bug, or am I missing something? Is there another API call to get the actual image size?
Note that the EXIF data does contain the correct image size.
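Since the metadata in the file is correct, one possible workaround is to read the pixel dimensions through ImageIO instead of the RAW filter. This is a sketch, not a confirmed fix for the `outputNativeSize` behavior; it assumes `url` points at the RAW file and that the first image index carries the recorded dimensions:

```swift
import Foundation
import CoreGraphics
import ImageIO

// Sketch: read the pixel dimensions stored in the file's metadata
// via CGImageSource, bypassing the CIRAWFilterOption query.
func imagePixelSize(at url: URL) -> CGSize? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let width = props[kCGImagePropertyPixelWidth] as? Int,
          let height = props[kCGImagePropertyPixelHeight] as? Int
    else { return nil }
    return CGSize(width: width, height: height)
}
```

In my testing the ImageIO properties reflect what the EXIF reports, but I would still like to know whether the Core Image behavior is intended.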