What is the correct way to get a RAW file's image size?

According to the Core Image documentation, the following API should return the RAW image's native output size.

import CoreImage

let filter = CIFilter(imageURL: url, options: nil)
let value = filter.value(forKey: CIRAWFilterOption.outputNativeSize.rawValue) as? CIVector

However, this seems to always return the camera's native resolution rather than the actual image size contained in the RAW file.

For example, if I use this API on a RAW file that was shot at a 16:9 aspect ratio on a Sony 19, the image size should be 6000 x 3376, but this API call returns 6000 x 4000.
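
For reference, here is a minimal sketch of how I am checking this, assuming `url` points at the RAW file in question; it prints both the reported native size and the extent of the CIImage the filter actually produces, and both come back as the full sensor dimensions rather than the in-camera crop.

import CoreImage

// Sketch only: `url` is assumed to point at the RAW file being inspected.
let rawFilter = CIFilter(imageURL: url, options: nil)

// Reported native output size, returned as a (width, height) CIVector.
if let nativeSize = rawFilter.value(forKey: CIRAWFilterOption.outputNativeSize.rawValue) as? CIVector {
    print("outputNativeSize:", nativeSize.x, "x", nativeSize.y)
}

// Extent of the image the filter actually decodes.
if let output = rawFilter.outputImage {
    print("outputImage extent:", output.extent.width, "x", output.extent.height)
}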

Is this a bug, or am I missing something? Is there another API call to get the actual image size?

Note that the EXIF data does contain the correct image size.
It seems that the RAW file contains the image at the camera's native size (the sensor size), while the embedded thumbnail is set to the aspect ratio setting that was used on the camera. So when querying the RAW file's thumbnail you get the thumbnail stored by the camera, but when getting the CIImage via CIFilter you get the full image, i.e. the native camera sensor size.
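
To confirm the thumbnail side of this, here is a minimal sketch that asks ImageIO for only the camera-embedded thumbnail (again assuming the same `url`); in my case its dimensions reflect the in-camera aspect ratio rather than the sensor size, though how large the embedded preview is will vary by camera and RAW format.

import ImageIO

if let source = CGImageSourceCreateWithURL(url as CFURL, nil) {
    // Only use a thumbnail the camera embedded; do not synthesize one from the full image.
    let options = [kCGImageSourceCreateThumbnailFromImageIfAbsent: false] as CFDictionary
    if let thumbnail = CGImageSourceCreateThumbnailAtIndex(source, 0, options) {
        print("embedded thumbnail:", thumbnail.width, "x", thumbnail.height)
    }
}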

There seems to be no RAW API to get the aspect ratio set on the camera, but this data does seem to exist in the EXIF data as XDimension and YDimension.
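
In the meantime I am reading the cropped size out of the EXIF dictionary with ImageIO; a minimal sketch, assuming the XDimension/YDimension values mentioned above are the standard EXIF PixelXDimension/PixelYDimension tags:

import ImageIO

if let source = CGImageSourceCreateWithURL(url as CFURL, nil),
   let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
   let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any],
   let width = exif[kCGImagePropertyExifPixelXDimension] as? Int,
   let height = exif[kCGImagePropertyExifPixelYDimension] as? Int {
    // For the 16:9 example above this should give 6000 x 3376 rather than 6000 x 4000.
    print("EXIF image size:", width, "x", height)
}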