Hi
I’ve sort of asked this before but got no response. I’ve been on this now for two days and am getting nowhere.
Basically, as a quick test, I added some code to capture the photo data just before the "didFinishCaptureFor" delegate method writes the image to the Photos library. Then I added a button on the camera view controller that loads that data into an image view (pv) on the same view controller, as follows:
// Rebuild a CGImage from the captured JPEG data and show it in the image view.
let dataProvider = CGDataProvider(data: photoTest.photo! as CFData)
let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
let photo = UIImage(cgImage: cgImageRef) // no orientation is passed here, so .up is assumed
pv.contentMode = .scaleAspectFill
pv.image = photo
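For comparison, here is a minimal sketch of the same preview step that keeps the orientation, assuming `photoTest.photo` holds the Data returned by AVCapturePhoto's fileDataRepresentation(). UIImage(data:) reads the EXIF orientation tag embedded in the JPEG container, whereas the CGImage(jpegDataProviderSource:...) path decodes raw pixels and discards it:

```swift
import UIKit

// Sketch: preview captured photo data while honouring the embedded EXIF
// orientation. `data` is assumed to be fileDataRepresentation() output;
// UIImage(data:) applies the orientation tag at decode time.
func previewImage(from data: Data, in imageView: UIImageView) {
    guard let photo = UIImage(data: data) else { return } // bails on invalid data
    imageView.contentMode = .scaleAspectFill
    imageView.image = photo
}
```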
The image that is saved to the Photos library is correctly oriented, while the one previewed is not.
I understand how the image is captured and why the orientation behaves this way, and I've read many posts explaining it, but I can't find a working example where the image is either previewed as above or saved to the app's own folder rather than the Photos library.
This post https://forums.developer.apple.com/thread/102846 has a response by bford that states “When you write the buffer to any of the standard still image containers (JPEG, HEIC, DNG, TIFF, etc), the image is rotated and/or flipped appropriately at display time.”.
If that is the case then I'm clearly missing some code, as I haven't modified the AVFoundation capture code, only added a preview, yet the image still isn't displayed properly.
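In case it helps anyone else looking at this, here is a sketch of how the orientation bford mentions can be read back out of the container and reapplied when the CGImage route is unavoidable. The name-for-name mapping between CGImagePropertyOrientation and UIImage.Orientation below follows Apple's documented correspondence (the raw values of the two enums differ, so a direct cast won't work); `jpegData` is assumed to be fileDataRepresentation() output:

```swift
import UIKit
import ImageIO

// Sketch: decode a CGImage and reapply the EXIF orientation stored in the
// JPEG container, instead of letting UIImage(cgImage:) default to .up.
func orientedImage(from jpegData: Data) -> UIImage? {
    guard let source = CGImageSourceCreateWithData(jpegData as CFData, nil),
          let cgImage = CGImageSourceCreateImageAtIndex(source, 0, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
    else { return nil }

    // kCGImagePropertyOrientation holds the EXIF orientation (1...8).
    let exif = (props[kCGImagePropertyOrientation] as? UInt32)
        .flatMap(CGImagePropertyOrientation.init) ?? .up

    let uiOrientation: UIImage.Orientation
    switch exif {
    case .up:            uiOrientation = .up
    case .upMirrored:    uiOrientation = .upMirrored
    case .down:          uiOrientation = .down
    case .downMirrored:  uiOrientation = .downMirrored
    case .left:          uiOrientation = .left
    case .leftMirrored:  uiOrientation = .leftMirrored
    case .right:         uiOrientation = .right
    case .rightMirrored: uiOrientation = .rightMirrored
    }
    return UIImage(cgImage: cgImage, scale: 1.0, orientation: uiOrientation)
}
```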
Note that the only reason I need custom camera code is that I need to record the location and the direction the camera was pointing when the photo was taken. The image picker doesn't return location data, and ultimately I don't want to have to tap "Use Photo" each time; I just want every shot saved automatically.
Thanks in advance