FB15339086
The app must also have an active AVCaptureEventInteraction in order for the Camera Control to work. This allows presses of the Camera Control to initiate capture.
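A minimal sketch of wiring that up, assuming a UIKit view controller and a hypothetical capture pipeline with a capturePhoto() method (the pipeline type is a placeholder, not part of the actual API):

```swift
import AVKit
import UIKit

final class CameraViewController: UIViewController {

    // Hypothetical capture pipeline; capturePhoto() would call
    // AVCapturePhotoOutput.capturePhoto(with:delegate:) internally.
    private let camera = CameraPipeline()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Attach an AVCaptureEventInteraction so hardware capture events
        // (including presses of the Camera Control) can trigger a capture.
        let captureInteraction = AVCaptureEventInteraction { [weak self] event in
            // Capture on release of the press.
            if event.phase == .ended {
                self?.camera.capturePhoto()
            }
        }
        captureInteraction.isEnabled = true
        view.addInteraction(captureInteraction)
    }
}

final class CameraPipeline {
    func capturePhoto() { /* trigger AVCapturePhotoOutput here */ }
}
```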
Also be aware that captureSession.supportsControls seems to crash (unrecognized selector sent to instance) on some devices that don't have a Camera Control. Very helpful.
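Until that's fixed, one possible workaround (an assumption on my part, not something I've verified on every affected device) is to check that the selector actually exists before reading the property:

```swift
import AVFoundation

// Workaround sketch: only read supportsControls when the runtime actually
// responds to the selector, to avoid the unrecognized-selector crash seen
// on devices without a Camera Control.
func sessionSupportsControls(_ session: AVCaptureSession) -> Bool {
    guard #available(iOS 18.0, *),
          session.responds(to: NSSelectorFromString("supportsControls")) else {
        return false
    }
    return session.supportsControls
}
```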
Unfortunately, this is not currently possible. See the discussion here.
@gchiste I looked into this some more today and was able to create a reproducible example with a few modifications to the AVCam sample code. I attached the example project to my report (FB13683293).
The key factor for reproduction (apart from the manual configuration described in my original post) is to have an AVCaptureVideoDataOutput attached to the session. On iOS 17.3, this makes no difference and 48 MP images are captured regardless of the presence or absence of the AVCaptureVideoDataOutput. On iOS 17.4, the presence of the video data output causes photos to revert to 12 MP, and removing the AVCaptureVideoDataOutput allows 48 MP capture once again.
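For anyone else hitting this, here's a rough sketch of the relevant part of the configuration (the names and the delegate are placeholders from an AVCam-style setup, not the exact code in the attached project):

```swift
import AVFoundation

func configureSession(_ session: AVCaptureSession,
                      device: AVCaptureDevice,
                      videoDelegate: AVCaptureVideoDataOutputSampleBufferDelegate) throws {
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }

    let photoOutput = AVCapturePhotoOutput()
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

    // Opt in to the largest photo dimensions the active format supports
    // (48 MP on the main camera of recent devices).
    if let largest = device.activeFormat.supportedMaxPhotoDimensions
        .max(by: { $0.width * $0.height < $1.width * $1.height }) {
        photoOutput.maxPhotoDimensions = largest
    }

    // This output is the trigger: on iOS 17.4 its mere presence drops
    // captures back to 12 MP; remove it and 48 MP capture works again.
    let videoDataOutput = AVCaptureVideoDataOutput()
    videoDataOutput.setSampleBufferDelegate(videoDelegate,
                                            queue: DispatchQueue(label: "video-data-output"))
    if session.canAddOutput(videoDataOutput) { session.addOutput(videoDataOutput) }
}
```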
At the very least we should have access to Xcode 15.0 in Xcode Cloud, but no, it has mysteriously disappeared and the only option is the broken Xcode 15.0.1.
Finally figured out the cause. Behind the scenes, iOS and macOS apply a strong tone curve to RAW images. Many default methods of loading and displaying an image (including those used in the stock apps) don't allow you to change this setting.
I imagine that this tone curve is appropriate for ProRAW images, but breaks earlier RAW formats. Unfortunate that it's applied indiscriminately in most Apple software.
However, there is a way to display these RAW images properly in your own apps. Just load the data into a CIRAWFilter and set its boostAmount to 0.
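A minimal sketch, assuming you already have the RAW file's data in memory:

```swift
import CoreImage

// Load RAW data through CIRAWFilter and disable the default boost so the
// strong system tone curve is not baked into the rendered image.
func linearImage(fromRAWData data: Data) -> CIImage? {
    guard let rawFilter = CIRAWFilter(imageData: data, identifierHint: nil) else {
        return nil
    }
    rawFilter.boostAmount = 0   // 0 = no boost/tone curve, 1 = default rendering
    return rawFilter.outputImage
}
```

From there you can render the resulting CIImage with a CIContext and display it however you like.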
No, this is a question about developing software. I am trying to see if there is a way to use AVCapturePhotoOutput to capture RAW photos in a way that isn't misinterpreted by Photos and Preview.
Thank you for the context, @galad87!
I did suspect issues with the presentation timestamp, but unfortunately have been unable to recreate the issue by artificially messing with the timestamps. The relevant method is append(_ pixelBuffer: CVPixelBuffer, withPresentationTime presentationTime: CMTime) on AVAssetWriterInputPixelBufferAdaptor.
From my experimentation, artificially delaying a timestamp simply delays that frame in the footage (somewhat obvious), while setting a timestamp in the past causes the writer to fail, since AVAssetWriter expects frames in presentation order. I have not been able to find a combination of timestamps that forces the reordering behavior described earlier. Something could definitely still be wrong with our implementation, but I am starting to wonder if something is going on with AVAssetWriter internally.
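For reference, this is roughly what the experiment looked like (the frame source and the offset parameter are placeholders, not our production writer code):

```swift
import AVFoundation
import CoreVideo

// Append frames with an artificial timestamp offset. Shifting timestamps
// forward only delays the footage; handing the adaptor an out-of-order
// (earlier) timestamp makes append(_:withPresentationTime:) fail rather
// than reproducing the reordering described above.
func appendFrames(_ frames: [(buffer: CVPixelBuffer, time: CMTime)],
                  to adaptor: AVAssetWriterInputPixelBufferAdaptor,
                  shiftedBy offset: CMTime = .zero) {
    for frame in frames where adaptor.assetWriterInput.isReadyForMoreMediaData {
        let presentationTime = CMTimeAdd(frame.time, offset)
        if !adaptor.append(frame.buffer, withPresentationTime: presentationTime) {
            print("append failed at \(presentationTime.seconds)s")
            break
        }
    }
}
```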