Is it possible to use AVCaptureVideoDataOutput + AVCaptureMovieFileOutput

Hi,


I have a requirement to capture 2 video outputs


AVCaptureVideoDataOutput:

To capture the pixel buffers


AVCaptureMovieFileOutput:

To save the video to a file using the same capture session


Is it possible to do this?

The reason I ask is because I found this on StackOverflow:


https://stackoverflow.com/questions/3968879/simultaneous-avcapturevideodataoutput-and-avcapturemoviefileoutput


Also, do I need a live preview while capturing to a file?

Are there any other options for doing this?

Accepted Reply

You can't use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time. But you can use AVCaptureVideoDataOutput to analyze or modify the data, then use AVAssetWriter to write the frames to a file.
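A minimal sketch of the capture side of that approach, assuming one AVCaptureSession feeding an AVCaptureVideoDataOutput whose sample buffers you then hand to an AVAssetWriter yourself (class and queue names here are my own, not from any sample):

```swift
import AVFoundation

final class FrameRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let videoOutput = AVCaptureVideoDataOutput()
    // Delegate callbacks must arrive on a serial queue.
    let captureQueue = DispatchQueue(label: "capture.queue")

    func configure() throws {
        session.beginConfiguration()
        guard let camera = AVCaptureDevice.default(for: .video) else {
            throw NSError(domain: "FrameRecorder", code: -1)
        }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        // BGRA is convenient for Core Graphics / Vision processing.
        videoOutput.videoSettings =
            [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        videoOutput.setSampleBufferDelegate(self, queue: captureQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
        session.commitConfiguration()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Analyze or modify the pixel buffer here, then append the
        // sample buffer to an AVAssetWriterInput to record it.
    }
}
```

The key point is that the movie-file output is replaced entirely: the same delegate callback that gives you pixel buffers for analysis is also your source of frames for recording.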

Replies

Thank you, that's exactly what I did.

Hi, I want to capture video using AVCaptureVideoDataOutput and then, like you, use AVAssetWriter to write the frames to a file. I'm very new to the AVFoundation framework and can't find much help online. Can you give me any pointers on how you did it? Currently I've set up my AVCaptureSession and have a live preview displaying. I just need to find out how to get the AVAssetWriter working to save the frames to a file.

Does this sample code help?


https://developer.apple.com/library/archive/samplecode/AVLocationPlayer/Introduction/Intro.html


Although the point of this sample is the location metadata, it also shows how to get the audio and video sample buffers from the capture session into the asset writer.
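For the asset-writer side specifically, a hedged sketch of the usual pattern (names like `WriterSink`, `append`, and `outputURL` are illustrative, not from the sample code): call `append(_:)` from the video data output's delegate callback for each frame, and start the writer's session from the first frame's timestamp.

```swift
import AVFoundation

final class WriterSink {
    private let writer: AVAssetWriter
    private let writerInput: AVAssetWriterInput
    private var started = false

    init(outputURL: URL, width: Int, height: Int) throws {
        writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        // Required when feeding frames from a live capture source.
        writerInput.expectsMediaDataInRealTime = true
        writer.add(writerInput)
    }

    func append(_ sampleBuffer: CMSampleBuffer) {
        if !started {
            writer.startWriting()
            // Anchor the movie's timeline to the first frame's timestamp.
            writer.startSession(atSourceTime:
                CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            started = true
        }
        if writerInput.isReadyForMoreMediaData {
            writerInput.append(sampleBuffer)
        }
    }

    func finish(completion: @escaping () -> Void) {
        writerInput.markAsFinished()
        writer.finishWriting(completionHandler: completion)
    }
}
```

Checking `isReadyForMoreMediaData` before each append matters: with `expectsMediaDataInRealTime` set, frames that arrive while the writer is busy should simply be dropped rather than queued.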

Hi, thanks for your help. I was able to do what I needed by using this article as a guide: https://geek-is-stupid.github.io/2017-06-14-how-to-record-detect-faces-overlay-video-at-real-time-using-swift/