Recording 240 FPS video with filters

I use AVAssetWriter to record 240 FPS video for slow motion, but the frame rate gradually drops and eventually settles at about 60 FPS. Why?

How can I record 240 FPS video with filters? The timestamp of every frame received in my AVCaptureVideoDataOutputSampleBufferDelegate is shown below. Note that the intervals start at about 4.2 ms (≈240 fps) and then jump to about 16.7 ms (≈60 fps):


{136730920270041/1000000000 = 136730.920}
{136730924439541/1000000000 = 136730.924}
{136730928608833/1000000000 = 136730.929}
{136730932778416/1000000000 = 136730.933}
{136730936947791/1000000000 = 136730.937}
{136730941117166/1000000000 = 136730.941}
{136730945286666/1000000000 = 136730.945}
{136730949456125/1000000000 = 136730.949}
{136730953625416/1000000000 = 136730.954}
{136730957794958/1000000000 = 136730.958}
{136730961964541/1000000000 = 136730.962}
{136730966133833/1000000000 = 136730.966}
{136730970303416/1000000000 = 136730.970}
{136730974472833/1000000000 = 136730.974}
{136731191283708/1000000000 = 136731.191}
{136731207961375/1000000000 = 136731.208}
{136731278841791/1000000000 = 136731.279}
{136731295519625/1000000000 = 136731.296}
{136731316366750/1000000000 = 136731.316}
{136731333044749/1000000000 = 136731.333}
{136731349722290/1000000000 = 136731.350}
{136731366400207/1000000000 = 136731.366}
{136731383077916/1000000000 = 136731.383}
{136731399755750/1000000000 = 136731.400}
{136731416433374/1000000000 = 136731.416}
{136731433111082/1000000000 = 136731.433}
{136731449789000/1000000000 = 136731.450}
{136731466466583/1000000000 = 136731.466}
{136731483144540/1000000000 = 136731.483}
{136731499822207/1000000000 = 136731.500}
{136731516499999/1000000000 = 136731.516}
{136731533177666/1000000000 = 136731.533}
{136731549855500/1000000000 = 136731.550}
{136731566533374/1000000000 = 136731.567}
{136731583211040/1000000000 = 136731.583}
{136731599888833/1000000000 = 136731.600}
{136731616566500/1000000000 = 136731.617}
{136731633244374/1000000000 = 136731.633}


You probably need to post more details (e.g. the contents of your delegate callback, how you set up your AVCaptureSession) in order to get any meaningful feedback.


Did you check if any of the samples are being dropped?

Did you enable the alwaysDiscardsLateVideoFrames property on your AVCaptureOutput?
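You can also implement the optional didDropSampleBuffer: delegate callback to find out whether (and why) frames are being discarded. A minimal sketch; the method is part of AVCaptureVideoDataOutputSampleBufferDelegate and the DroppedFrameReason attachment comes from CoreMedia:

// Invoked whenever the output discards a frame (for example because
// alwaysDiscardsLateVideoFrames is YES or the buffer pool is exhausted).
- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // The attachment explains why the frame was dropped
    // (e.g. the frame was late, or no buffers were available).
    CFTypeRef reason = CMGetAttachment(sampleBuffer,
                                       kCMSampleBufferAttachmentKey_DroppedFrameReason,
                                       NULL);
    NSLog(@"Dropped frame, reason = %@", (__bridge id)reason);
}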


As for the timestamps, I'd suggest using CMSampleBufferGetPresentationTimeStamp(sampleBuffer):


// Log the presentation timestamp of each frame as it arrives.
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"timestamp = %f", CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)));
}

Turn off your filtering code and just write to AVAssetWriter. Do you get 240 fps? If so, your filtering code is too slow. Also, make sure alwaysDiscardsLateVideoFrames is NO; you may see some lag if you're previewing your filtered frames, but this is necessary if you're going to write them to the asset writer.
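As an aside, to get 240 fps out of the camera at all, the device has to be switched to an AVCaptureDeviceFormat whose frame-rate ranges support it and have its frame duration locked. A minimal sketch of that configuration step (the method name configureDeviceFor240FPS: is hypothetical, not from this thread):

// Sketch: pick the first format that supports 240 fps and lock the
// frame duration to it. Assumes `device` is your active AVCaptureDevice.
- (BOOL)configureDeviceFor240FPS:(AVCaptureDevice *)device {
    for (AVCaptureDeviceFormat *format in device.formats) {
        for (AVFrameRateRange *range in format.videoSupportedFrameRateRanges) {
            if (range.maxFrameRate >= 240.0) {
                NSError *error = nil;
                if ([device lockForConfiguration:&error]) {
                    device.activeFormat = format;
                    device.activeVideoMinFrameDuration = CMTimeMake(1, 240);
                    device.activeVideoMaxFrameDuration = CMTimeMake(1, 240);
                    [device unlockForConfiguration];
                    return YES;
                }
            }
        }
    }
    return NO; // no 240 fps format available on this device
}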

I've already disabled the filter code, but that didn't work either. I wasn't even recording, just previewing the view, and I also disabled the audio output code.

I've already set alwaysDiscardsLateVideoFrames to NO, and I print the timestamp at the beginning of the AVCaptureVideoDataOutputSampleBufferDelegate callback.


The preview is a subclass of GLKView with a setImage: method that renders the image:

PreviewView : GLKView



- (void)setImage:(CIImage *)sourceImage {
    [self bindDrawable];
    if (!self.isHighFPS) {
        // Filtered path: run the CIFilter and draw a center-cropped result.
        [self.filter setValue:sourceImage forKey:kCIInputImageKey];
        CIImage *filteredImage = self.filter.outputImage;
        if (filteredImage) {
            CGRect cropRect =
                THCenterCropImageRect(sourceImage.extent, self.drawableBounds);
            [self.coreImageContext drawImage:filteredImage
                                      inRect:self.drawableBounds
                                    fromRect:cropRect];
        }
    } else {
        // High-FPS path: draw the unfiltered image directly.
        [self.coreImageContext drawImage:sourceImage
                                  inRect:self.drawableBounds
                                fromRect:sourceImage.extent];
    }
    [self display];
    if (!self.isHighFPS) {
        [self.filter setValue:nil forKey:kCIInputImageKey];
    }
    [self deleteDrawable];
}
The Camera class is the AVCaptureVideoDataOutputSampleBufferDelegate:

Camera : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
- (BOOL)setupSessionOutputs:(NSError **)error {
    // Deliver BGRA frames and never drop late frames, since every
    // frame needs to reach the asset writer.
    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    NSDictionary *outputSettings =
        @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    self.videoDataOutput.videoSettings = outputSettings;
    self.videoDataOutput.alwaysDiscardsLateVideoFrames = NO;
    [self.videoDataOutput setSampleBufferDelegate:self
                                            queue:self.dispatchQueue];
    if ([self.captureSession canAddOutput:self.videoDataOutput]) {
        [self.captureSession addOutput:self.videoDataOutput];
    } else {
        return NO;
    }
    NSString *fileType = AVFileTypeQuickTimeMovie;
    NSDictionary *videoSettings =
        [self.videoDataOutput
            recommendedVideoSettingsForAssetWriterWithOutputFileType:fileType];
    NSDictionary *audioSettings =
        [self.audioDataOutput
            recommendedAudioSettingsForAssetWriterWithOutputFileType:fileType];
    self.movieWriter =
        [[THMovieWriter alloc] initWithVideoSettings:videoSettings
                                       audioSettings:audioSettings
                                       dispatchQueue:self.dispatchQueue];
    self.movieWriter.delegate = self;
    return YES;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CMTime timeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CMTimeShow(timeStamp);
    [self.movieWriter processSampleBuffer:sampleBuffer];
    if (captureOutput == self.videoDataOutput) {
        CVPixelBufferRef imageBuffer =
            CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *sourceImage =
            [CIImage imageWithCVPixelBuffer:imageBuffer options:nil];
        // imageTarget is an instance of PreviewView
        [self.imageTarget setImage:sourceImage];
    }
}

I don't think that's the most efficient way to display a preview image. Are you calling setImage: on every sample buffer update (i.e., at 240 Hz)? If so, remember that the display refresh rate is only 60 Hz, so most of your drawing commands are wasted. Why not just use AVCaptureVideoPreviewLayer instead of your custom PreviewView?
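One way to act on this, sketched below, is to keep feeding every buffer to the movie writer but throttle the preview drawing to roughly the display rate. This is a hypothetical variant of the poster's callback, not code from the thread; _lastDrawTime is an assumed CFTimeInterval ivar:

// Sketch: throttle preview drawing to ~60 Hz while still processing
// every sample buffer at 240 fps. Requires QuartzCore for CACurrentMediaTime.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    [self.movieWriter processSampleBuffer:sampleBuffer]; // every frame

    CFTimeInterval now = CACurrentMediaTime();
    if (now - _lastDrawTime < (1.0 / 60.0)) {
        return; // skip the preview update; the display can't show it anyway
    }
    _lastDrawTime = now;

    CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *sourceImage = [CIImage imageWithCVPixelBuffer:imageBuffer options:nil];
    [self.imageTarget setImage:sourceImage];
}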

Turn off everything in your captureOutput:didOutputSampleBuffer:fromConnection: callback. Just log the timestamps. You should be getting the full frame rate. I suspect your preview code is too slow. There are many samples online that show faster ways to draw the preview (RosyWriter, VideoSnake, etc.).

You are right. After I turned off everything, I got 240 FPS data. Thank you so much!

Oh, thank you for your advice; I'd forgotten about the refresh rate. I'm building a real-time camera with filters, so maybe I should build a custom preview of my own. Can AVCaptureVideoPreviewLayer preview with CIFilters?

Accepted Answer (thomas.pintaric)

No, AVCaptureVideoPreviewLayer (through its base class CALayer) can only use CoreImage filters on OSX, but not on iOS. If you need to filter your real-time preview, you'll have to stick with AVCaptureVideoDataOutput.

Hello, I have run into the same issue mentioned above.
I'm wondering how to fetch the data generated at 240 fps and save it to memory.
If it is convenient, could you please share your corresponding code?
Thank you for your time and consideration.
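For later readers: rather than accumulating 240 fps frames in memory, the usual pattern is to append each sample buffer to an AVAssetWriterInput as it arrives. A rough sketch, assuming an AVAssetWriter (self.writer) and an AVAssetWriterInput (self.writerInput, with expectsMediaDataInRealTime = YES) have been created and started elsewhere; these property names are illustrative:

// Sketch: append one incoming video sample buffer to the asset writer.
// Assumes startWriting / startSessionAtSourceTime: were already called.
- (void)appendVideoSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    if (self.writer.status != AVAssetWriterStatusWriting) {
        return;
    }
    if (self.writerInput.isReadyForMoreMediaData) {
        if (![self.writerInput appendSampleBuffer:sampleBuffer]) {
            NSLog(@"append failed: %@", self.writer.error);
        }
    } else {
        // At 240 fps the input can fall behind; dropping here is
        // preferable to blocking the capture queue.
        NSLog(@"writer input not ready, dropping frame");
    }
}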