How to get frame times and/or step through video?

For evaluating physical phenomena recorded on video, we need to know the exact timestamps of the recorded frames and to be able to step through the video frame by frame.

Converting from frame number to timestamp using the frame rate did not work, because in many videos recorded with iOS cameras the frame rate jitters slightly. Seeking with seekToTime and equivalent methods therefore caused skipping or doubling of frames.
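For illustration, here is a minimal sketch of that naive mapping (the 30 fps value, the file name and the player variable are assumptions for the example, not taken from the actual recordings):

import AVFoundation

let player = AVPlayer(URL: NSURL(fileURLWithPath: "YourVideofile.mov"))

// Naive mapping: frame index -> timestamp via a nominal, constant frame rate.
// With a jittery (variable) frame rate the real timestamps drift off this grid,
// so exact seeks to these times can skip or duplicate frames.
let nominalFrameRate = 30.0
let frameNumber = 120
let assumedTime = CMTimeMakeWithSeconds(Double(frameNumber) / nominalFrameRate, 600)
player.seekToTime(assumedTime, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)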


How is it possible to get the exact timestamps of every frame in a video using AVFoundation, and to step/seek through a video on a per-frame basis?


AVPlayerItem seems to be the only way I have found so far to step on a per-frame basis, but is it the right choice?
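For reference, per-frame stepping with AVPlayerItem looks roughly like this (just a sketch; the player setup is assumed, and whether one step really corresponds to exactly one recorded frame is precisely what is unclear):

import AVFoundation

let player = AVPlayer(URL: NSURL(fileURLWithPath: "YourVideofile.mov"))

// Step one frame forward on the current item; negative counts step backwards.
if let item = player.currentItem where item.canStepForward {
    item.stepByCount(1)
}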

Is there any other method?


Perhaps you can use AVSampleCursor? See Technote 2404.
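Roughly, a minimal sketch of what that would look like (an assumption on my part, not from the technote; the asset is set up as in the other snippets):

import AVFoundation

let asset = AVAsset(URL: NSURL(fileURLWithPath: "YourVideofile.mov"))
let track = asset.tracksWithMediaType(AVMediaTypeVideo)[0]

// Walk the samples with a cursor and read the exact presentation timestamps.
if track.canProvideSampleCursors {
    if let cursor = track.makeSampleCursorAtFirstSampleInDecodeOrder() {
        repeat {
            print("time: \(CMTimeGetSeconds(cursor.presentationTimeStamp))")
        } while cursor.stepInPresentationOrderByCount(1) == 1
    }
}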

Looks interesting. Unfortunately, I need to do this on iOS, where AVSampleCursor does not seem to be available (yet).

Sorry for not mentioning that requirement earlier.

Is there really no way to do this on iOS?

Accepted Answer

Use AVAssetReader and step through with copyNextSampleBuffer. It will give you CMSampleBuffers that contain the exact time:


CMSampleBufferRef sampleBuffer = [asset_reader_output copyNextSampleBuffer];

CMTime presTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer);

double frameTime = CMTimeGetSeconds(presTime);

If somebody else has the same problem, here is the complete code for the solution:


import Foundation
import AVFoundation

let videoFile = NSURL(fileURLWithPath: "YourVideofile.mov")

let asset = AVAsset(URL:videoFile)
let track = asset.tracksWithMediaType(AVMediaTypeVideo)[0]
let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil) // nil gets original sample data without overhead for decompression
guard let reader = try? AVAssetReader(asset: asset) else {exit(1)}
output.alwaysCopiesSampleData = false // possibly prevents unnecessary copying?
reader.addOutput(output)
reader.startReading()

var frameNumber = 1
while reader.status == .Reading {
    if let sampleBuffer = output.copyNextSampleBuffer() where CMSampleBufferIsValid(sampleBuffer) && CMSampleBufferGetTotalSampleSize(sampleBuffer) != 0 {
        let frameTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer)
        if frameTime.isValid {
            print("frame: \(frameNumber), time: \(String(format: "%.3f", frameTime.seconds)), size: \(CMSampleBufferGetTotalSampleSize(sampleBuffer)), duration: \(CMSampleBufferGetOutputDuration(sampleBuffer).value)")
            frameNumber += 1
        }
    }
}

I have this running and it works great with standard 30 fps video. How would you change it to step through slow-motion video at 120 fps or 240 fps?

Since frames in H264 streams may be reordered, it seems necessary to sort the resulting array:


timestamps.sort { CMTimeCompare($0, $1) < 0 }
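For context, a minimal sketch of how such an array could be collected, reusing the reader and output from the accepted answer above and the same pre-Swift 3 syntax (the timestamps name is just an assumption; this part is not shown in the thread):

// Collect presentation timestamps in decode order while reading.
var timestamps = [CMTime]()
while reader.status == .Reading {
    if let sampleBuffer = output.copyNextSampleBuffer() where CMSampleBufferIsValid(sampleBuffer) {
        timestamps.append(CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer))
    }
}
// Decode order can differ from presentation order (e.g. B-frames), so sort by time.
let sortedTimestamps = timestamps.sort { CMTimeCompare($0, $1) < 0 }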

Is it possible to do a `copyNextSampleBuffer` from a particular index of my choice? The index seems to be hidden from the API. I want to use copyNextSampleBuffer because it's faster, but I also want to be able to start it from a specific index (even if the first frame after the jump is slow). This is for the scenario where a user moves the slider on my video player to a certain position and playback resumes from that point.
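There is no reply to this in the thread. One possible approach (an assumption on my part, not something confirmed here) is AVAssetReader's timeRange property: it must be set before startReading() is called, so jumping to a slider position would mean discarding the current reader and creating a fresh one that starts near the requested time, for example:

import AVFoundation

// Hypothetical helper: build a new reader whose output starts near startTime.
// AVAssetReader.timeRange has to be configured before startReading() is called.
func makeReaderStartingAt(startTime: CMTime, asset: AVAsset) throws -> (AVAssetReader, AVAssetReaderTrackOutput) {
    let track = asset.tracksWithMediaType(AVMediaTypeVideo)[0]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    let reader = try AVAssetReader(asset: asset)
    reader.addOutput(output)
    reader.timeRange = CMTimeRangeMake(startTime, kCMTimePositiveInfinity)
    reader.startReading()
    return (reader, output)
}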
