AVAssetWriter framerate slightly off

Hi, I'm writing an app that decodes a video, applies effects to each frame, then re-encodes to a new video. When I open both videos in FCPX to inspect them, my source video is listed at 29.97 frames per second, while my re-encoded video is listed at 30 frames per second (so one cannot be swapped for the other in an FCPX timeline). I'm using the exact frame timestamps from the source video when encoding them to the destination video, so why is it incorrectly creating a 30 fps video from a 29.97 fps video? Key source code below:


When reading/decoding a frame:

CMSampleBufferRef videoSampleBuffer = [assetReaderVideoOutput copyNextSampleBuffer];
if (videoSampleBuffer == NULL)
  return false;

// Capture this frame's exact presentation time as a value/timescale pair.
// (Note: copyNextSampleBuffer returns a retained buffer, so it should be
// CFRelease'd once you're done with it.)
CMSampleTimingInfo myTiming;
CMSampleBufferGetSampleTimingInfo(videoSampleBuffer, 0, &myTiming);
frameInfo->frameTimeValue = myTiming.presentationTimeStamp.value;
frameInfo->frameTimeScale = myTiming.presentationTimeStamp.timescale;


When writing/encoding the modified frame:

// Rebuild the CMTime from the saved value/timescale and hand it to the adaptor.
CMTime myTime = CMTimeMake(frameInfo->frameTimeValue, frameInfo->frameTimeScale);
[pixelBufferAdapter appendPixelBuffer:pixelsBuffer withPresentationTime:myTime];


Ideas?

Replies

These new Apple dev forums are awful. I did find the answer to my question and posted it on Stack Overflow: http://stackoverflow.com/questions/31980851/avassetwriter-frame-rate-slightly-off/32013350#32013350

It's unfortunate Mr. Jam feels this way; maybe next time the forums will be a bit more helpful. It is important, however, to discuss the solution here in this forum, since it may help others who run into similar issues.


The posted solution was to make sure to set the AVAssetWriterInput mediaTimeScale property to the desired timescale (in this case, 30000). Note that the default value for this property is 0, documented as: "The default value of this property is 0, which indicates that the writer input object should choose an appropriate value."


This "appropriate" value may *not* be what you want. If you want something specific, you must set this property. However, there is another setting that could be overlooked; it is noted in the documentation for the property above.


"In order to avoid inconsistencies between the track's media time scale and the result's media time scale both should be set to equal or compatible values."


The reference in the preceding paragraph is to the AVAssetWriter movieTimeScale property. It also has a default value of 0, which indicates that the writer will choose a convenient value. But as before, that "convenient" value may not be correct for your intended purpose.
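As a concrete sketch of setting both properties discussed above (hedged: `outputURL` and `videoSettings` are assumed to exist already, and the variable names are illustrative, not taken from the original post), targeting NTSC's customary 30000 timescale might look like:

```objc
#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAssetWriter *assetWriter =
    [AVAssetWriter assetWriterWithURL:outputURL
                             fileType:AVFileTypeQuickTimeMovie
                                error:&error];

// Match the source media: NTSC material is typically authored on a 30000
// timescale, where each frame lasts 1001 units (30000/1001 ≈ 29.97 fps).
assetWriter.movieTimeScale = 30000;

AVAssetWriterInput *videoWriterInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];

// Keep the track's media time scale compatible with the movie time scale,
// per the documentation note quoted above.
videoWriterInput.mediaTimeScale = 30000;

[assetWriter addInput:videoWriterInput];
```

With both values left at 0, the writer is free to pick a timescale on which the source's 1001-unit frame durations no longer land exactly, which is consistent with the 29.97 → 30 rounding the original poster observed.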


Finally, since this discussion also mentioned Final Cut Pro X, Q&A 1447 makes for some good reading:


Final Cut Pro X - Preferred Video Media Time Scales and Sample Durations

If you were getting anything other than 29.97 → 30.00 (even 29.97 → 29.98, or 30.01), I'd agree with the previous poster about mediaTimeScale. But 29.97 actual frames playing back at a nominal 30 fps is NTSC, a common standard, and I'm thinking the issue is that when you append your pixel buffers, you're appending all of them rather than dropping 0.03 frames every second. NTSC video has a frame rate of 29.97 fps, but the timecode counts at 30 fps, so it's a bit like the way leap years affect the number of days in February.

If the drop-frame vs. non-drop-frame stuff is new to you, there's some detailed reading about it in the FCP 7 manual: https://documentation.apple.com/en/finalcutpro/usermanual/index.html#chapter=D%26section=6%26tasks=true

Hi,

I'm already setting movieTimeScale and mediaTimeScale to 25000 in my case (25 fps), but the resulting video sometimes ends up with the wrong fps.

For example:

video 1 - 12 seconds - 23.6550 fps

video 2 - 10 seconds - 25.0406 fps

video 3 - 20 seconds - 25.0260 fps


The last two videos are perfect; the first one is slightly off. Do you know what could be causing this?