1 Reply
      Latest reply on Jul 11, 2019 10:22 PM by maark6000
      maark6000 Level 1 (0 points)

        Hi all... I'm building a macOS app that will let users gather data from certain sections of a QuickTime movie.  The user should be able to specify IN and OUT timecodes for the section of the video to examine, and the app should be able to seek to that area and do its work.


        I thought I was on to the way of doing this by creating a CMTimeRange...




            NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};

            AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoFileURL options:options];

            NSError *error;

            AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];

            if (error) {

                NSLog(@"Error: %@", error.localizedDescription);

            }


            AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];


            AVAssetReaderTrackOutput *trackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:nil];

            [assetReader addOutput:trackOutput];


            CMTime start = CMTimeMakeWithSeconds(_distanceFromVideoToStart, rate->fps); //this is the duration to be skipped

            CMTime duration = CMTimeMakeWithSeconds(_distanceToBeScanned, rate->fps); //this is the duration to be examined.

            CMTimeRange range = CMTimeRangeMake(start, duration);


            [assetReader setTimeRange:range];


            [assetReader startReading];

            CMSampleBufferRef sampleBuffer = NULL;

            while ((sampleBuffer = [trackOutput copyNextSampleBuffer])) {

                  //gathering video data

                  CFRelease(sampleBuffer); //copyNextSampleBuffer returns a retained buffer

             }


        This sort of works.  My main problem is that if the user enters a "start" time 10 seconds into the video, a frame-specific calculation needs to be made: if the video is 30fps we need to skip ahead 300 frames, but at 24fps only 240.  I thought that was what I was setting with CMTimeMakeWithSeconds... but perhaps I don't understand how this works?  My values (for a 30fps video) would be 300 for the first parameter and 30 for the timescale... my understanding is that the resulting time is A / B, so 300 / 30 is 10 seconds.  That doesn't seem to work; I'm having to divide my first parameter by the frame rate myself (300 / 30, 30) for it to even be close.


        The other problem as far as I can tell, is that at the moment I'm being returned 24fps values... as if they are a default setting.  So when I ask to seek 10 seconds in, it's really only seeking 8.08 seconds in, so the area being examined is incorrect.


        So, first off... is this the best way to be seeking in a .mov?  Next, how can I set the frame rate?  And lastly... what am I not understanding about this CMTimeRange? 


        Thanks for taking the time to help me.

        • Re: how to seek in a quicktime file?
          maark6000 Level 1 (0 points)

          Well, after much experimentation… I’ve figured it out.


          CMTimeRange doesn’t work here, and I’m not sure why; it’s too coarse and doesn’t always return the expected results.  The thing you really want as your guide through a QuickTime movie is either the PTS (presentationTimeStamp) or the DTS (decodeTimeStamp).  Probably the PTS.  And to seek, we will use AVSampleCursor — and just by mentioning AVSampleCursor we will be creating the eighth known mention of it in the history of this planet, according to Google.


          Let’s set up the preliminaries…


              NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};

              AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:yourVideoFileURL options:options];


              AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

              if (videoTrack.canProvideSampleCursors) {

                  NSLog(@"can provide sample cursors.");

              }


          Here is some fun code to return to you the native frame rate of your video file, should you desire it...


              double fps = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject].nominalFrameRate;

              NSLog(@"FPS from asset: %f", fps);




              CMTimeScale timeScale = videoTrack.naturalTimeScale;

              NSLog(@"natural time scale: %d", timeScale); //ooh, i do like my timeScales natural.
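
          And circling back to the original question: CMTimeMakeWithSeconds() takes a duration in seconds plus a timescale — not a frame count and a frame rate — which is why dividing the first parameter by fps "fixed" things.  A sketch of building a frame-accurate start time, assuming the fps and timeScale values from just above:


              // CMTimeMakeWithSeconds(seconds, timescale) wants SECONDS, not frames.
              // To skip N frames, convert frames -> seconds first:

              int64_t framesToSkip = 300;                        // e.g. 10 seconds of 30fps video

              double secondsToSkip = framesToSkip / (double)fps; // fps from nominalFrameRate above

              CMTime start = CMTimeMakeWithSeconds(secondsToSkip, timeScale); // timeScale = naturalTimeScale


          Using the track's naturalTimeScale (rather than the frame rate) as the timescale keeps the CMTime exactly representable in the movie's own units.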


          now let's make our AVSampleCursor...


              int64_t distanceToSeekInFrames = 300; //or whatever distance you wish to seek

              AVSampleCursor *cursor = [videoTrack makeSampleCursorAtFirstSampleInDecodeOrder]; //easiest option.

              [cursor stepInPresentationOrderByCount:distanceToSeekInFrames];


              CMTime pts = cursor.presentationTimeStamp;  //lets see where we are in the video file

              int64_t ptsNumber = pts.value;  //handy if you need to compare against a raw number.



          now we can query the AVSampleCursor for all kinds of information about the sample.  Notice... we are not using a sample buffer here.  Here's an example... (AHEM Apple Documentation writers!)


              CMTime sampleDuration = cursor.currentSampleDuration;

              NSLog(@"sample duration: %f seconds", CMTimeGetSeconds(sampleDuration));



          See the AVSampleCursor documentation page for the variety of inquiries you can make of each frame.  Now let's jog down to the next frame:


              [cursor stepInPresentationOrderByCount:1];  //again in frames... use negative numbers to move backward.

              pts = cursor.presentationTimeStamp;  //check that we did indeed move forward a frame.



          The one problem I was not able to solve was how AVSampleCursor and AVAssetReader might work together.  I don't know if it's possible... I never once saw an AVAssetReader function that could take a cursor as an argument.  So... go fish on that one.  I'd love to see something like sampleBuffer = [trackOutput copySampleBufferAtCursor:cursor]; but, alas, I'm dreaming.
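
          For what it's worth, there is one API that looks like it bridges the two worlds: AVSampleBufferGenerator, whose companion AVSampleBufferRequest can be initialized with a sample cursor.  I haven't verified this end to end, so treat it as a sketch built on the videoTrack/cursor variables from the snippets above:


              //Sketch only: AVSampleBufferGenerator can vend sample buffers starting at a cursor.

              AVSampleBufferGenerator *generator = [[AVSampleBufferGenerator alloc] initWithAsset:asset timebase:NULL]; //NULL timebase = buffers are created synchronously

              AVSampleBufferRequest *request = [[AVSampleBufferRequest alloc] initWithStartCursor:cursor];

              request.preferredMinSampleCount = 1; //we only want the frame under the cursor

              CMSampleBufferRef sampleBuffer = [generator createSampleBufferForRequest:request];

              if (sampleBuffer) {

                  //...gather your video data here, much as you would with AVAssetReader...

                  CFRelease(sampleBuffer); //createSampleBufferForRequest: returns a retained buffer

              }


          If that pans out, it would be the closest thing to the copySampleBufferAtCursor: method I was wishing for.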


          So, answering your own question in depth with actual code examples... that like bumps me up to level 1000 right???