PIP video composition use case using a custom compositor

I am following the sample code at

https://github.com/robovm/apple-ios-samples/tree/master/AVCustomEdit/AVCustomEdit

The stitching of videos and the transition work fine.

I want to update the code to implement a picture-in-picture (PIP) video use case, keeping the APLCustomVideoCompositionInstruction design.

Any guidance on this will be much appreciated.


I have updated the class with two new methods, setPassThroughTrackID:forTimeRange: and setTransitionWithSourceTrackIDs:forTimeRange:.

    - (void)setPassThroughTrackID:(CMPersistentTrackID)passthroughTrackID forTimeRange:(CMTimeRange)timeRange
    {
        _passthroughTrackID = passthroughTrackID;
        _requiredSourceTrackIDs = nil;
        _timeRange = timeRange;
        _containsTweening = FALSE;
        _enablePostProcessing = FALSE;
    }

    - (void)setTransitionWithSourceTrackIDs:(NSArray *)sourceTrackIDs forTimeRange:(CMTimeRange)timeRange
    {
        _requiredSourceTrackIDs = sourceTrackIDs;
        _passthroughTrackID = kCMPersistentTrackID_Invalid;
        _timeRange = timeRange;
        _containsTweening = TRUE;
        _enablePostProcessing = FALSE;
    }
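
For reference, here is a rough sketch of how the extended header might look with these setters (foregroundTrackID/backgroundTrackID already exist in the sample's instruction class; the exact property attributes shown here are an assumption):

    // Sketch of APLCustomVideoCompositionInstruction.h with the two setters added.
    // The protocol properties are assumed to be redeclared readwrite so the setters
    // above can assign their backing ivars, as in the AVCustomEdit sample.
    @interface APLCustomVideoCompositionInstruction : NSObject <AVVideoCompositionInstruction>

    // Required by the AVVideoCompositionInstruction protocol.
    @property CMPersistentTrackID passthroughTrackID;
    @property (nonatomic, retain) NSArray *requiredSourceTrackIDs;
    @property CMTimeRange timeRange;
    @property BOOL containsTweening;
    @property BOOL enablePostProcessing;

    // Used by the custom compositor to pick the two source frames.
    @property CMPersistentTrackID foregroundTrackID;
    @property CMPersistentTrackID backgroundTrackID;

    - (void)setPassThroughTrackID:(CMPersistentTrackID)passthroughTrackID forTimeRange:(CMTimeRange)timeRange;
    - (void)setTransitionWithSourceTrackIDs:(NSArray *)sourceTrackIDs forTimeRange:(CMTimeRange)timeRange;

    @end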

Here is sample code that creates a composition instruction for normal stitching:


  sourceClips = [[NSMutableArray alloc] initWithCapacity:2];
  // Build the list of AVAssets from the clip URLs.
  [sourceClips addObject:[AVAsset assetWithURL:pUrlString1]];
  [sourceClips addObject:[AVAsset assetWithURL:pUrlString2]];

  AVMutableComposition *composition = [AVMutableComposition composition];
  NSMutableArray* compositionVideoTracks = [NSMutableArray array];

  CMTimeRange *passThroughTimeRanges = (CMTimeRange*)alloca(sizeof(CMTimeRange) * 2);
  CMTimeRange *srcClipTimeRanges = (CMTimeRange*)alloca(sizeof(CMTimeRange) * 2);
  CMTime *dstClipTimeStamp = (CMTime *)alloca(sizeof(CMTime) * 2);

  int clipID = 0;
  passThroughTimeRanges[clipID] = CMTimeRangeMake(kCMTimeZero, CMTimeMake(4, 1));
  srcClipTimeRanges[clipID] = CMTimeRangeMake(kCMTimeZero, CMTimeMake(4, 1));
  dstClipTimeStamp[clipID] = CMTimeMake(0, 1);

  AVMutableCompositionTrack* track = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
  [compositionVideoTracks addObject:track];

  AVAsset *asset = [sourceClips objectAtIndex:clipID];
  AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

  NSError *error = nil;
  [compositionVideoTracks[clipID] insertTimeRange:srcClipTimeRanges[clipID] ofTrack:clipVideoTrack atTime:dstClipTimeStamp[clipID] error:&error];

  AVMutableCompositionTrack * track0 = compositionVideoTracks[clipID];
  APLCustomVideoCompositionInstruction *videoInstruction = [[APLCustomVideoCompositionInstruction alloc] init];
  [videoInstruction setPassThroughTrackID:track0.trackID forTimeRange:passThroughTimeRanges[clipID]];
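
For completeness, the instruction is then attached to an AVMutableVideoComposition that points at the custom compositor class. A minimal sketch, assuming the sample's APLCrossDissolveCompositor and placeholder render settings:

    // Wire the instruction into a video composition that uses the custom compositor.
    // APLCrossDissolveCompositor comes from the AVCustomEdit sample; the frame
    // duration and render size below are placeholder values.
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.customVideoCompositorClass = [APLCrossDissolveCompositor class];
    videoComposition.instructions = @[videoInstruction];
    videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps
    videoComposition.renderSize = CGSizeMake(1280.0, 720.0);

    // Hand both objects to a player item (or an AVAssetExportSession) for playback/export.
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    playerItem.videoComposition = videoComposition;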



For PIP, I tried giving the same time range to two video tracks, but no output is created.

I am hard-coding the durations to 4 seconds.


  sourceClips = [[NSMutableArray alloc] initWithCapacity:2];
  // Build the list of AVAssets from the clip URLs.
  [sourceClips addObject:[AVAsset assetWithURL:pUrlString1]];
  [sourceClips addObject:[AVAsset assetWithURL:pUrlString2]];

  AVMutableComposition *composition = [AVMutableComposition composition];
  NSMutableArray* compositionVideoTracks = [NSMutableArray array];

  CMTimeRange *passThroughTimeRanges = (CMTimeRange*)alloca(sizeof(CMTimeRange) * 2);
  CMTimeRange *srcClipTimeRanges = (CMTimeRange*)alloca(sizeof(CMTimeRange) * 2);
  CMTime *dstClipTimeStamp = (CMTime *)alloca(sizeof(CMTime) * 2);

  int clipID = 0;
  passThroughTimeRanges[clipID] = CMTimeRangeMake(kCMTimeZero, CMTimeMake(4, 1));
  srcClipTimeRanges[clipID] = CMTimeRangeMake(kCMTimeZero, CMTimeMake(4, 1));
  dstClipTimeStamp[clipID] = CMTimeMake(0, 1);

  AVMutableCompositionTrack* track = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
  [compositionVideoTracks addObject:track];

  AVAsset *asset = [sourceClips objectAtIndex:clipID];
  AVAssetTrack *clipVideoTrack1 = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

  NSError *error = nil;
  [compositionVideoTracks[clipID] insertTimeRange:srcClipTimeRanges[clipID] ofTrack:clipVideoTrack1 atTime:dstClipTimeStamp[clipID] error:&error];

  AVMutableCompositionTrack * track0 = compositionVideoTracks[clipID];
  APLCustomVideoCompositionInstruction *videoInstruction = [[APLCustomVideoCompositionInstruction alloc] init];

  // Trying to add the PIP track. track1 is declared outside the braces so it is
  // still in scope when the instruction is configured below.
  AVMutableCompositionTrack *track1 = nil;
  {
      clipID = 1;
      passThroughTimeRanges[clipID] = CMTimeRangeMake(kCMTimeZero, CMTimeMake(4, 1));
      srcClipTimeRanges[clipID] = CMTimeRangeMake(kCMTimeZero, CMTimeMake(4, 1));
      dstClipTimeStamp[clipID] = CMTimeMake(0, 1);

      AVMutableCompositionTrack *trackPip = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
      [compositionVideoTracks addObject:trackPip];
      AVAsset *pipAsset = [sourceClips objectAtIndex:clipID];
      AVAssetTrack *clipVideoTrack2 = [[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

      [compositionVideoTracks[clipID] insertTimeRange:srcClipTimeRanges[clipID] ofTrack:clipVideoTrack2 atTime:dstClipTimeStamp[clipID] error:&error];

      track1 = compositionVideoTracks[clipID];
  }
  videoInstruction.foregroundTrackID = track0.trackID;
  videoInstruction.backgroundTrackID = track1.trackID;
  [videoInstruction setTransitionWithSourceTrackIDs:@[[NSNumber numberWithInt:track0.trackID], [NSNumber numberWithInt:track1.trackID]] forTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(4, 1))];
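
For context on where the PIP layout itself would happen: the instruction only tells AVFoundation which source frames to deliver, so scaling and placing the inset has to be done by the custom compositor for each frame request. Below is a rough Core Image based sketch of that per-frame work (it replaces the sample's OpenGL renderer; treating the foreground track as the small inset, and the scale/offset values, are my assumptions):

    // Sketch: per-frame PIP compositing inside the custom compositor.
    - (void)renderPIPForRequest:(AVAsynchronousVideoCompositionRequest *)request
    {
        APLCustomVideoCompositionInstruction *instruction =
            (APLCustomVideoCompositionInstruction *)request.videoCompositionInstruction;

        // Both buffers are only delivered if their track IDs were listed in
        // requiredSourceTrackIDs for this instruction.
        CVPixelBufferRef backgroundBuffer = [request sourceFrameByTrackID:instruction.backgroundTrackID];
        CVPixelBufferRef foregroundBuffer = [request sourceFrameByTrackID:instruction.foregroundTrackID];
        CVPixelBufferRef outputBuffer = [request.renderContext newPixelBuffer];

        CIImage *background = [CIImage imageWithCVPixelBuffer:backgroundBuffer];
        CIImage *foreground = [CIImage imageWithCVPixelBuffer:foregroundBuffer];

        // Shrink the foreground to a quarter of its size and offset it from the corner.
        CGAffineTransform pipTransform =
            CGAffineTransformConcat(CGAffineTransformMakeScale(0.25, 0.25),
                                    CGAffineTransformMakeTranslation(40.0, 40.0));
        CIImage *inset = [foreground imageByApplyingTransform:pipTransform];

        // A CIContext should be created once and reused in real code.
        CIContext *ciContext = [CIContext contextWithOptions:nil];
        [ciContext render:[inset imageByCompositingOverImage:background] toCVPixelBuffer:outputBuffer];

        [request finishWithComposedVideoFrame:outputBuffer];
        CVBufferRelease(outputBuffer);
    }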