Synchronizing Live Video Across Multiple AVPlayers

Hello, I am currently trying to synchronize two or more AVPlayers, each with a different player item (camera angle), but I am unable to get all of the players to start playing at the same time. I am also working with live video, so I don't think I can use AVCompositions.


Current Procedure:

  1. Pause all players
  2. Seek all players to the desired time
  3. If all seeks completed, preroll all players
  4. If all prerolls completed, tell all players to play


Code:

- (void)syncAtTime:(CMTime)time {
    NSArray *players = self.players;
    const NSUInteger prerollsNeeded = players.count;
    __block NSUInteger prerollsTried = 0;
    __block NSUInteger prerollsCompleted = 0;
    void (^prerollHandler)(BOOL) = ^(BOOL complete) {
        prerollsTried++;
        if (complete) {
            prerollsCompleted++;
        }
        if (prerollsTried >= prerollsNeeded) {
            // all preroll completion handlers have run;
            // play all the players now that they SHOULD be ready
            if (prerollsCompleted >= prerollsNeeded) {
                for (AVPlayer *player in players) {
                    [player play];
                }
            }
        }
    };
    const NSUInteger seeksNeeded = players.count;
    __block NSUInteger seeksTried = 0;
    __block NSUInteger seeksCompleted = 0;
    void (^seekHandler)(BOOL) = ^(BOOL complete) {
        seeksTried++;
        if (complete) {
            seeksCompleted++;
        }
        if (seeksTried >= seeksNeeded) {
            // all seek completion handlers have run;
            // if all the seeks completed, preroll and play
            if (seeksCompleted >= seeksNeeded) { // fixed: was the undeclared `seeksComplete`
                for (AVPlayer *player in players) {
                    [player prerollAtRate:1.0 completionHandler:prerollHandler];
                }
            }
        }
    };
    // pause and seek all the players
    for (AVPlayer *player in players) {
        [player pause];
        [player seekToTime:time toleranceBefore:kCMTimeZero toleranceAfter:kCMTimeZero completionHandler:seekHandler];
    }
}


If anyone has any tips or suggestions please let me know!


(Yes I know I should be using Swift) 🙂

Answered by ncvitak in 11672022


I'll check this out and post results 😀

It appears AVPlayer does not have a timebase property, and the player's currentItem's timebase is read-only 😟

Update:

OK, I managed to get the players synced up with local content by first setting them all to use the same clock:

CMClockRef syncClock;
CMAudioClockCreate(kCFAllocatorDefault, &syncClock);
for (AVPlayer *player in self.players) {
    player.masterClock = syncClock;
}


Then, after all of the players finish prerolling in my previous implementation, instead of calling play I start them all at a shared host time:

for (AVPlayer *player in players) {
    [player setRate:1.0 time:kCMTimeInvalid atHostTime:CMClockGetTime(syncClock)];
}


This works very well with local content on the device, but with live video, setRate:time:atHostTime: seems to do absolutely nothing.


Please leave any tips or suggestions you may have regarding using setRate:time:atHostTime: with live video. Thanks!
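(Editor's aside, as a hedged sketch rather than a confirmed fix for this thread: on iOS 10 / macOS 10.12 and later, AVPlayer defaults to automaticallyWaitsToMinimizeStalling = YES, and Apple's documentation states that setRate:time:atHostTime: raises an exception in that mode. Disabling it before the synchronized start may be necessary on those OS versions, assuming syncClock and self.players from the snippets above.)

```objc
for (AVPlayer *player in self.players) {
    if (@available(iOS 10.0, macOS 10.12, *)) {
        // Required (per the documentation) before using precise rate control.
        player.automaticallyWaitsToMinimizeStalling = NO;
    }
    [player setRate:1.0 time:kCMTimeInvalid atHostTime:CMClockGetTime(syncClock)];
}
```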

Accepted Answer

Update:

So I ended up using one of my technical support credits on this question, but the results weren't very pleasing regarding HTTP Live Streaming (HLS).


"Our engineers have reviewed your request and have concluded that there is no supported way to achieve the desired functionality given the currently shipping system configurations" - Apple Developer Technical Support


I guess I'll start a new post under HTTP Live Streaming, since that is the only issue now.

Hey ncvitak,


I, too, am trying to solve this mystery. I was trying to play multiple videos from HTTP sources and was using the same approach you mentioned, with the addition of setting one of my AVPlayers as a master player and adding a time observer to track how in sync the other players were with the master.


I still had trouble keeping them in sync after starting them in sync. I did have a bit of success playing multiple videos using an AVComposition. I had to make sure that certain keys were loaded from the individual asset tracks before adding them to the composition, but the syncing was much better than when I was trying to play multiple AVPlayers. The only caveat is that I have no control over the caching policy of the AVPlayer while it is playing an AVComposition. And I know that you can't use an AVAssetResourceLoaderDelegate for managing asset resources while they are part of an AVComposition (the AVPlayer just freezes).


So that's my 2 cents. I hope this was info you could use. Please keep me posted on your findings as well.
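(Editor's aside — a minimal sketch of the key-loading step described above, for parallel camera angles as local or progressive-download assets. The poster doesn't name the keys; @"tracks" and @"duration" are an assumption, as is the helper's name and shape.)

```objc
#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: asynchronously load the keys an AVComposition needs
// from each asset before inserting its video track, as the reply suggests.
- (void)buildCompositionFromURLs:(NSArray<NSURL *> *)urls
                      completion:(void (^)(AVComposition *composition))completion {
    AVMutableComposition *composition = [AVMutableComposition composition];
    NSMutableArray<AVURLAsset *> *assets = [NSMutableArray array];
    dispatch_group_t group = dispatch_group_create();
    for (NSURL *url in urls) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        [assets addObject:asset];
        dispatch_group_enter(group);
        // Assumed key list; these are what insertTimeRange: typically needs.
        [asset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"]
                             completionHandler:^{
            dispatch_group_leave(group);
        }];
    }
    dispatch_group_notify(group, dispatch_get_main_queue(), ^{
        for (AVURLAsset *asset in assets) {
            if ([asset statusOfValueForKey:@"tracks" error:nil] != AVKeyValueStatusLoaded) {
                continue; // key failed to load; skip this angle
            }
            AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
            if (!videoTrack) {
                continue;
            }
            // Each angle gets its own track, all starting at time zero,
            // so the composition plays them in parallel.
            AVMutableCompositionTrack *compTrack =
                [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                         preferredTrackID:kCMPersistentTrackID_Invalid];
            NSError *error = nil;
            [compTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                               ofTrack:videoTrack
                                atTime:kCMTimeZero
                                 error:&error];
        }
        completion([composition copy]);
    });
}
```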

Thanks for the AVComposition tip, I'll definitely try it out. Currently I have my players synced to a master player, and every 5 seconds the other players attempt to sync with the master until they are about 0.25 s apart. This method is currently not very reliable, so now I am extremely interested in using an AVComposition. What keys need to be loaded in order to play HLS content in an AVComposition?
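(Editor's aside — a sketch of the periodic resync loop described above, with the 5 s interval and 0.25 s threshold taken from the post. The property names — masterPlayer, followerPlayers, driftObserver — are illustrative, not from the thread.)

```objc
#import <AVFoundation/AVFoundation.h>

// Every 5 seconds, compare each follower's position to the master's and
// re-seek any player that has drifted more than 0.25 s.
- (void)beginDriftCorrection {
    CMTime interval = CMTimeMakeWithSeconds(5.0, NSEC_PER_SEC);
    __weak typeof(self) weakSelf = self;
    self.driftObserver =
        [self.masterPlayer addPeriodicTimeObserverForInterval:interval
                                                        queue:dispatch_get_main_queue()
                                                   usingBlock:^(CMTime masterTime) {
        for (AVPlayer *follower in weakSelf.followerPlayers) {
            CMTime drift = CMTimeSubtract(follower.currentTime, masterTime);
            if (fabs(CMTimeGetSeconds(drift)) > 0.25) {
                // Zero tolerance keeps the correction accurate, at the cost
                // of a possible visible jump in the follower.
                [follower seekToTime:masterTime
                     toleranceBefore:kCMTimeZero
                      toleranceAfter:kCMTimeZero
                   completionHandler:nil];
            }
        }
    }];
}
```

Remember to pass the stored driftObserver token to removeTimeObserver: on the master player when tearing down, or the observer block will keep firing.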
