I am having trouble merging my videos in the correct order, even though I am waiting for them to finish loading with AVAsynchronousKeyValueLoading. The videos play back perfectly fine on their own, but I can't get the final merged result to show in the correct order.
At first I thought it was an issue with my videos being saved to NSTemporaryDirectory, but I confirmed that they are being saved correctly (I can provide logs upon request).
Here is the order I want the videos to play back in:
1st video --> 2nd video --> 3rd video
Here is the order they are actually playing back in:
2nd video --> 1st video --> 3rd video
It works perfectly fine when I don't switch the camera, though. Can anyone please help? It would be greatly appreciated and would put me back on schedule for a production release of my app.
    func mergeVideo() {
        var mixComposition = AVMutableComposition()
        var resultMergedVideoTime = CMTime(seconds: 0, preferredTimescale: 1)
        var resultTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
        let myGroup = DispatchGroup()

        for i in 0..<self.resultVideoAssets.keys.count {
            myGroup.enter()
            let asset = Array(self.resultVideoAssets.keys)[i]
            asset.loadValuesAsynchronously(forKeys: ["playable"], completionHandler: {
                if asset.statusOfValue(forKey: "playable", error: nil) == .loaded {
                    do {
                        let videoDuration: CMTime = asset.duration
                        let videoTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first!
                        var audioTrack: AVAssetTrack?
                        if asset.tracks(withMediaType: AVMediaTypeAudio).count > 0 {
                            audioTrack = asset.tracks(withMediaType: AVMediaTypeAudio).first
                        }
                        let timeRange = CMTimeRange(start: kCMTimeZero, duration: videoDuration)
                        let audioCompTrack = mixComposition.addMutableTrack(withMediaType: AVMediaTypeAudio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
                        // Append this clip's video and audio to the composition
                        try resultTrack.insertTimeRange(timeRange, of: videoTrack, at: kCMTimeZero)
                        resultTrack.preferredTransform = videoTrack.preferredTransform
                        if let audioTrack = audioTrack {
                            try audioCompTrack.insertTimeRange(timeRange, of: audioTrack, at: resultMergedVideoTime)
                        }
                        // Advance the running offset by this clip's duration
                        resultMergedVideoTime = CMTimeAdd(resultMergedVideoTime, videoDuration)
                    } catch {
                        // TODO: Error handler
                    }
                }
                myGroup.leave()
            })
        }

        // Export once every asset has finished loading
        myGroup.notify(queue: .main) {
            let videoURL = NSURL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("mergeVideo\(arc4random() % 1000)d")!.appendingPathExtension("mp4")
            let assetExport: AVAssetExportSession
            if #available(iOS 11.0, *) {
                assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHEVC1920x1080)!
            } else {
                assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPreset1920x1080)!
            }
            assetExport.outputURL = videoURL
            assetExport.outputFileType = AVFileTypeMPEG4
            assetExport.shouldOptimizeForNetworkUse = true
            assetExport.exportAsynchronously {
                .........
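For clarity, this is the sequential behavior I'm trying to achieve. A minimal sketch of it below, under two assumptions of mine: the clips live in an ordered array (here a hypothetical `clipAssets`, not a real property in my code), and no insertion happens inside the `loadValuesAsynchronously` completion handlers, since those can fire in any order:

```swift
import AVFoundation

// Sketch only: merge clips in array order, inserting each one at the
// running offset only after ALL assets have finished loading.
// `clipAssets` is a hypothetical ordered array of AVAssets.
func mergeInOrder(clipAssets: [AVAsset],
                  completion: @escaping (AVMutableComposition) -> Void) {
    let composition = AVMutableComposition()
    let videoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioTrack = composition.addMutableTrack(withMediaType: AVMediaTypeAudio,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)
    let group = DispatchGroup()

    // Phase 1: kick off loading for every asset; the handlers only
    // signal completion, they never touch the composition.
    for asset in clipAssets {
        group.enter()
        asset.loadValuesAsynchronously(forKeys: ["tracks", "duration"]) {
            group.leave()
        }
    }

    // Phase 2: once everything is loaded, insert strictly in array order.
    group.notify(queue: .main) {
        var cursor = kCMTimeZero
        for asset in clipAssets {
            let range = CMTimeRange(start: kCMTimeZero, duration: asset.duration)
            if let video = asset.tracks(withMediaType: AVMediaTypeVideo).first {
                try? videoTrack.insertTimeRange(range, of: video, at: cursor)
            }
            if let audio = asset.tracks(withMediaType: AVMediaTypeAudio).first {
                try? audioTrack.insertTimeRange(range, of: audio, at: cursor)
            }
            cursor = CMTimeAdd(cursor, asset.duration)
        }
        completion(composition)
    }
}
```

In other words: the final playback order should depend only on the order of the array, never on which completion handler happens to run first.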