Some of my code for combining an .mp4 file and an .aac file into an .mov file has worked fine since I first wrote it: it creates an AVMutableComposition with one video track and one audio track from the input files, and after exportAsynchronously() the AVAssetExportSession reaches AVAssetExportSession.Status.completed. Here is the Swift source of the key function:

```swift
import AVFoundation

func compileAudioAndVideoToMovie(audioInputURL: URL, videoInputURL: URL) {
    let docPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let videoOutputURL = URL(fileURLWithPath: docPath).appendingPathComponent("video_output.mov")
    // Remove any leftover output file from a previous run.
    try? FileManager.default.removeItem(at: videoOutputURL)

    let mixComposition = AVMutableComposition()
    let videoTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    let videoInputAsset = AVURLAsset(url: videoInputURL)
    let audioTrack = mixComposition.addMutableTrack(withMediaType: .audio,
                                                    preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioInputAsset = AVURLAsset(url: audioInputURL)
    do {
        // Insert a 3-second video clip into the video track.
        try videoTrack?.insertTimeRange(CMTimeRangeMake(start: CMTimeMake(value: 0, timescale: 1000),
                                                        duration: CMTimeMake(value: 3000, timescale: 1000)),
                                        of: videoInputAsset.tracks(withMediaType: .video)[0],
                                        at: CMTimeMake(value: 0, timescale: 1000))
        // Insert a 3-second audio clip into the audio track.
        try audioTrack?.insertTimeRange(CMTimeRangeMake(start: CMTimeMake(value: 0, timescale: 1000),
                                                        duration: CMTimeMake(value: 3000, timescale: 1000)),
                                        of: audioInputAsset.tracks(withMediaType: .audio)[0],
                                        at: CMTimeMake(value: 0, timescale: 1000))

        // Export the composition as-is (no re-encoding) to a .mov file.
        let assetExporter = AVAssetExportSession(asset: mixComposition,
                                                 presetName: AVAssetExportPresetPassthrough)
        assetExporter?.outputFileType = .mov
        assetExporter?.outputURL = videoOutputURL
        assetExporter?.shouldOptimizeForNetworkUse = false
        assetExporter?.exportAsynchronously {
            switch assetExporter?.status {
            case .cancelled: print("Exporting cancelled")
            case .completed: print("Exporting completed")
            case .exporting: print("Exporting ...")
            case .failed:    print("Exporting failed")
            default:         print("Exporting with other result")
            }
            if let error = assetExporter?.error {
                print("Error:\n\(error)")
            }
        }
    } catch {
        print("Exception when compiling movie")
    }
}
```

However, after I upgraded my iPhone to iOS 13 (beta), the export always ends with .failed status, and AVAssetExportSession.error reads:

```
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12735),
NSLocalizedDescription=The operation could not be completed,
NSUnderlyingError=0x2815e9bc0 {Error Domain=NSOSStatusErrorDomain Code=-12735 "(null)"}}
```

I've tested this on an iPhone 6 Plus and an iPhone 7; both give the same result. You can clone my minimal demo project, with sample input audio and video files embedded in its bundle, from https://github.com/chenqiu1024/iOS13VideoRecordingError.git, run it, and check the console output. Is there any explanation or suggestion?
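In case the fixed 3-second insertion range is part of the problem, below is a minimal variant of the insertion step that clamps the range to the shortest source track, so neither insertTimeRange call can request media past the end of its input. The helper name is mine, and whether this has anything to do with the iOS 13 failure is only a guess:

```swift
import AVFoundation

// Hypothetical variant of the insertion step: clamp the requested 3-second
// range to the shortest source track so insertTimeRange never asks for
// media beyond the end of either input asset.
func clampedInsert(videoAsset: AVURLAsset, audioAsset: AVURLAsset,
                   into composition: AVMutableComposition) throws {
    let videoSource = videoAsset.tracks(withMediaType: .video)[0]
    let audioSource = audioAsset.tracks(withMediaType: .audio)[0]

    let requested = CMTimeMake(value: 3000, timescale: 1000)
    // Shortest of: the requested 3 s, the video track, the audio track.
    let duration = CMTimeMinimum(requested,
                                 CMTimeMinimum(videoSource.timeRange.duration,
                                               audioSource.timeRange.duration))
    let range = CMTimeRangeMake(start: .zero, duration: duration)

    let videoTrack = composition.addMutableTrack(withMediaType: .video,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)
    let audioTrack = composition.addMutableTrack(withMediaType: .audio,
                                                 preferredTrackID: kCMPersistentTrackID_Invalid)
    try videoTrack?.insertTimeRange(range, of: videoSource, at: .zero)
    try audioTrack?.insertTimeRange(range, of: audioSource, at: .zero)
}
```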
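It might also help to ask AVFoundation directly whether it still considers the passthrough preset compatible with this composition and .mov output on iOS 13, compared with a re-encoding preset. The sketch below uses standard AVAssetExportSession class methods; the function name is hypothetical:

```swift
import AVFoundation

// Diagnostic sketch: report whether the passthrough and highest-quality
// presets are considered compatible with the composition/.mov combination,
// then list every preset AVFoundation deems usable with this composition.
func probeExportCompatibility(for composition: AVMutableComposition) {
    for preset in [AVAssetExportPresetPassthrough, AVAssetExportPresetHighestQuality] {
        AVAssetExportSession.determineCompatibility(ofExportPreset: preset,
                                                    with: composition,
                                                    outputFileType: .mov) { isCompatible in
            print("\(preset) compatible with .mov: \(isCompatible)")
        }
    }
    print(AVAssetExportSession.exportPresets(compatibleWith: composition))
}
```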
Posted by Cyllenge.