I'm using AVPlayer and sometimes the player randomly pauses. I'm observing \.timeControlStatus, but the only response I get is .paused, with no info as to why it paused.
I'm also observing \.isPlaybackLikelyToKeepUp, \.isPlaybackBufferEmpty, and \.isPlaybackBufferFull, but none of those fire. However, using Notification.Name.AVPlayerItemPlaybackStalled I do get a print statement that says "stalled".
I have no idea what to do because the player is just sitting there, and I get no information as to what the problem is.
How can I find out the exact reason why the player is stalled?
How do I resume automatic play after the player is paused?
Code is in the attachment:
AVPlayer Observers and Notifications - https://developer.apple.com/forums/content/attachment/98542fe1-0678-4922-806a-e8b9ae09ceed
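For reference, here is a minimal sketch of the extra observation I'm trying in order to surface the reason, assuming an AVPlayer named player. reasonForWaitingToPlay is only populated while the player is in .waitingToPlayAtSpecifiedRate, and my understanding is that with automaticallyWaitsToMinimizeStalling set to false a stall surfaces as a plain .paused:

var observations = [NSKeyValueObservation]()

observations.append(player.observe(\.timeControlStatus, options: [.new]) { player, _ in
    switch player.timeControlStatus {
    case .waitingToPlayAtSpecifiedRate:
        // This is where the "why" lives; .toMinimizeStalls means buffering.
        print("waiting, reason:", player.reasonForWaitingToPlay?.rawValue ?? "unknown")
    case .paused:
        print("paused with no reason attached")
    default:
        break
    }
})

// One way to resume after a stall (an assumption on my part, not a guaranteed fix):
// player.playImmediately(atRate: 1.0)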
I have the same exact problem as this question - https://developer.apple.com/forums/thread/113809.
device-1 browses, discovers, and sends a message to device-2. The connection is made, and I hold a copy of the outgoing connection inside an array.
device-2 receives the message, and I hold a copy of the incoming connection inside an array.
The same thing happens in reverse from device-2 to device-1 (peer-to-peer setup).
When either device goes to the background, I iterate through the array, .cancel each connection, then remove it from the array. The problem is that when the connection is no longer connected, the opposite device fires browser.browseResultsChangedHandler but never fires connection.stateUpdateHandler; the state handler only fires if that device shuts the connection down itself.
device-1:
var connections = [NWConnection]()
@objc func backgroundNotification() {
    connections.forEach({ $0.cancel() })
    connections.removeAll()
    // I've also tried cancelling and setting both the browser and listener to nil
}
When device-1 is no longer connected, on device-2 only the browseResultsChangedHandler fires:
browser.browseResultsChangedHandler = { [weak self] (results, changes) in
    for change in changes {
        switch change {
        // ...
        case .removed(let browseResult):
            if case .service(let name, let type, let domain, _) = browseResult.endpoint {
                // * only this runs *
            }
        default:
            break
        }
    }
}
var connections = [NWConnection]()
connection.stateUpdateHandler = { [weak self] (nwConnectionState) in
    // *.cancelled* and *.failed* never run
    switch nwConnectionState {
    // ...
    case .cancelled:
        connection.cancel()
        // ... loop the array and remove the connection
    case .failed(let error):
        connection.cancel()
        // ... loop the array and remove the connection
    default:
        break
    }
}
The odd thing is that if device-2 goes to the background (same steps as device-1), then connection.stateUpdateHandler does fire.
Just to be clear: when sending messages, everything works fine on both devices, in both the browser and listener handlers.
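Since the browse handler does fire, the workaround I'm considering is to treat the removal of a peer's browse result as the disconnect signal and tear down the matching connection myself. A minimal sketch, assuming connections can be matched by their remote endpoint (cancelConnection(to:) is my own helper):

browser.browseResultsChangedHandler = { [weak self] (results, changes) in
    for change in changes {
        // When a peer vanishes, its service is removed from the browse results
        // even though stateUpdateHandler never fires on the surviving device.
        if case .removed(let browseResult) = change {
            self?.cancelConnection(to: browseResult.endpoint)
        }
    }
}

func cancelConnection(to endpoint: NWEndpoint) {
    // `connections` is the same array shown above.
    if let index = connections.firstIndex(where: { $0.endpoint == endpoint }) {
        connections[index].cancel()
        connections.remove(at: index)
    }
}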
Because of the 8-peer limit, I'm using NWListener, NWBrowser, NWConnection, and Bonjour for up to 20 concurrent peer-to-peer connections.
I followed this answer - https://developer.apple.com/forums/thread/652180 and this answer - https://developer.apple.com/forums/thread/661148, which are both from Apple engineers.
The first one said to create multiple concurrent NISessions:
All NISessions are peer-to-peer and as a result, creating multiple NISession objects is necessary to have multiple concurrent sessions. One approach is to create a dictionary between an identifier for the peer (i.e. a user identifier provided by your app, MyAppUserID) and the NISession objects, while also keeping track of the NIDiscoveryToken for each peer identifier:
var sessions = [MyAppUserID: NISession]()
var peerTokensMapping = [NIDiscoveryToken: MyAppUserID]()
And the second answer said, to perform interactions with multiple iPhones:
Create an NISession for each peer you would like to interact with. For example, if you are interacting with two peers, create 2 * NISession objects. Each NISession will have a unique NIDiscoveryToken associated with it. Share discovery token #1 with peer #1, and share discovery token #2 with peer #2. When you receive discovery tokens from peers #1 and #2, create 2 * NINearbyPeerConfiguration objects and use them to run sessions #1 and #2, respectively.
The problem is that when I send out an NIDiscoveryToken via NWConnection, I can't find a way to link the NISession created when the data was sent out to the token received in the incoming data, after I initialize an NINearbyPeerConfiguration object:
e.g. the User object, which is sent and received when other devices are discovered:
class User: NSObject, NSSecureCoding {
		var uid: String?
		var peerToken: NIDiscoveryToken? = nil
		init(uid: String, peerToken: NIDiscoveryToken) {...}
		// ... encoder for uid and peerToken
		// ... decoder for uid and peerToken
}
Send the current user's NIDiscoveryToken data via NWBrowser and NWConnection, save the token to the peerTokensMapping dictionary, and save the session to sessionsArr:
let currentUserId = "qwerty"
var sessionsArr = [NISession]()
var sessions = [String: NISession]()
var peerTokensMapping = [NIDiscoveryToken: String]()
func sendDataWhenNewDeviceIsDiscovered() {
    let session = NISession()
    session.delegate = self
    guard let myToken = session.discoveryToken else { return }
    let user = User(uid: currentUserId, peerToken: myToken)
    guard let outgoingData = try? NSKeyedArchiver.archivedData(withRootObject: user,
                                                               requiringSecureCoding: true)
    else { return }
    let message = NWProtocolWebSocket.Metadata(opcode: .text)
    let context = NWConnection.ContentContext(identifier: "send", metadata: [message])
    connection.send(content: outgoingData, contentContext: context, isComplete: true, completion: .contentProcessed({ [weak self] (error) in
        guard let self = self, error == nil else { return }
        self.sessionsArr.append(session)
        self.peerTokensMapping[myToken] = self.currentUserId
        print("Successfully Sent")
    }))
}
Receive the other user's NIDiscoveryToken data via NWListener and NWConnection:
connection.receive(minimumIncompleteLength: 1, maximumLength: 65535) { [weak self] (data, context, isComplete, error) in
    if error != nil { return }
    if let data = data, !data.isEmpty {
        self?.decodeReceived(data)
    }
}
func decodeReceived(_ data: Data) {
    guard let incomingData = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data) as? User
    else { return }
    guard let uid = incomingData.uid, let peerToken = incomingData.peerToken
    else { return }
    let config = NINearbyPeerConfiguration(peerToken: peerToken)
    /* not sure what to do here to get the session that was created when the data was sent out */
    for (key, _) in peerTokensMapping {
        if key.??? == peerToken.??? {
            session.run(config)
            self.sessions[currentUserId] = session
            break
        }
    }
    /* or try this, but this is the other user's peerToken so this will never work */
    for session in self.sessionsArr {
        if session.discoveryToken == peerToken {
            session.run(config)
            self.sessions[currentUserId] = session
            break
        }
    }
}
Once the NINearbyPeerConfiguration is initialized, how do I connect the incoming peerToken to the correct one that was sent out above (currently inside the peerTokensMapping dict or sessionsArr), so that I can get the session and call session.run(config)?
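One pattern I'm experimenting with, sketched below under my own assumptions (it is not from the Apple answers): since each NISession here is created for exactly one NWConnection, key the session by the connection instead of by token, so the right session can be found when that connection's data arrives.

// Sketch: one NISession per NWConnection, keyed by the connection's identity.
var sessionsByConnection = [ObjectIdentifier: NISession]()

func sendToken(over connection: NWConnection) {
    let session = NISession()
    session.delegate = self
    guard let myToken = session.discoveryToken else { return }
    sessionsByConnection[ObjectIdentifier(connection)] = session
    // ... archive and send `myToken` on this connection, as above
}

func receivedToken(_ peerToken: NIDiscoveryToken, over connection: NWConnection) {
    // The peer's token arrived on the same connection whose session we stored,
    // so no token-to-token comparison is needed.
    guard let session = sessionsByConnection[ObjectIdentifier(connection)] else { return }
    session.run(NINearbyPeerConfiguration(peerToken: peerToken))
}

This sidesteps comparing tokens entirely, but I don't know whether it's the intended approach.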
In the delegate method below, distance is of type Float:
func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
    guard let object = nearbyObjects.first else { return }
    guard let distance = object.distance else { return }
    print("distance is of type Float but I need to convert to meters: ", distance)
}
How do I convert distance into meters so that I can find out exactly how far the peers are from one another?
I'm using Bonjour, NWListener, NWBrowser, NWConnection, and peer-to-peer to connect devices. Even though I'm not using Multipeer, I know that it is built on top of Bonjour, and from what I've read the Multipeer range is 20 - 50 meters depending on what's in between the devices, etc. For this question I'm assuming that is the same range for these four APIs.
If deviceA is within 50 meters of deviceB, then there will be an automatic discovery/connection. If deviceA moves outside the range to 51 meters, the connection will be lost.
If deviceA moves back within 50 meters, will there be automatic discovery/connection between them again?
The reason I ask is that another developer with networking experience told me I have to add a timer to the view controller and fire it off every few seconds so that discovery keeps occurring. I haven't found anything on the forums or Stack Overflow with examples of using a timer with Bonjour for discovery to work. I don't know whether I need a timer to rediscover devices that were once connected, got disconnected, and want to connect again.
I'm far from a networking expert, which is why I'm asking.
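For reference, this is roughly the long-lived browser I have. My (possibly wrong) assumption is that as long as the browser isn't cancelled, it keeps reporting .added changes whenever a peer comes back into range, with no timer involved:

// Sketch: a long-lived browser; as long as it isn't cancelled, results
// should keep changing as peers appear and disappear.
let params = NWParameters()
params.includePeerToPeer = true
let browser = NWBrowser(for: .bonjour(type: "_myApp._tcp", domain: nil), using: params)
browser.browseResultsChangedHandler = { results, changes in
    for change in changes {
        switch change {
        case .added(let result):
            print("peer (re)appeared:", result.endpoint) // reconnect here
        case .removed(let result):
            print("peer disappeared:", result.endpoint)
        default:
            break
        }
    }
}
browser.start(queue: .main)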
I'm using Bonjour, NWConnection, NWBrowser, and NWListener along with the NearbyInteraction framework. The NearbyInteraction framework works best when the devices are within nine meters - https://developer.apple.com/documentation/nearbyinteraction/initiating_and_maintaining_a_session of each other:
NI works best when the peer devices are: Within nine meters of each other.
Is there any way I can set Bonjour to not discover devices beyond nine meters, i.e. only within nine meters?
I'm using NWConnection to send my discoveryTokens. When an outgoing connection is found, I have the following code:
let session = NISession()
guard let token = session.discoveryToken else {
    print("discoveryToken is nil")
    return
}
guard let data = try? NSKeyedArchiver.archivedData(withRootObject: token, requiringSecureCoding: true) else {
    return
}
// send data ...
Why does the discoveryToken sometimes return nil? Is there a remedy for this?
This has happened on multiple occasions, and when it does, nothing is sent. There doesn't seem to be a workaround, nor any documentation on why this occurs.
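The only guard I've found to add is checking device support before creating the session; my assumption is that unsupported hardware or an invalidated session returns a nil token. The retry below is speculative, and send(_:) is a placeholder for my own send path:

// Bail out early on devices that can't run NearbyInteraction at all.
guard NISession.isSupported else {
    print("NearbyInteraction is not supported on this device")
    return
}
let session = NISession()
session.delegate = self
if let token = session.discoveryToken {
    send(token) // placeholder for the archiving + NWConnection send above
} else {
    // Speculative: try once more shortly after, in case the token
    // simply wasn't ready yet.
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) { [weak self] in
        guard let token = session.discoveryToken else { return }
        self?.send(token)
    }
}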
I'm new to networking, so maybe I'm misunderstanding how endPoint information is gathered.
Device_A browses and discovers Device_B, or the other way around; it doesn't matter, because they will both discover each other and send data to open a connection. Because the Network framework does not resolve IP addresses - https://developer.apple.com/forums/thread/129644, when a connection is first made I use either the remoteEndPoint (connection.currentPath?.remoteEndpoint // fe80::9821:7fff:fcea:74c4%awdl0.10059) or the endPoint description (connection.endpoint.debugDescription // myApp (2)._myApp._tcplocal.) as a uniqueID for the connection.
I send the data and some other info across the wire; if successful, I then place either endPoint inside an ivar dictionary as a key, with another value, so that I know which device to map a response to once a device responds back.
Right now I only have 2 devices connected. The issue is that when I receive a response from Device_B, I'm actually getting incorrect endPoint information: for some reason I keep getting Device_A's endPoint back. It seems like the same endPoint information is getting sent twice, once to Device_A and then again to Device_B. The exact same thing occurs on Device_B, but in reverse. I'm confused as to why this is happening.
For example, Device_A first discovers itself; the remoteEndPoint is *fe80::9821:7fff:fcea:74c4%awdl0.10059*, and it sends the data. When Device_A receives its own message, I filter it out using the userId and I see the same endPoint info. But when Device_A discovers Device_B, the remoteEndPoint is *fe80::9821:7fff:fcea:74c4%awdl0.27788*. When I receive a response from Device_B, the endPoint information shows the first one from Device_A, *fe80::9821:7fff:fcea:74c4%awdl0.10059*. The same remoteEndPoint info is duplicated. The same exact thing happens if I use endpoint.debugDescription. This issue occurs on both devices.
NWBrowser:
browser.browseResultsChangedHandler = { (results, changes) in
    for change in changes {
        switch change {
        case .added(let browseResult):
            switch browseResult.endpoint {
            case .service(let name, let type, _, _):
                let connection = PeerConnection(to: browseResult.endpoint)
                // ...
            default:
                break
            }
        default:
            break
        }
    }
}
PeerConnection:
var connection: NWConnection?

init(to endPoint: NWEndpoint) {
    // tcpOptions ...
    // params ...
    // initialize connection and delegate that sends out data
    connection?.stateUpdateHandler = { [weak self] (nwConnectionState) in
        switch nwConnectionState {
        case .ready:
            guard let self = self, let connection = self.connection else { return }
            self.delegate.sendOutgoing(connection)
        default:
            break
        }
    }
}
Send Data:
var dict = [String: String]()
var endPointArr = [String]()

func sendOutgoing(_ connection: NWConnection) {
    // I've used either the local or the remote endpoint as the ID:
    // let endPoint = connection.currentPath?.localEndpoint?.debugDescription
    guard let endPoint = connection.currentPath?.remoteEndpoint?.debugDescription else { return }
    // encode the endPoint and currentUserId with some other info, then send it across
    // the wire; if successful, make the endPoint a key inside a dictionary and add it
    // to an array to keep track of which endPoints were received
    connection.send(content: encodedData, contentContext: context, isComplete: true, completion: .contentProcessed({ [weak self] (error) in
        if error != nil { return }
        self?.dict[endPoint] = someUniqueValue
        self?.endPointArr.append(endPoint)
    }))
}
Receiving a response
connection.receive(minimumIncompleteLength: 1, maximumLength: 65535) { [weak self] (data, context, isComplete, error) in
    if error != nil { return }
    if let data = data, !data.isEmpty {
        self?.received(data)
    }
}

func received(_ data: Data) {
    guard let retrievedData = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data) as? MyModel else { return }
    guard let endPoint = retrievedData.endPoint as? String, let userID = retrievedData.userId as? String else { return }
    print(endPoint) // fe80::9821:7fff:fcea:74c4%awdl0.10059
    if userID == Auth.auth().currentUser?.uid {
        dict[endPoint] = nil
        return
    }
    endPointArr.forEach { (endPoint) in
        print(endPoint) // prints both fe80::9821:7fff:fcea:74c4%awdl0.10059 and fe80::9821:7fff:fcea:74c4%awdl0.27788
    }
    // this never runs because the key just got set to nil above, because Device_B has the same endPoint info
    for (key, value) in dict where key == endPoint {
        print("key=\(key) : value=\(value)") // someUniqueValue
        // show that this is a response from whichever device has this endPoint
        break
    }
}
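What I'm leaning toward instead of endpoint strings, sketched below (the Envelope payload and deviceID are my own invention, not an existing API): stamp each message with a stable per-device UUID and key the dictionary on that, so two devices can never collide the way the awdl0 strings do.

import Foundation

// Sketch: identify peers by a generated ID carried in the payload,
// not by the connection's endpoint description.
struct Envelope: Codable {
    let deviceID: String // stable per install
    let userID: String
    let body: Data
}

let localDeviceID = UUID().uuidString // could be persisted in UserDefaults

func makeEnvelope(userID: String, body: Data) throws -> Data {
    try JSONEncoder().encode(Envelope(deviceID: localDeviceID, userID: userID, body: body))
}

func handleReceived(_ data: Data) {
    guard let envelope = try? JSONDecoder().decode(Envelope.self, from: data) else { return }
    guard envelope.deviceID != localDeviceID else { return } // filter out our own message
    // key responses by the sender's deviceID instead of the awdl0 endpoint string
    print("response from device:", envelope.deviceID)
}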
I'm using peer-to-peer and I successfully connect one client to another, but when I send the echo to the individual connection, the receiving client doesn't get a response.
Initial send
let data = NSKeyedArchiver....
let message = NWProtocolWebSocket.Metadata(opcode: .text)
let context = NWConnection.ContentContext(identifier: "send",
                                          metadata: [message])
connection.send(content: data, contentContext: context, isComplete: true, completion: .contentProcessed({ (error) in
    if error != nil { return }
    print("Sent")
}))
Receive data and send Echo response
func receivedIncoming(_ connection: NWConnection) {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65535) { [weak self] (data, context, isComplete, error) in
        if let err = error {
            print(err) // never gets hit
            return
        }
        if let data = data, !data.isEmpty {
            // do something with data
            if let color = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data) as? UIColor {
                // this should only run on the other device once the echo is received after 5 secs
                self?.view.backgroundColor = color
            }
            let randomColor = UIColor.random // func to create a random color
            let colorData = randomColor.encode() // func to encode the color to data
            DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
                connection.send(content: colorData, completion: .idempotent)
                // I've also tried
                let message = NWProtocolWebSocket.Metadata(opcode: .text)
                let context = NWConnection.ContentContext(identifier: "send", metadata: [message])
                connection.send(content: colorData, contentContext: context, isComplete: true, completion: .contentProcessed({ (error) in
                    if error != nil { return }
                    print("Color data sent") // this always prints
                }))
            }
        } else {
            print("data is empty") // never gets hit
        }
    }
}
NWConnection
weak var delegate: PeerConnectionDelegate?
var connection: NWConnection?

// Outgoing Connection
init(endPoint: NWEndpoint, delegate: PeerConnectionDelegate) {
    self.delegate = delegate
    let tcpOptions = NWProtocolTCP.Options()
    tcpOptions.enableKeepalive = true
    tcpOptions.keepaliveIdle = 2
    let parameters = NWParameters(tls: nil, tcp: tcpOptions)
    parameters.includePeerToPeer = true
    parameters.allowLocalEndpointReuse = true
    connection = NWConnection(to: endPoint, using: parameters)
    startOutgoingConnection()
}

// Incoming Connection
init(connection: NWConnection, delegate: PeerConnectionDelegate) {
    self.delegate = delegate
    self.connection = connection
    startIncomingConnection()
}
func startIncomingConnection() {
    connection?.stateUpdateHandler = { [weak self] (nwConnectionState) in
        switch nwConnectionState {
        case .ready:
            guard let self = self, let connection = self.connection else { return }
            self.delegate?.receivedIncoming(connection)
        // ...
        default:
            break
        }
    }
}
Why is the echo data being sent but not received?
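For comparison, here is the receive-loop shape I believe is the usual one (a sketch; I'm not certain this is my actual problem): each completion re-arms receive, since a single receive call only delivers one batch of data.

// Sketch: re-arm the receive after every delivery so later messages
// (like the delayed echo) are actually read.
func receiveLoop(on connection: NWConnection) {
    connection.receive(minimumIncompleteLength: 1, maximumLength: 65535) { [weak self] (data, _, isComplete, error) in
        if let data = data, !data.isEmpty {
            self?.handle(data) // hypothetical handler
        }
        if error == nil && !isComplete {
            self?.receiveLoop(on: connection) // keep listening
        }
    }
}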
I have several properties inside my Core Data entity named UnsavedModel; here are some of them.
I created my own file with a CollectionView, and when I instantiate the newBodyText and newHttpsStr properties, it crashes. The correct values are being fed into both properties, which are just Strings, but it crashes only on those two properties, none of the others. If I don't give those properties a value, everything works fine. What's the issue here?
The problem occurs in this order:
HomeVC > buttonPress immediately push on UnsavedVC
UnsavedVC (cells correctly load) > backButtonPressed immediately pop back to HomeVC
HomeVC > buttonPress immediately push on UnsavedVC
CRASH
FYI, I have no problem writing to those properties. But when I try to delete those 2 properties it crashes, while I have no problem deleting any of the other properties.
I've also gotten a crash on Thread 1 Queue: com.apple.main-thread (serial) and objc_msgSend for the same push/pop/push sequence.
code:
class UnsavedController: UIViewController, UICollectionViewDataSource, UICollectionViewDelegateFlowLayout {

    var datasource = [CopyCoreDataModel]()

    override func viewDidLoad() {
        super.viewDidLoad()
        fetchData()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        navigationController?.setNavigationBarHidden(false, animated: false)
    }

    func fetchData() {
        guard let appDelegate = UIApplication.shared.delegate as? AppDelegate else { return }
        let context = appDelegate.persistentContainer.viewContext
        // I also tried context.performAndWait { ... run the do-try-catch fetchRequest in here ... }
        let fetchRequest: NSFetchRequest<UnsavedModel> = UnsavedModel.fetchRequest()
        do {
            let results = try context.fetch(fetchRequest)
            for result in results {
                guard let id = result.value(forKey: "id") as? String else { continue }
                let isContained = datasource.contains(where: { $0.id ?? "" == id })
                if !isContained {
                    let copy = CopyCoreDataModel(id: id, unsavedModel: result)
                    datasource.append(copy)
                }
            }
            collectionView.reloadData()
        } catch {
            print(error)
        }
    }
}
// CopyCoreDataModel is necessary because I run some functions based on certain properties
class CopyCoreDataModel {
    var fileUrl: URL?
    var id: String?
    var unsavedModel: UnsavedModel?
    var postDate: Double?
    // other properties of type String, Double, and Bool
    var micUrl: String?
    var newBodyText: String?
    var newHttpsStr: String?

    init(id: String, unsavedModel: UnsavedModel) {
        self.id = id
        self.unsavedModel = unsavedModel
        // self.otherProperties = unsavedModel.otherProperties // run some function on some of the other properties. These all work perfectly fine
        if let micUrl = unsavedModel.micUrl { // works perfectly fine
            self.micUrl = micUrl
            // function to get micURL from FileManager that eventually sets self.fileUrl
        }
        if let newBodyText = unsavedModel.newBodyText { // crash occurs here, only if it has a value
            self.newBodyText = newBodyText
        }
        if let newHttpsStr = unsavedModel.newHttpsStr { // crash occurs here, only if it has a value
            self.newHttpsStr = newHttpsStr
        }
    }
}
func writeData(micURL: URL) {
    guard let appDelegate = UIApplication.shared.delegate as? AppDelegate else { return }
    let context: NSManagedObjectContext = appDelegate.persistentContainer.viewContext
    let entity: NSEntityDescription = NSEntityDescription.entity(forEntityName: "UnsavedModel", in: context)!
    let object: NSManagedObject = NSManagedObject(entity: entity, insertInto: context)
    object.setValue(UUID().uuidString, forKey: "id")
    object.setValue(micURL.path, forKey: "micUrl")
    object.setValue("abc", forKey: "newBodyText")
    object.setValue("https...", forKey: "newHttpsStr")
    // set other properties
    do {
        try context.save()
    } catch let error as NSError {
        print("could not save. \(error), \(error.userInfo)")
    }
}
I also set up a CoreDataManager sharedInstance class and accessed the context through it (let context = CoreDataManager.sharedInstance.persistentContainer.viewContext), but the same issue occurs.
I also changed CopyCoreDataModel's initializer to use a dict of the keys/values from the unsavedModel, but the crash still occurs:
class CopyCoreDataModel {
    // same exact properties
    init(id: String, dict: [String: Any]) {
        // set the properties using the values from the dict
    }
}

func fetchData() {
    let context = // ...
    let fetchRequest = // ...
    do {
        let results = try context.fetch(fetchRequest)
        for result in results {
            guard let id = result.value(forKey: "id") as? String else { continue }
            let isContained = datasource.contains(where: { $0.id ?? "" == id })
            if !isContained {
                let dict = createDictFromUnsavedModel(unsavedModel: result)
                let copy = CopyCoreDataModel(id: id, dict: dict)
                datasource.append(copy)
            }
        }
        collectionView.reloadData()
    } catch {
    }
}

func createDictFromUnsavedModel(unsavedModel: UnsavedModel) -> [String: Any] {
    var dict = [String: Any]()
    // set dict using k/v from unsavedModel
    return dict
}
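One variant I haven't tried yet, sketched below. My assumption is that the crash comes from touching the managed object (or a faulted property) after the pop/push, so this copies every attribute inside context.performAndWait via dictionaryWithValues(forKeys:) and keeps only plain values afterwards:

// Sketch: snapshot the attributes while the context is guaranteed valid,
// and keep only plain Swift values afterwards.
func snapshotUnsavedModels(in context: NSManagedObjectContext) -> [[String: Any]] {
    var snapshots = [[String: Any]]()
    context.performAndWait {
        let fetchRequest: NSFetchRequest<UnsavedModel> = UnsavedModel.fetchRequest()
        if let results = try? context.fetch(fetchRequest) {
            for result in results {
                // dictionaryWithValues(forKeys:) copies the current attribute values
                let keys = Array(result.entity.attributesByName.keys)
                snapshots.append(result.dictionaryWithValues(forKeys: keys))
            }
        }
    }
    return snapshots
}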
When I try to access the cloud database, it says No Containers.
I have everything set up correctly. Everything in the pics below with the gray outline has the same exact identifier: iCloud.com.myCo.myApp
Entitlements:
Xcode:
developer.apple:
I did a deep clean, closed Xcode, reopened it, did another deep clean, and rebuilt. Still No Containers.
This is a paid account.
I'm running Xcode 12.4, testing on iOS 13 and 14.
How can a user be logged in (iCloud Available) but still Not Authenticated?
When I fetch a record, the CKErrorCode is NotAuthenticated and it prints "Couldn't get an authentication token".
let predicate = NSPredicate(format: "id == %@", id)
let query = CKQuery(recordType: RecordType.Media, predicate: predicate)
CKContainer.default().publicCloudDatabase.perform(query, inZoneWith: nil) { (ckRecords, error) in
    if let err = error {
        let code = err._code
        // error code is 9 - case NotAuthenticated /* Not authenticated (writing without being logged in, no user record) */
        return
    }
    // ...
}
But when I check to see if the user is logged in, I use the code below, and in both situations it prints iCloud Available
1-
if FileManager.default.ubiquityIdentityToken != nil {
    print("iCloud Available")
} else {
    print("iCloud Unavailable")
}
2-
CKContainer.default().accountStatus { (accountStatus, error) in
    switch accountStatus {
    case .available:
        print("iCloud Available")
    case .noAccount:
        print("No iCloud account")
    case .restricted:
        print("iCloud restricted")
    case .couldNotDetermine:
        print("Unable to determine iCloud status")
    @unknown default:
        print("Unable to determine iCloud status")
    }
}
FYI, this seems to happen in this order:
1 - I'm home on Wi-Fi, logged into iCloud; everything works fine.
2 - I leave my house and switch to a hotspot; I'm still logged into iCloud and everything works fine.
3 - I come back into my house and switch back to Wi-Fi; of course I'm still logged in, and then the above issue occurs. It's as if it wants me to log in again even though I'm already logged in and it says iCloud Available.
This issue happens on both the real device and the simulator.
UPDATE
I just found this post; it seems lots of devs are having this same problem.
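What I plan to try next (a sketch; I don't know yet whether it helps): re-check accountStatus whenever the system posts CKAccountChanged, and only retry the query once it reports .available again.

import CloudKit

// Sketch: re-validate the account whenever CloudKit says it changed.
NotificationCenter.default.addObserver(forName: .CKAccountChanged,
                                       object: nil,
                                       queue: .main) { _ in
    CKContainer.default().accountStatus { status, error in
        if status == .available {
            // retryFetch() // hypothetical: re-run the failed CKQuery here
        } else {
            print("account not available:", status.rawValue, error ?? "no error")
        }
    }
}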
In both situations below I'm successfully connected to Wi-Fi. Using Safari and Chrome (I have a VPN on Chrome) on my Mac, I can successfully connect to Google, Apple, and YouTube.
When I use an actual device to connect to any website using WKWebView, everything works fine.
But when I try to connect to any website (including Google, Apple, and YouTube) using WKWebView on the simulator, I get this error:
func webView(_ webView: WKWebView, didFailProvisionalNavigation navigation: WKNavigation!, withError error: Error) {
    let code = error._code
    print(code) // prints 1200
    print(error.localizedDescription) // An SSL error has occurred and a secure connection to the server cannot be made
}
Why can I successfully connect using a real device but not the simulator? Btw, this just started happening today; it never happened before and I didn't change any Xcode settings.
I found an answer that said to go to Settings > Developer > Allow HTTP Services in the simulator and toggle it on. It still doesn't work, whether on or off.
I also tried this, which works fine on a real device using WKWebView:
<key>NSAppTransportSecurity</key>
<dict>
<key>NSAllowsArbitraryLoads</key>
<true/>
</dict>
But on the simulator using WKWebView the website still doesn't show, and I get the error:
The certificate for this server is invalid. You might be connecting to a server that is pretending to be "www.google.com", which could put your confidential information at risk.
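To see which host and authentication method are actually failing (and whether the VPN on my Mac is injecting its own certificate), I'm adding this diagnostic delegate method; a sketch that just logs and falls back to default handling:

func webView(_ webView: WKWebView,
             didReceive challenge: URLAuthenticationChallenge,
             completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
    // Log which host/method is failing; helps tell a VPN/proxy certificate
    // from a genuinely broken chain.
    print("auth challenge:", challenge.protectionSpace.host,
          challenge.protectionSpace.authenticationMethod)
    completionHandler(.performDefaultHandling, nil)
}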
I have an audio URL (.m4a) that I create using AVAudioRecorder. I want to share that audio on Instagram, so I convert the audio to a video. The issue is that after the conversion, when I save the video URL to the Files app using the UIActivityViewController, I can replay the video, see the playback time (e.g. 7 seconds), and hear the audio with no problem; a black screen with a sound icon appears.
But when I save the same exact converted audio-video file to the Photos Library using the UIActivityViewController, inside the Photos Library the video shows the 7 seconds, but nothing plays, the video is all gray, and the sound icon doesn't show.
Why is the video successfully saving/playing in the Files app but saving and not playing in the Photos Library?
I tried setting exporter.outputFileType to both .mov and .mp4 and the issue is exactly the same.
let asset: AVURLAsset = AVURLAsset(url: audioURL)
let mixComposition = AVMutableComposition()
guard let compositionTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: CMPersistentTrackID()) else { return }
let track = asset.tracks(withMediaType: .audio)
guard let assetTrack = track.first else { return }
do {
    try compositionTrack.insertTimeRange(CMTimeRangeMake(start: .zero, duration: assetTrack.timeRange.duration), of: assetTrack, at: .zero)
} catch {
    print(error.localizedDescription)
}
guard let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough) else { return }
let dirPath = NSTemporaryDirectory().appending("\(UUID().uuidString).mov")
let outputFileURL = URL(fileURLWithPath: dirPath)
exporter.outputFileType = .mov // I also tried .mp4
exporter.outputURL = outputFileURL
exporter.shouldOptimizeForNetworkUse = true
exporter.exportAsynchronously {
    switch exporter.status {
    case .completed:
        guard let videoURL = exporter.outputURL else { return }
        // present UIActivityViewController to save videoURL, then save it to the Photos Library via 'Save Video'
    default:
        break
    }
}
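To surface the actual error Photos reports instead of a silent gray thumbnail, one thing I can try (a sketch using PhotosKit, not something in my current code) is saving the exported file programmatically:

import Photos

// Sketch: save the exported file programmatically so Photos' error,
// if any, is delivered to the completion handler.
func saveToPhotos(_ videoURL: URL) {
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: videoURL)
    }) { success, error in
        print("saved:", success, "error:", error ?? "none")
    }
}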
I followed the Ray Wenderlich tutorial to merge videos. The finished result is one merged video where portrait videos appear at the top of the screen and landscape videos at the bottom. In the image below, the portrait video plays first and the landscape video plays after it. The landscape video is from the Photos Library.
code:
let mixComposition = AVMutableComposition()
let videoCompositionTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
let audioCompositionTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
var count = 0
var insertTime = CMTime.zero
var instructions = [AVMutableVideoCompositionInstruction]()

for videoAsset in arrOfAssets {
    let audioTrack = videoAsset.tracks(withMediaType: .audio)[0]
    do {
        try videoCompositionTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.duration), of: videoAsset.tracks(withMediaType: .video)[0], at: insertTime)
        try audioCompositionTrack?.insertTimeRange(CMTimeRangeMake(start: .zero, duration: videoAsset.duration), of: audioTrack, at: insertTime)
        let layerInstruction = videoCompositionInstruction(videoCompositionTrack!, asset: videoAsset, count: count)
        let videoCompositionInstruction = AVMutableVideoCompositionInstruction()
        videoCompositionInstruction.timeRange = CMTimeRangeMake(start: insertTime, duration: videoAsset.duration)
        videoCompositionInstruction.layerInstructions = [layerInstruction]
        instructions.append(videoCompositionInstruction)
        insertTime = CMTimeAdd(insertTime, videoAsset.duration)
        count += 1
    } catch { }
}

let videoComposition = AVMutableVideoComposition()
videoComposition.instructions = instructions
videoComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
videoComposition.renderSize = CGSize(width: UIScreen.main.bounds.width, height: UIScreen.main.bounds.height)
// ...
exporter.videoComposition = videoComposition
AVMutableVideoCompositionLayerInstruction:
func videoCompositionInstruction(_ track: AVCompositionTrack, asset: AVAsset, count: Int) -> AVMutableVideoCompositionLayerInstruction {
    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    let assetTrack = asset.tracks(withMediaType: .video)[0]
    let transform = assetTrack.preferredTransform
    let assetInfo = orientationFromTransform(transform)
    var scaleToFitRatio = UIScreen.main.bounds.width / assetTrack.naturalSize.width
    if assetInfo.isPortrait {
        scaleToFitRatio = UIScreen.main.bounds.width / assetTrack.naturalSize.height
        let scaleFactor = CGAffineTransform(scaleX: scaleToFitRatio, y: scaleToFitRatio)
        instruction.setTransform(assetTrack.preferredTransform.concatenating(scaleFactor), at: .zero)
    } else {
        let scaleFactor = CGAffineTransform(scaleX: scaleToFitRatio, y: scaleToFitRatio)
        var concat = assetTrack.preferredTransform.concatenating(scaleFactor)
            .concatenating(CGAffineTransform(translationX: 0, y: UIScreen.main.bounds.width / 2))
        if assetInfo.orientation == .down {
            let fixUpsideDown = CGAffineTransform(rotationAngle: CGFloat(Double.pi))
            let windowBounds = UIScreen.main.bounds
            let yFix = assetTrack.naturalSize.height + windowBounds.height
            let centerFix = CGAffineTransform(translationX: assetTrack.naturalSize.width, y: yFix)
            concat = fixUpsideDown.concatenating(centerFix).concatenating(scaleFactor)
        }
        instruction.setTransform(concat, at: .zero)
    }
    if count == 0 {
        instruction.setOpacity(0.0, at: asset.duration)
    }
    return instruction
}
Orientation:
func orientationFromTransform(_ transform: CGAffineTransform) -> (orientation: UIImage.Orientation, isPortrait: Bool) {
    var assetOrientation = UIImage.Orientation.up
    var isPortrait = false
    let tfA = transform.a
    let tfB = transform.b
    let tfC = transform.c
    let tfD = transform.d
    if tfA == 0 && tfB == 1.0 && tfC == -1.0 && tfD == 0 {
        assetOrientation = .right
        isPortrait = true
    } else if tfA == 0 && tfB == -1.0 && tfC == 1.0 && tfD == 0 {
        assetOrientation = .left
        isPortrait = true
    } else if tfA == 1.0 && tfB == 0 && tfC == 0 && tfD == 1.0 {
        assetOrientation = .up
    } else if tfA == -1.0 && tfB == 0 && tfC == 0 && tfD == -1.0 {
        assetOrientation = .down
    }
    return (assetOrientation, isPortrait)
}
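A sketch of my own (not from the tutorial) for how the scaled track might be recentered in the render rect instead of being pinned to the top or bottom; it aspect-fits the oriented size and then translates to center:

func centeredTransform(for assetTrack: AVAssetTrack, renderSize: CGSize) -> CGAffineTransform {
    let preferred = assetTrack.preferredTransform
    // Bounding box of the track once its preferred transform is applied.
    let rect = CGRect(origin: .zero, size: assetTrack.naturalSize).applying(preferred)
    let orientedSize = CGSize(width: rect.width, height: rect.height)
    // Move the transformed video so its top-left sits at the origin.
    let toOrigin = CGAffineTransform(translationX: -rect.minX, y: -rect.minY)
    // Aspect-fit scale into the render size.
    let scale = min(renderSize.width / orientedSize.width,
                    renderSize.height / orientedSize.height)
    let scaled = CGAffineTransform(scaleX: scale, y: scale)
    // Center the scaled video in the render rect.
    let center = CGAffineTransform(translationX: (renderSize.width - orientedSize.width * scale) / 2,
                                   y: (renderSize.height - orientedSize.height * scale) / 2)
    return preferred.concatenating(toOrigin).concatenating(scaled).concatenating(center)
}

Then instruction.setTransform(centeredTransform(for: assetTrack, renderSize: renderSize), at: .zero) would replace the per-orientation branches above, assuming renderSize matches the videoComposition.renderSize set earlier.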