Posts

Post not yet marked as solved
1 Reply
482 Views
I've been using AVAssetExportSession to trim audio files for the past 2 years, and suddenly it stopped working properly. It still works fine when I run my app on a phone running iOS 16, but on my iOS 17 phone it exports an incorrect duration (e.g. I'll provide a file with a 2-second duration, ask it to trim to 0 - 1.7 s, and it'll return the file overtrimmed at 1.58 s or something like that). The AVURLAsset is returning the correct duration, and I've already tried using the AVURLAssetPreferPreciseDurationAndTimingKey; it's useless to me, as the error happens somewhere during the export.

```swift
guard let exportSession = AVAssetExportSession(asset: asset,
                                               presetName: AVAssetExportPresetAppleM4A) else {
    completion(false, nil)
    return
}

let startTime = CMTimeMakeWithSeconds(floor(startPoint * 100) / 100.0, preferredTimescale: 44100)
let stopTime = CMTimeMakeWithSeconds(ceil(endPoint * 100) / 100.0, preferredTimescale: 44100)
let exportTimeRange = CMTimeRange(start: startTime, end: stopTime)
exportSession.timeRange = exportTimeRange
exportSession.outputFileType = .m4a
exportSession.outputURL = targetURL
AudioHelper.deleteFile(at: exportSession.outputURL)
exportSession.exportAsynchronously { ... }
```

I've managed to somewhat mitigate the damage by adding silence to the file and continuously trimming it until I get it close to my required duration, but it's an extremely ugly hack and it's breaking down the whole functionality of my app.
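Edit: for anyone debugging something similar, it's worth ruling out the CMTime math before blaming the exporter. CMTimeMakeWithSeconds rounds the seconds value to the nearest tick of the timescale, so at 44100 the requested range is accurate to about one sample. A plain-Swift sketch of that rounding (no AVFoundation involved; the 1.7 s endpoint is just the example from above):

```swift
// Mirrors what CMTimeMakeWithSeconds does: the stored value is
// round(seconds * timescale), and the represented time is value / timescale.
func snappedToTimescale(_ seconds: Double, timescale: Double) -> Double {
    (seconds * timescale).rounded() / timescale
}

let start = snappedToTimescale(0.0, timescale: 44100)
let end = snappedToTimescale(1.7, timescale: 44100)
let requested = end - start
// requested is 1.7 s to within one tick (~0.023 ms), so a 1.58 s result
// means the truncation happens inside the exporter, not in the time range.
print(requested)
```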
Post not yet marked as solved
0 Replies
471 Views
Hi, I'm using an AVMutableComposition in order to add 0.1 seconds of sound from an .aac file to an .m4a file, then exporting that composition via AVAssetExportSession (force-unwraps kept for brevity):

```swift
// after loadValuesAsynchronously has loaded tracks & duration
// for both silenceAsset and finalAsset
let sourceAudioTrack = finalAsset.tracks(withMediaType: .audio).first!
let silenceAudioTrack = silenceAsset.tracks(withMediaType: .audio).first!
let track = composition.addMutableTrack(withMediaType: .audio,
                                        preferredTrackID: kCMPersistentTrackID_Invalid)
let startTime = CMTime(seconds: 0, preferredTimescale: 1)
let trackDuration = finalAsset.duration
let tRange = CMTimeRange(start: startTime, duration: trackDuration)

let trackMix = AVMutableAudioMixInputParameters(track: track)
trackMix.setVolume(1, at: startTime)

do {
    try track.insertTimeRange(tRange, of: sourceAudioTrack, at: startTime)
    let endRange = CMTimeRange(start: startTime,
                               duration: CMTimeMakeWithSeconds(0.1, preferredTimescale: 100))
    try track.insertTimeRange(endRange, of: silenceAudioTrack, at: trackDuration)
} catch {}

let exporter = AVAssetExportSession(asset: composition,
                                    presetName: AVAssetExportPresetAppleM4A)!
let audioMix = AVMutableAudioMix()
audioMix.inputParameters = [trackMix]
exporter.audioMix = audioMix
exporter.outputFileType = .m4a
exporter.outputURL = URL(fileURLWithPath: "whatever.m4a")
exporter.exportAsynchronously {
    guard exporter.status == .completed, let outputURL = exporter.outputURL else {
        completion(nil)
        return
    }
    completion(outputURL)
}
```

The file resulting from the export session doesn't always have the duration expected and reported by track.timeRange.end.seconds right after adding all the time ranges. In my specific case, I have the unadulterated m4a file at 1.87 seconds. I take it through this process once and get a file of 1.97 seconds. I take this new file and do it again: track.timeRange reports a total duration of 2.07 seconds, which is correct, but the exported file has 2.136 s. I do it again with the exported file: track.timeRange reports 2.236 s, the exported file has 2.321 s. I've never encountered this issue before, and I have been trimming/appending to audio files with this method for 2 years now. Is it a system bug? Am I missing something?
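Edit: to quantify the drift, here's the arithmetic over the durations above (plain Swift; 44.1 kHz is my assumption about these files' sample rate). If the extra time were pure AAC encoder padding, I'd expect it to land near multiples of the 1024-sample AAC frame; it doesn't cleanly, so treat that as a hypothesis only:

```swift
// Durations from the passes described above: what track.timeRange
// reported before export vs. what the exported file actually measured.
let reported: [Double] = [1.97, 2.07, 2.236]
let exported: [Double] = [1.97, 2.136, 2.321]

// Per-pass drift in seconds, and in samples at an assumed 44.1 kHz.
let driftSeconds = zip(exported, reported).map { $0.0 - $0.1 }
let driftSamples = driftSeconds.map { $0 * 44100 }
// Roughly 0, ~2911, and ~3749 samples, i.e. ~2.8 and ~3.7 AAC frames
// of 1024 samples each, so not a clean frame multiple.
print(driftSeconds, driftSamples)
```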
Post not yet marked as solved
1 Reply
556 Views
I'm developing an app that also has a web component. This web component allows users to pay for a subscription that provides them with credits that can be used for several features within the software. Features aren't inherently locked behind these credits. My question is: do I need to implement this subscription in my app as an in-app purchase, or can I just allow users to manage their subscriptions by sending them to the web? Guideline 3.1.1 mentions that "Apps and their metadata may not include buttons, external links, or other calls to action that direct customers to purchasing mechanisms other than in-app purchase", but I would not be sending them directly to a payment page, rather to a subscription-management or pricing-overview page. Would my app get rejected if I were to just send users to the web page for subscription management?
Post not yet marked as solved
1 Reply
751 Views
Hi. I'm currently using WKWebView to load our application's SSO page through Microsoft, and I need to access the device identity certificates for NSURLAuthenticationMethodClientCertificate in order to bypass MFA via device trust. I've read it would be easier with SFSafariViewController, but that's not an option for me, as I need to set custom cookies prior to loading the page, which I understand is not possible with SFSafariViewController. Is there any way to access these certificates through WKWebView? Or is there anything else I can use that would let me both set custom cookies and access device identity certificates?
Post not yet marked as solved
1 Reply
799 Views
Hi. I do some audio editing in my app, so I load audio files into AVURLAssets and then use loadValuesAsynchronously in order to load their tracks and durations. However, sometimes AVURLAsset stops working completely, returning an error with the code "dta?" when calling loadValuesAsynchronously. I know that would normally mean corrupt audio, but these are the exact same unaltered audio files, so that can't be the issue. Not to mention that a device reboot fixes the issue right away, but that's obviously not a solution I can tell users to use. Has anyone been encountering this? Edit: forgot to mention, as it might be relevant: I do use AVAssetExportSession after loading said assets' properties.
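Edit 2: in case it helps anyone searching for this, the four-character shape of "dta?" suggests a Core Audio OSStatus. A plain-Swift helper (no frameworks needed) can decode a numeric status back into its character form for matching against the AudioFile error constants; if I'm reading the headers right, 'dta?' corresponds to kAudioFileInvalidFileError, which would fit a transient media-services failure that a reboot clears, but treat that mapping as my best guess:

```swift
import Foundation

// Decode a FourCC-style OSStatus (Int32) into its 4-character form,
// e.g. 0x6474613F -> "dta?". Non-printable bytes fall back to the number.
func fourCC(_ status: Int32) -> String {
    let n = UInt32(bitPattern: status)
    let bytes: [UInt8] = [24, 16, 8, 0].map { UInt8(truncatingIfNeeded: n >> $0) }
    guard bytes.allSatisfy({ (0x20...0x7E).contains($0) }),
          let s = String(bytes: bytes, encoding: .ascii) else {
        return String(status)
    }
    return s
}

print(fourCC(0x6474613F)) // "dta?"
```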
Post marked as solved
1 Reply
1.9k Views
I've recently updated to iOS 16.2 beta 4 and now I can't use AVAudioRecorder at all. Calling the .record() function always returns false. There's nothing wrong when calling .prepareToRecord() or when setting the AVAudioSession's category to .record. What's more, if I start a screen recording with the mic enabled, and then make my way to the point where the recorder is initialized, it'll work. In any other case, like disabling the mic for the screen recording, or starting the screen recording AFTER the recorder was initialized, it will not work. Has anyone run into this so far?
Post not yet marked as solved
0 Replies
591 Views
I've had multiple reports from users that when they tapped a cell in the UITableView, the app did nothing for 5-10 seconds. I tried fixing this, with no results, but once they restarted their phone, the app worked normally again, with no delay whatsoever after tapping cells. Is this a known iOS 14 bug?
Post not yet marked as solved
0 Replies
853 Views
I've been receiving multiple reports, and video evidence, from users about music glitching out or skipping mid-playback. My app mixes two AVAudioPlayers (one representing recorded audio, the other representing music) when the music source is iTunes-purchased music, music found in the Files app, or music created specifically for our app and provided in our bundle; it mixes an AVAudioPlayer with the MPMusicPlayerApplicationController.applicationMusicPlayer when the music source is an Apple Music subscription. Regardless of the music source, users seem to be encountering this issue. I have been unable to replicate it at all, yet I'm receiving a lot of tickets about it (and another issue where the app plays random songs instead of the ones selected, which makes no sense, as I hadn't updated the app in months when that issue started arising). Has anyone encountered something like this? One common denominator across all the tickets: even though some users could replicate the issue on their iPhones, ALL of them could replicate it and were reporting it on their iPads with iPadOS 14.3/14.4.1/14.4.2.
Post marked as solved
19 Replies
4k Views
I'm using the MPMediaLibrary's addItem(withProductID:completionHandler:) function: https://developer.apple.com/documentation/mediaplayer/mpmedialibrary/1621278-additem but the completion is never called, no matter where I call it in my code, regardless of whether the item was already in the user's library. Has the API changed or is this a beta issue?
Post not yet marked as solved
1 Reply
1.5k Views
Hi, I've noticed that the MusicKit documentation contains the definition of the "setItemPositionProvider" method, which allows you to set the starting position of a media item in the Apple Music player's queue. But this method is not present in the 1.0.0 version of the SDK, which launched on 31 July 2019. Is there any other way to set the start point of a song, such as we do on iOS with the currentPlaybackTime property? The seekTo method is extremely flawed, as it takes a long time to buffer the song and to actually get to the desired point of the song. Also, is there any way of using the player without a foreground service?
Post not yet marked as solved
0 Replies
662 Views
The following link states that all Android apps must support 64-bit architectures starting August 1: https://developer.android.com/distribute/best-practices/develop/64-bit From what we've noticed, MusicKit is a 32-bit library, which makes it impossible to include in our app on the Play Store. Any news on support for this?