Posts

Post marked as solved
1 Reply
1.5k Views
I would like someone's help to understand the correct use of [weak self] in nested escaping closures. It would be great if someone could tell me whether what I'm doing below is correct. Thank you.

import UIKit

class MyClass {
    var closingTimer: Timer?

    func f1() {
        /**
         Some work here
         */
    }

    func f2() {
        closingTimer = Timer.scheduledTimer(withTimeInterval: TimeInterval.init(5), repeats: true, block: { [weak self] (t: Timer) in
            /**
             Some work here
             */

            if let _ = self {
                DispatchQueue.main.async { [weak self] in
                    self?.f1()
                }
            }
        })
    }
}
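For comparison, here is a minimal sketch of one commonly used variant of this pattern, not an authoritative answer to the question above: capture self weakly only in the outermost escaping closure and promote it with guard let, so the inner closure needs no capture list of its own. The class name MyClassAlternative is just a label for this sketch.

import UIKit

class MyClassAlternative {
    var closingTimer: Timer?

    func f1() { /* Some work here */ }

    func f2() {
        // Capture self weakly once, in the outermost escaping closure.
        closingTimer = Timer.scheduledTimer(withTimeInterval: 5, repeats: true) { [weak self] _ in
            // Promote to a strong reference for this tick only;
            // if self has already been deallocated, skip the work.
            guard let self = self else { return }
            DispatchQueue.main.async {
                // This inner closure holds self strongly, but only until
                // the dispatched block finishes running.
                self.f1()
            }
        }
    }
}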
Post not yet marked as solved
0 Replies
1.1k Views
In my app I'm loading a website in a WKWebView. This website opens a WebSocket connection (CometD3). My problem is that the connection drops as soon as the app enters the background. I am aware that Apple requires developers to request special capabilities to keep socket connections open in the background, and my app doesn't fit the description of an app that would qualify. The requirement is not to keep the connection alive forever but to avoid reconnecting when the user switches between apps or is briefly interrupted (e.g. a phone call or text message); I'm looking at about 30 seconds here. I have already tried https://developer.apple.com/documentation/uikit/app_and_environment/scenes/preparing_your_ui_to_run_in_the_background/extending_your_app_s_background_execution_time but it didn't work. Also, I did not have this problem with UIWebView.
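For reference, this is roughly the background-task pattern from that article (a minimal sketch; the class name, the task name, and the call sites are my own and not from the original post):

import UIKit

final class SocketBackgroundKeeper {
    private var backgroundTaskID: UIBackgroundTaskIdentifier = .invalid

    // Call when the app/scene moves to the background.
    func beginKeepingSocketAlive() {
        backgroundTaskID = UIApplication.shared.beginBackgroundTask(withName: "KeepWebSocketAlive") { [weak self] in
            // Expiration handler: the system is about to suspend the app.
            self?.endKeepingSocketAlive()
        }
    }

    // Call when the app/scene returns to the foreground, or when the work is done.
    func endKeepingSocketAlive() {
        guard backgroundTaskID != .invalid else { return }
        UIApplication.shared.endBackgroundTask(backgroundTaskID)
        backgroundTaskID = .invalid
    }
}

A background task like this typically buys around 30 seconds of execution time, but as noted above it did not keep the WKWebView's WebSocket connection alive.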
Post not yet marked as solved
0 Replies
1k Views
I have this theory that AVMutableComposition does not work without a video track. I'm trying to change the speed of an audio track (mp3) using the func scaleTimeRange(_ timeRange: CMTimeRange, toDuration duration: CMTime) function. Below is the code I have so far:

static func changeSpeed(audio: AVURLAsset, factor: Float64, outputURL: URL, completionHandler: @escaping ((_ success: Bool) -> Void)) {
    print(#function + " started: " + Date().description)

    guard factor > 0 else {
        DispatchQueue.main.async {
            print(#function + ": Speed factor must be a positive value")
            completionHandler(false)
        }
        return
    }

    guard let assetAudioTrack = audio.tracks(withMediaType: .audio).first else {
        DispatchQueue.main.async {
            print(#function + ": No asset track found")
            completionHandler(false)
        }
        return
    }

    let composition = AVMutableComposition()
    let audioTrack = composition.addMutableTrack(
        withMediaType: AVMediaType.audio,
        preferredTrackID: kCMPersistentTrackID_Invalid
    )

    do {
        try audioTrack?.insertTimeRange(
            CMTimeRangeMake(start: CMTime.zero, duration: audio.duration),
            of: assetAudioTrack,
            at: CMTime.zero
        )
    } catch {
        DispatchQueue.main.async {
            print(#function + ": Failed to add " + audio.url.absoluteString)
            completionHandler(false)
        }
        return
    }

    let exporter = AVAssetExportSession(
        asset: composition,
        presetName: AVAssetExportPresetAppleM4A
    )
    exporter?.outputURL = outputURL
    exporter?.outputFileType = AVFileType.m4a

    audioTrack?.scaleTimeRange(
        CMTimeRange(start: CMTime.zero, end: audio.duration),
        toDuration: CMTimeMultiplyByFloat64(audio.duration, multiplier: 1 / factor)
    )

    exporter?.exportAsynchronously {
        if exporter?.status == .completed {
            DispatchQueue.main.async {
                completionHandler(true)
                print(#function + " ended: " + Date().description)
            }
        } else {
            DispatchQueue.main.async {
                completionHandler(false)
                print(#function + " ended with error: " + Date().description)
                if let errorString = exporter?.error.debugDescription {
                    print(errorString)
                }
            }
        }
    }
}

The above code fails with or without the scaleTimeRange call. That is why I'm thinking that AVMutableComposition will work only when there are video tracks. I did play around a little by changing the preset and output file type. Below are my observations:

preset: AVAssetExportPresetAppleM4A
file type: AVFileType.m4a

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x281164960 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}

preset: AVAssetExportPresetPassthrough
file type: AVFileType.m4a

Error Domain=AVFoundationErrorDomain Code=-11822 "Cannot Open" UserInfo={NSLocalizedFailureReason=This media format is not supported., NSLocalizedDescription=Cannot Open, NSUnderlyingError=0x28178fc30 {Error Domain=NSOSStatusErrorDomain Code=-16976 "(null)"}}

I printed the supported file types of AVAssetExportSession when the preset is AVAssetExportPresetPassthrough:

print(#function + ": Supported file types -> " + String.init(format: "%@", exporter?.supportedFileTypes ?? []))

Below is the output of the above. I can clearly see m4a in the list:

"com.apple.quicktime-movie", "com.apple.m4a-audio", "public.mpeg-4", "com.apple.m4v-video", "public.3gpp", "org.3gpp.adaptive-multi-rate-audio", "com.microsoft.waveform-audio", "public.aiff-audio", "public.aifc-audio", "com.apple.coreaudio-format"

I need help building a function that changes the speed of an audio track. The audio track could be an asset in the bundle or one picked from the MPMediaPickerController. Thanks!
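As a point of comparison, here is a stripped-down sketch (not from the original post; the function name is mine) that isolates the audio-only question: it only copies the audio track into an AVMutableComposition and exports it as .m4a, with no scaleTimeRange call. It also removes any existing file at the output URL first, since AVAssetExportSession fails if a file already exists at the destination.

import AVFoundation

// Minimal check: copy an audio-only asset into an AVMutableComposition
// and export it as .m4a, without any time scaling.
func exportAudioOnlyComposition(from asset: AVURLAsset,
                                to outputURL: URL,
                                completion: @escaping (Bool) -> Void) {
    guard let sourceTrack = asset.tracks(withMediaType: .audio).first else {
        completion(false)
        return
    }

    let composition = AVMutableComposition()
    guard let compositionTrack = composition.addMutableTrack(
        withMediaType: .audio,
        preferredTrackID: kCMPersistentTrackID_Invalid
    ) else {
        completion(false)
        return
    }

    do {
        try compositionTrack.insertTimeRange(
            CMTimeRange(start: .zero, duration: asset.duration),
            of: sourceTrack,
            at: .zero
        )
    } catch {
        completion(false)
        return
    }

    // AVAssetExportSession fails if a file already exists at outputURL.
    try? FileManager.default.removeItem(at: outputURL)

    guard let exporter = AVAssetExportSession(
        asset: composition,
        presetName: AVAssetExportPresetAppleM4A
    ) else {
        completion(false)
        return
    }
    exporter.outputURL = outputURL
    exporter.outputFileType = .m4a

    exporter.exportAsynchronously {
        completion(exporter.status == .completed)
    }
}

If this minimal version exports successfully, the problem is more likely in the surrounding setup (for example the output URL) than in audio-only compositions as such.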