
Post not yet marked as solved · 5 Replies · 1.8k Views
Hi, I'm trying to display a string containing the number of the largest given time unit for the time since a given date - i.e. "2 Months Ago" or "1 Week Ago". DateComponentsFormatter appears to be a useful tool for this (the app targets iOS 10.2), but it doesn't seem to format the number of weeks consistently.

This is how I'm setting up the formatter:

```swift
import Foundation

let day = 60 * 60 * 24
let formatter = DateComponentsFormatter()
formatter.calendar = Calendar.current
formatter.allowedUnits = [.hour, .day, .weekOfMonth, .month]
formatter.unitsStyle = .full
formatter.maximumUnitCount = 1
formatter.allowsFractionalUnits = false
formatter.zeroFormattingBehavior = .dropAll
```

For the given range, I would expect 8-13 to use "1 week", but this doesn't seem to be the case:

```swift
(0..<40).forEach {
  print("\($0): \(formatter.string(from: TimeInterval(day * $0))!)")
}
```

The actual output of this is:

```
0: 0 hours
1: 1 day
2: 2 days
3: 3 days
4: 4 days
5: 5 days
6: 6 days
7: 1 week
8: 2 weeks
9: 2 weeks
10: 2 weeks
11: 2 weeks
12: 2 weeks
13: 1 week
14: 2 weeks
15: 2 weeks
16: 2 weeks
17: 2 weeks
18: 2 weeks
19: 2 weeks
20: 2 weeks
21: 3 weeks
22: 3 weeks
23: 3 weeks
24: 3 weeks
25: 3 weeks
26: 3 weeks
27: 3 weeks
28: 4 weeks
29: 4 weeks
30: 4 weeks
31: 1 month
32: 1 month
33: 1 month
34: 1 month
35: 1 month
36: 1 month
37: 1 month
38: 1 month
39: 1 month
```

Could anyone point me in the right direction?
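For comparison, here is a sketch of a workaround idea (my own naming, not part of the original setup): compute the largest elapsed unit directly with `Calendar.dateComponents(_:from:to:)`, which decomposes the interval into whole units without the formatter's internal rounding.

```swift
import Foundation

// Sketch: derive the largest whole unit between two dates manually,
// sidestepping DateComponentsFormatter's rounding. The function name is
// illustrative; .weekOfMonth is used here to mirror the formatter setup.
func largestUnitString(from start: Date, to end: Date,
                       calendar: Calendar = .current) -> String {
    let comps = calendar.dateComponents([.month, .weekOfMonth, .day, .hour],
                                        from: start, to: end)
    if let m = comps.month, m > 0 { return "\(m) month\(m == 1 ? "" : "s")" }
    if let w = comps.weekOfMonth, w > 0 { return "\(w) week\(w == 1 ? "" : "s")" }
    if let d = comps.day, d > 0 { return "\(d) day\(d == 1 ? "" : "s")" }
    let h = comps.hour ?? 0
    return "\(h) hour\(h == 1 ? "" : "s")"
}

// Usage: 10 days ago decomposes as 1 week + 3 days, so the largest unit
// should come out as "1 week".
let now = Date()
let tenDaysAgo = Calendar.current.date(byAdding: .day, value: -10, to: now)!
print(largestUnitString(from: tenDaysAgo, to: now))
```

This also avoids handing the formatter a raw `TimeInterval`, which forces it to guess calendar boundaries rather than measure between two concrete dates.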
Posted by robham.
Post not yet marked as solved · 2 Replies · 1.2k Views
Hi, I have an app attempting to do some video playback and editing, but I'm having an issue where, with some videos, operations like AVQueuePlayer/AVPlayerItem.seek will cause all players to move to .failed and just show black. The only indication the OS gives the app that this has happened is some logs relating to the haptic engine:

```
[hcln]     AVHapticClient.mm:1309 -[AVHapticClient handleServerConnectionInterruption]: [xpc] Entered (due to connection interruption) for client ID 0x100031e
[hapi]     CHHapticEngine.mm:614  -[CHHapticEngine finishInit:]_block_invoke: ERROR: Server connection broke with error 'The operation couldn’t be completed. (com.apple.CoreHaptics error -4811.)
```

In the device console, the most likely culprit seems to be this (reported by the kernel):

```
EXC_RESOURCE -> mediaserverd[768] exceeded mem limit: ActiveSoft 2500 MB (non-fatal)
```

If my assumption is correct, is it possible to mitigate this issue before mediaserverd and all the associated players get killed? My understanding is that AV memory is separate from the app's memory, so responding to the usual didReceiveMemoryWarningNotification isn't applicable in this case, and that notification doesn't seem to be sent before these failures.

Thanks
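As a point of reference, one way to at least detect the failure and rebuild afterwards can be sketched like this (`PlayerRecovery` is an illustrative name, not an existing API): KVO on each item's `status`, plus listening for the media-services reset notification.

```swift
import AVFoundation

// Sketch (illustrative names, not from the post above): detect player failure
// via KVO on AVPlayerItem.status, and watch for the media services daemon
// being reset so players and items can be recreated.
final class PlayerRecovery {
    private var observations: [NSKeyValueObservation] = []
    private var resetObserver: NSObjectProtocol?

    func watch(_ item: AVPlayerItem, onFailure: @escaping (Error?) -> Void) {
        observations.append(item.observe(\.status, options: [.new]) { item, _ in
            if item.status == .failed {
                // item.error describes the failure; after a mediaserverd
                // jetsam, recovery generally means recreating players/items.
                onFailure(item.error)
            }
        })
    }

    func watchMediaServicesReset(onReset: @escaping () -> Void) {
        resetObserver = NotificationCenter.default.addObserver(
            forName: AVAudioSession.mediaServicesWereResetNotification,
            object: nil, queue: .main) { _ in onReset() }
    }
}
```

AVAudioSession.mediaServicesWereResetNotification is posted when the media services daemon restarts, which can be a more direct signal than waiting for each item to flip to .failed.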
Posted by robham.
Post not yet marked as solved · 0 Replies · 588 Views
Hi, I'm looking to update the minimum deployment target of my project to an iOS 12 version, and I'm wondering why Xcode doesn't offer iOS 12.0 as an option in the dropdowns - the earliest iOS 12 version it suggests is 12.1. I can set it directly in the plist as 12.0, but I get the impression that Xcode doesn't want me to use 12.0. Does anyone know why?
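For what it's worth, the dropdown appears to be only a list of suggestions; the underlying build setting accepts an arbitrary version string when set by hand, e.g. in an xcconfig file (a sketch of one way to do it, alongside the plist approach mentioned above):

```
// Example .xcconfig fragment: sets the minimum deployment target directly,
// bypassing the dropdown's suggested versions.
IPHONEOS_DEPLOYMENT_TARGET = 12.0
```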
Posted by robham.
Post not yet marked as solved · 0 Replies · 756 Views
I'm looking to implement something fairly similar to Instagram, where the app switches to a .playback AVAudioSession when the user interacts with the hardware volume buttons. Observing the outputVolume of the AVAudioSession works in most cases, except when the user presses hardware volume up at max volume (or volume down at minimum, though that's less of an issue), as the volume doesn't actually change. I can implement this functionality by observing AVSystemController_SystemVolumeDidChangeNotification, but given that this is a private API I'd be risking the wrath of the App Store review process, which is something I'd like to avoid. Is there an approach I've missed that could work for me here? What are Instagram's secrets?
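For context, the outputVolume observation described above can be sketched like this (`VolumeObserver` is an illustrative name). The limitation is visible in the mechanism itself: KVO only fires when the value actually changes, so volume-up at max volume produces no callback.

```swift
import AVFoundation

// Sketch of the KVO approach described above (illustrative naming). Note that
// this callback fires only on an actual value change, so pressing volume-up
// while already at maximum volume is invisible to it.
final class VolumeObserver {
    private var observation: NSKeyValueObservation?

    func start(onChange: @escaping (Float) -> Void) throws {
        let session = AVAudioSession.sharedInstance()
        try session.setActive(true) // outputVolume is only valid on an active session
        observation = session.observe(\.outputVolume, options: [.new]) { _, change in
            if let volume = change.newValue {
                onChange(volume)
            }
        }
    }
}
```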
Posted by robham.
Post not yet marked as solved · 1 Reply · 731 Views
Using an AVQueuePlayer to display a number of videos in sequence, there seem to be issues with transitioning between items with different aspect ratios.

This is the view controller code:

```swift
class ExampleLayerViewController<T: PlayerView>: UIViewController {

  var playerView: T { view as! T }
  var viewModel = LayerViewModel()

  let buttonStack: UIStackView = {
    /* make stack view */
  }()

  let playButton: UIButton = {
    /* make button */
  }()

  let pauseButton: UIButton = {
    /* make button */
  }()

  let skip: UIButton = {
    /* make button */
  }()

  let previous: UIButton = {
    /* make button */
  }()

  override func viewDidLoad() {
    super.viewDidLoad()
    addConstraints()
    enqueue()
  }

  override func loadView() {
    view = T()
  }

  func addConstraints() {
    /* add buttons to stack and put stack at bottom */
  }

  func enqueue() {
    playerView.enqueue(with: viewModel.assets)
  }

  @objc func playButtonPressed() {
    playerView.play()
  }

  @objc func pauseButtonPressed() {
    playerView.pause()
  }

  @objc func previousButtonPressed() {
    playerView.previous()
  }

  @objc func nextButtonPressed() {
    playerView.skip()
  }
}
```

The view and view protocol code:

```swift
protocol PlayerView: UIView {
  func play()
  func pause()
  func skip()
  func previous()
  func enqueue(with urls: [URL])
  var state: PlayerState { get }
}

enum PlayerState {
  case playWhenReady, pause, stop, interrupted
}

class QueuePlayerView: UIView, PlayerView {

  var playerLayer: AVPlayerLayer? {
    return layer as? AVPlayerLayer
  }

  var queuePlayer: AVQueuePlayer!
  var state: PlayerState = .stop

  init() {
    super.init(frame: .zero)
    setupQueuePlayer()
  }

  required init?(coder: NSCoder) {
    fatalError("init(coder:) has not been implemented")
  }

  override static var layerClass: AnyClass {
    return AVPlayerLayer.self
  }

  override func layoutSubviews() {
    super.layoutSubviews()
    playerLayer?.frame = bounds
  }

  private func setupQueuePlayer() {
    queuePlayer = AVQueuePlayer()
    playerLayer?.player = queuePlayer
    playerLayer?.videoGravity = .resizeAspectFill
    queuePlayer?.actionAtItemEnd = .advance
  }

  func play() {
    state = .playWhenReady
    queuePlayer.play()
  }

  func pause() {
    state = .pause
    queuePlayer.pause()
  }

  func skip() {
    queuePlayer.advanceToNextItem()
  }

  func previous() { /* yeah I know */
    queuePlayer.seek(to: CMTime.zero)
  }

  func enqueue(with urls: [URL]) {
    urls.forEach {
      queuePlayer.insert(AVPlayerItem(url: $0), after: nil)
    }
  }
}
```

This is a transition between a portrait video and a landscape video, slowed down to 12% speed: giphy.com/gifs/P2pRbh6zgSlTuLnGXb

Any suggestions on what I can do to avoid this? I've looked into switching video layers to transition between items with some success, but I'd rather not fight the framework if I can avoid it.
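For reference, a crude version of the "switching video layers" idea mentioned above can be sketched as follows (illustrative names, not a complete implementation): observe currentItem and briefly hide the layer around the transition so the outgoing item's last frame never renders at the wrong aspect ratio.

```swift
import AVFoundation
import UIKit

// Sketch (illustrative naming): hide the AVPlayerLayer the moment the queue
// advances, then fade it back in shortly after, masking the frame where the
// layer still shows the previous item's aspect ratio.
final class ItemTransitionHider {
    private var observation: NSKeyValueObservation?

    func attach(to player: AVQueuePlayer, layer: AVPlayerLayer) {
        observation = player.observe(\.currentItem, options: [.new]) { _, _ in
            DispatchQueue.main.async {
                // Hide the stale frame immediately, without an implicit animation.
                CATransaction.begin()
                CATransaction.setDisableActions(true)
                layer.opacity = 0
                CATransaction.commit()
                // Restore after a short delay; the implicit CA animation fades
                // the new item's frame in. The 0.1s value is a guess to tune.
                DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
                    layer.opacity = 1
                }
            }
        }
    }
}
```

A fixed delay is obviously a heuristic; a more robust variant would wait on the new item's readiness before restoring opacity.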
Posted by robham.