Posts

Post not yet marked as solved
3 Replies
1.6k Views
I would expect CAMetalLayer.nextDrawable() to be cheap, taking well under 1 ms. Occasionally, however, it takes more than 5 ms, typically 7 ms to 13 ms, which is a very long time and causes rendering jank. How can we optimize this?
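A possible direction, offered only as a sketch: nextDrawable() blocks while every drawable in the layer's pool is still in flight, so measuring the stall and keeping the pool at its maximum size can help narrow the cause. The render-loop setup below is assumed for illustration.

import Metal
import QuartzCore

func configure(_ layer: CAMetalLayer, device: MTLDevice) {
    layer.device = device
    // The pool holds 2 or 3 drawables; 3 gives the most headroom
    // before nextDrawable() has to block.
    layer.maximumDrawableCount = 3
}

func renderFrame(layer: CAMetalLayer, queue: MTLCommandQueue) {
    let start = CACurrentMediaTime()
    guard let drawable = layer.nextDrawable() else { return }
    let waitedMs = (CACurrentMediaTime() - start) * 1000
    if waitedMs > 5 { print("nextDrawable() stalled for \(waitedMs) ms") }

    guard let commandBuffer = queue.makeCommandBuffer() else { return }
    // ... encode rendering work here ...
    commandBuffer.present(drawable)
    commandBuffer.commit()
}

Holding a drawable for longer than one frame, or encoding very long GPU passes, keeps the pool exhausted, which shows up exactly as a long wait inside nextDrawable().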
Posted by luckysmg. Last updated.
Post not yet marked as solved
1 Reply
1.1k Views
When metalLayer.presentsWithTransaction = true is set and we call drawable.present() on a background thread (our rendering thread is not the main thread), the app sometimes crashes with: *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Modifications to the layout engine must not be performed from a background thread after it has been accessed from the main thread.' Any solution?
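For what it's worth, Apple's documented pattern for presentsWithTransaction is to commit the command buffer, wait until it is scheduled, and only then present the drawable, so presentation happens synchronously as part of the current transaction. A minimal sketch, with the encoding details assumed:

import Metal
import QuartzCore

func draw(layer: CAMetalLayer, queue: MTLCommandQueue) {
    guard let drawable = layer.nextDrawable(),
          let commandBuffer = queue.makeCommandBuffer() else { return }

    // ... encode rendering work here ...

    commandBuffer.commit()
    // Block the rendering thread until the GPU work is scheduled,
    // then present synchronously instead of via commandBuffer.present(_:).
    commandBuffer.waitUntilScheduled()
    drawable.present()
}

If the crash persists, it may mean something else in the present path still touches UIKit layout, in which case that part has to be dispatched to the main thread.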
Posted by luckysmg. Last updated.
Post not yet marked as solved
1 Reply
694 Views
Is the top padding of UINavigationBar equal to UIApplication.shared.statusBarFrame.height or to UIApplication.shared.keyWindow?.safeAreaInsets.top? See the blue box in this image: https://github.com/luckysmg/daily_images/blob/main/20220928153515.jpg?raw=true The two values are the same on devices older than the iPhone 14 Pro, but they differ on the iPhone 14 Pro and iPhone 14 Pro Max. Which is the right value to use?
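Both statusBarFrame and keyWindow are deprecated, so a sketch of reading the two candidates through the window scene is shown below for comparison; how the window is obtained here is an assumption, not part of the original question.

import UIKit

func logTopInsets(for window: UIWindow) {
    // Status bar height, via the scene's status bar manager.
    let statusBarHeight = window.windowScene?.statusBarManager?.statusBarFrame.height ?? 0
    // Top safe-area inset of the window itself.
    let safeAreaTop = window.safeAreaInsets.top
    // On the iPhone 14 Pro / Pro Max these two values can differ.
    print("statusBar: \(statusBarHeight), safeAreaTop: \(safeAreaTop)")
}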
Posted by luckysmg. Last updated.
Post not yet marked as solved
1 Reply
1.2k Views
I add two views to the screen, one blue and one orange, and I use a CADisplayLink to update the orange view's frame from the blue view's presentation layer frame (blueBox.layer.presentation()!.frame). The orange view should cover the blue view completely, but during the keyboard animation there is a gap and a small part of the blue view is visible. Here is the demo code:

class SecondViewController: UIViewController {

  let textField = UITextField()

  override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
    self.textField.resignFirstResponder()
  }

  lazy var displayLink = CADisplayLink(target: self, selector: #selector(self.onDisplayLink))

  @objc private func onDisplayLink() {
    self.realBox.frame = CGRect(x: self.realBox.frame.origin.x,
                                y: self.blueBox.layer.presentation()!.frame.origin.y,
                                width: self.view.bounds.width,
                                height: 100)
  }

  let blueBox = UIView()
  let realBox = UIView()

  override func viewDidLoad() {
    super.viewDidLoad()
    self.view.backgroundColor = .white

    textField.frame = .init(x: 100, y: 100, width: 100, height: 100)
    textField.backgroundColor = .red
    self.view.addSubview(textField)

    realBox.backgroundColor = .orange

    blueBox.backgroundColor = .blue
    blueBox.frame = .init(x: 0, y: self.view.bounds.height - 100, width: self.view.bounds.width, height: 100)
    self.view.addSubview(blueBox)
    self.view.addSubview(realBox)
    realBox.frame = .init(x: 0, y: self.view.bounds.height - 100, width: self.view.bounds.width, height: 100)

    NotificationCenter.default.addObserver(forName: UIResponder.keyboardWillChangeFrameNotification, object: nil, queue: .main) { noti in
      let userInfo = noti.userInfo!
      let endFrame = userInfo[UIResponder.keyboardFrameEndUserInfoKey] as! CGRect
      let isOpen = endFrame.intersects(self.view.bounds)
      self.blueBox.frame = .init(x: 0,
                                 y: isOpen ? self.view.bounds.height - 100 - endFrame.height : self.view.bounds.height - 100,
                                 width: self.view.bounds.width,
                                 height: 100)
    }

    if #available(iOS 15.0, *) {
      self.displayLink.preferredFrameRateRange = .init(minimum: 60, maximum: 120, preferred: 120)
    } else {
      self.displayLink.preferredFramesPerSecond = 120
    }
    self.displayLink.add(to: .main, forMode: .common)
  }
}

So how can I get an accurate value for the frame that the layer is currently displaying on screen? Here is the video: https://github.com/luckysmg/daily_images/blob/main/RPReplay_Final1661168764.mov?raw=true
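As an alternative sketch (iOS 15 and later, and not the approach used in the demo), pinning the view to UIView.keyboardLayoutGuide lets Auto Layout track the keyboard animation directly, without sampling the presentation layer from a CADisplayLink. The helper name and constraint setup are illustrative.

import UIKit

@available(iOS 15.0, *)
func pinToKeyboard(_ box: UIView, in viewController: UIViewController) {
    let view = viewController.view!
    box.translatesAutoresizingMaskIntoConstraints = false
    NSLayoutConstraint.activate([
        box.leadingAnchor.constraint(equalTo: view.leadingAnchor),
        box.trailingAnchor.constraint(equalTo: view.trailingAnchor),
        box.heightAnchor.constraint(equalToConstant: 100),
        // keyboardLayoutGuide.topAnchor follows the keyboard during its
        // animation, and the bottom of the safe area when it is hidden.
        box.bottomAnchor.constraint(equalTo: view.keyboardLayoutGuide.topAnchor)
    ])
}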
Posted by luckysmg. Last updated.
Post marked as solved
1 Reply
1.3k Views
I have set CADisableMinimumFrameDurationOnPhone to YES, and my device is an iPhone 13 Pro, whose maximum refresh rate is 120 Hz. But the touchesMoved callback fires at 60 Hz (every 16.66 ms) in my UIViewController, when it should be 120 Hz (about 8 ms) on an iPhone 13 Pro. Here is the test code:

import SwiftUI

class HomeViewController: UIViewController, UIScrollViewDelegate {

  override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
    let time = CFAbsoluteTimeGetCurrent()
    print(time - oldTime) // 0.016 s, but it should be 0.008 s on an iPhone 13 Pro
    oldTime = time
  }

  var oldTime: CFAbsoluteTime = 0

  override func viewDidLoad() {
    super.viewDidLoad()
    self.view.backgroundColor = .red
  }
}

So how can I increase this frequency to 120 Hz?
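One workaround sketch (not necessarily an explanation of the 60 Hz delivery): UIKit coalesces intermediate touch samples into each touchesMoved call, and they can be recovered from the event, so high-rate input is still available even when the callback itself fires at 60 Hz.

import UIKit

class TouchSamplingViewController: UIViewController {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first,
              let coalesced = event?.coalescedTouches(for: touch) else { return }
        for sample in coalesced {
            // Each coalesced sample carries its own timestamp and location,
            // so the intermediate positions between callbacks are preserved.
            print(sample.timestamp, sample.location(in: view))
        }
    }
}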
Posted by luckysmg. Last updated.
Post marked as solved
2 Replies
836 Views
I use this method to check whether the device supports the captureTextFromCamera API:

let tf = UITextField()
tf.canPerformAction(#selector(UIResponder.captureTextFromCamera(_:)), withSender: nil)

It returns false on an iPhone X and true on an iPhone 13 Pro. So I want to know: on an iPhone 13 Pro, can this method ever return false, or is the value fixed for each device? Can I assume it will always return true on an iPhone 13 Pro and always return false on an iPhone X?
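Since the result reflects the device's runtime capabilities (and captureTextFromCamera requires iOS 15), it may be safer to query it at the point of use rather than hard-coding a per-device answer. A small illustrative helper, with the function name being an assumption:

import UIKit

// Illustrative helper: ask the responder at runtime instead of assuming
// a fixed answer per device model.
func supportsLiveTextCapture(_ responder: UIResponder) -> Bool {
    guard #available(iOS 15.0, *) else { return false }
    return responder.canPerformAction(#selector(UIResponder.captureTextFromCamera(_:)),
                                      withSender: nil)
}

// Usage sketch: show or hide a scan button for a text field.
// scanButton.isHidden = !supportsLiveTextCapture(textField)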
Posted by luckysmg. Last updated.
Post not yet marked as solved
0 Replies
1k Views
This is the crash log from Firebase:

Fatal Exception: NSInvalidArgumentException
*** -[AVAssetWriter addInput:] Format ID 'lpcm' is not compatible with file type com.apple.m4a-audio

But I can't reproduce the crash... This is the demo code. Does anyone know where the problem is?

let normalOutputSettings: [String: Any] = [
  AVFormatIDKey: kAudioFormatLinearPCM,
  AVSampleRateKey: 44100,
  AVNumberOfChannelsKey: 2,
  AVLinearPCMBitDepthKey: 16,
  AVLinearPCMIsNonInterleaved: false,
  AVLinearPCMIsFloatKey: false,
  AVLinearPCMIsBigEndianKey: false
]
let writerInput = AVAssetWriterInput(mediaType: .audio, outputSettings: normalOutputSettings)
let outputURL = URL(fileURLWithPath: NSTemporaryDirectory() + UUID().uuidString + ".m4a")
self.writer = try! AVAssetWriter(outputURL: outputURL, fileType: .m4a)
writer?.add(writerInput)
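For reference, the exception itself names the conflict: an .m4a container does not accept linear PCM ('lpcm') input. Below is a hedged sketch of settings that are compatible with .m4a, using AAC; the bit rate and use of canAdd(_:) are illustrative choices, and LPCM would instead need a container such as .wav or .caf.

import AVFoundation

let aacSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,   // AAC is accepted by com.apple.m4a-audio
    AVSampleRateKey: 44100,
    AVNumberOfChannelsKey: 2,
    AVEncoderBitRateKey: 128_000           // illustrative bit rate
]

let outputURL = URL(fileURLWithPath: NSTemporaryDirectory() + UUID().uuidString + ".m4a")
let writer = try! AVAssetWriter(outputURL: outputURL, fileType: .m4a)
let input = AVAssetWriterInput(mediaType: .audio, outputSettings: aacSettings)
if writer.canAdd(input) {   // canAdd(_:) avoids the NSInvalidArgumentException path
    writer.add(input)
}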
Posted by luckysmg. Last updated.
Post not yet marked as solved
0 Replies
1k Views
The GarageBand app can import both a MIDI file and a recorded audio file into a single player for playback. My app needs the same feature, but I don't know how to implement it. I have tried AVAudioSequencer, but it can only load and play MIDI files. I have tried AVPlayer and AVPlayerItem, but they don't seem to be able to load a MIDI file. So how can I combine a MIDI file and an audio file into a single AVPlayerItem, or play them together some other way?
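One possible direction, sketched under assumptions rather than a confirmed answer: drive both sources from a single AVAudioEngine, attaching an AVAudioSequencer for the MIDI track (routed to a sampler instrument) and an AVAudioPlayerNode for the audio file, and start them together. The function name, URLs, and the choice of AVAudioUnitSampler are illustrative.

import AVFoundation

func playTogether(midiURL: URL, audioURL: URL) throws {
    let engine = AVAudioEngine()

    // Instrument that will render the MIDI events.
    let sampler = AVAudioUnitSampler()
    engine.attach(sampler)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)

    // Player node for the recorded audio file.
    let player = AVAudioPlayerNode()
    engine.attach(player)
    engine.connect(player, to: engine.mainMixerNode, format: nil)

    // Sequencer bound to the same engine for the MIDI file.
    let sequencer = AVAudioSequencer(audioEngine: engine)
    try sequencer.load(from: midiURL, options: [])
    for track in sequencer.tracks {
        track.destinationAudioUnit = sampler   // route MIDI events to the sampler
    }

    let file = try AVAudioFile(forReading: audioURL)
    player.scheduleFile(file, at: nil, completionHandler: nil)

    try engine.start()
    sequencer.prepareToPlay()
    try sequencer.start()
    player.play()
}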
Posted by luckysmg. Last updated.
Post not yet marked as solved
0 Replies
1.2k Views
I have an AVMutableAudioMix and use an MTAudioProcessingTap to process the audio data. But after I pass the buffer to an AVAudioEngine and render it with renderOffline, the audio has no effects at all... How can I do this? Any ideas? Here is the code for the MTAudioProcessingTapProcessCallback:

var callback = MTAudioProcessingTapCallbacks(
  version: kMTAudioProcessingTapCallbacksVersion_0,
  clientInfo: UnsafeMutableRawPointer(Unmanaged.passUnretained(self.engine).toOpaque()),
  init: tapInit,
  finalize: tapFinalize,
  prepare: tapPrepare,
  unprepare: tapUnprepare) { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in

    guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut) == noErr else {
      preconditionFailure()
    }
    let storage = MTAudioProcessingTapGetStorage(tap)
    let engine = Unmanaged<Engine>.fromOpaque(storage).takeUnretainedValue()
    // render the audio with effect
    engine.render(bufferPtr: bufferListInOut, numberOfFrames: numberFrames)
}

And here is the Engine code:

class Engine {
  let engine = AVAudioEngine()

  let player = AVAudioPlayerNode()
  let pitchEffect = AVAudioUnitTimePitch()
  let reverbEffect = AVAudioUnitReverb()
  let rateEffect = AVAudioUnitVarispeed()
  let volumeEffect = AVAudioUnitEQ()
  let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 2, interleaved: false)!

  init() {
    engine.attach(player)
    engine.attach(pitchEffect)
    engine.attach(reverbEffect)
    engine.attach(rateEffect)
    engine.attach(volumeEffect)

    engine.connect(player, to: pitchEffect, format: format)
    engine.connect(pitchEffect, to: reverbEffect, format: format)
    engine.connect(reverbEffect, to: rateEffect, format: format)
    engine.connect(rateEffect, to: volumeEffect, format: format)
    engine.connect(volumeEffect, to: engine.mainMixerNode, format: format)

    try! engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 4096)

    reverbEffect.loadFactoryPreset(AVAudioUnitReverbPreset.largeRoom2)
    reverbEffect.wetDryMix = 100
    pitchEffect.pitch = 2100

    try! engine.start()
    player.play()
  }

  func render(bufferPtr: UnsafeMutablePointer<AudioBufferList>, numberOfFrames: CMItemCount) {
    let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4096)!
    buffer.frameLength = AVAudioFrameCount(numberOfFrames)
    buffer.mutableAudioBufferList.pointee = bufferPtr.pointee
    self.player.scheduleBuffer(buffer) {
      try! self.engine.renderOffline(AVAudioFrameCount(numberOfFrames), to: buffer)
    }
  }
}
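One thing that stands out in the snippet (only a guess, and the names below are illustrative): render(bufferPtr:numberOfFrames:) renders back into the same buffer it schedules, and only inside the scheduling completion handler, so the processed samples never make it back into bufferListInOut. Below is a sketch of the usual manual-rendering flow, intended as a replacement for that method inside the Engine class above: schedule the input, immediately pull the same number of frames through the offline graph into a separate output buffer, then copy the result back into the tap's buffer list.

func renderThroughEngine(bufferPtr: UnsafeMutablePointer<AudioBufferList>,
                         numberOfFrames: CMItemCount) {
    let frames = AVAudioFrameCount(numberOfFrames)

    // Copy the tap's samples into a buffer the player can consume.
    guard let input = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames) else { return }
    input.frameLength = frames
    let tapBuffers = UnsafeMutableAudioBufferListPointer(bufferPtr)
    let inputBuffers = UnsafeMutableAudioBufferListPointer(input.mutableAudioBufferList)
    for (src, dst) in zip(tapBuffers, inputBuffers) {
        memcpy(dst.mData, src.mData, Int(min(src.mDataByteSize, dst.mDataByteSize)))
    }
    player.scheduleBuffer(input, completionHandler: nil)

    // Render right away (not in the completion handler, which only fires
    // after the buffer has already been consumed).
    guard let output = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames) else { return }
    do {
        guard try engine.renderOffline(frames, to: output) == .success else { return }
    } catch { return }

    // Copy the processed samples back into the tap's AudioBufferList so
    // the effect is audible downstream.
    let outputBuffers = UnsafeMutableAudioBufferListPointer(output.mutableAudioBufferList)
    for (src, dst) in zip(outputBuffers, tapBuffers) {
        memcpy(dst.mData, src.mData, Int(min(src.mDataByteSize, dst.mDataByteSize)))
    }
}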
Posted by luckysmg. Last updated.
Post marked as solved
1 Reply
1.1k Views
I wrote the code below to pass my custom Engine object through MTAudioProcessingTapCallbacks. Here is the code:

func getTap() -> MTAudioProcessingTap? {
  var tap: Unmanaged<MTAudioProcessingTap>?

  func onInit(tap: MTAudioProcessingTap, clientInfo: UnsafeMutableRawPointer?, tapStorageOut: UnsafeMutablePointer<UnsafeMutableRawPointer?>) {
    let engine = Engine()
    tapStorageOut.pointee = Unmanaged<Engine>.passUnretained(engine).toOpaque()
  }

  var callback = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: nil,
    init: onInit,
    finalize: nil,
    prepare: nil,
    unprepare: nil) { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in

      guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut) == noErr else {
        preconditionFailure()
      }

      let storage = MTAudioProcessingTapGetStorage(tap)
      let engine = Unmanaged<Engine>.fromOpaque(storage).takeUnretainedValue()

      // This line crashes:
      // ClientProcessingTapManager (14): EXC_BAD_ACCESS (code=1, address=0x544453e46ea0)
      engine.dealWith(bufferPtr: bufferListInOut)
  }

  guard MTAudioProcessingTapCreate(kCFAllocatorDefault, &callback, kMTAudioProcessingTapCreationFlag_PostEffects, &tap) == noErr else {
    fatalError()
  }
  return tap?.takeRetainedValue()
}

How can I do this?
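The crash is consistent with the Engine being deallocated: passUnretained does not keep the locally created Engine alive once onInit returns. A hedged sketch of the usual tap-storage ownership pattern follows: retain the object when storing it in the tap storage and balance that retain in the finalize callback. The callback names here are illustrative.

// Sketch: keep the Engine alive for the tap's lifetime by transferring
// ownership into the tap storage, and release it in finalize.
func tapInit(tap: MTAudioProcessingTap,
             clientInfo: UnsafeMutableRawPointer?,
             tapStorageOut: UnsafeMutablePointer<UnsafeMutableRawPointer?>) {
    let engine = Engine()
    // passRetained transfers ownership (+1) into the tap storage.
    tapStorageOut.pointee = Unmanaged.passRetained(engine).toOpaque()
}

func tapFinalize(tap: MTAudioProcessingTap) {
    // Balance the retain taken in tapInit when the tap is destroyed.
    Unmanaged<Engine>.fromOpaque(MTAudioProcessingTapGetStorage(tap)).release()
}

With init: tapInit and finalize: tapFinalize passed to MTAudioProcessingTapCallbacks, the process callback can keep using takeUnretainedValue(), since the tap storage now owns the reference.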
Posted by luckysmg. Last updated.