I'm trying to dynamically modify a graph of nodes attached to an AVAudioEngine. The graph correctly outputs sound, but outputs silence as soon as I call connect(_:to:format:). In the sample below, I'd like to dynamically connect player to engine.mainMixerNode, but whenever I call toggleBypass I get silence. Is it possible to carry out this rewiring without pausing playback of the AVAudioPlayerNode?

import AVFoundation
import UIKit

class Sample: UIViewController {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let effectNode = AVAudioUnitDelay()

    @objc func toggleBypass() {
        if effectNode.numberOfInputs == 0 {
            engine.connect(player, to: effectNode, format: file.processingFormat)
            engine.connect(effectNode, to: engine.mainMixerNode, format: file.processingFormat)
        } else {
            engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .red
        view.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(toggleBypass)))

        try! AVAudioSession.sharedInstance().setCategory(.playback)
        try! AVAudioSession.sharedInstance().setPreferredIOBufferDuration(0.005)
        try! AVAudioSession.sharedInstance().setActive(true, options: [])

        do {
            engine.attach(player)
            engine.attach(effectNode)
            engine.connect(player, to: effectNode, format: file.processingFormat)
            engine.connect(effectNode, to: engine.mainMixerNode, format: file.processingFormat)
            player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
            engine.prepare()
            try engine.start()
            player.play()
        } catch {
            assertionFailure(String(describing: error))
        }
    }

    lazy var file: AVAudioFile = {
        let fileURL = Bundle.main.url(forResource: "filename", withExtension: "mp3")!
        return try! AVAudioFile(forReading: fileURL)
    }()

    lazy var buffer: AVAudioPCMBuffer = {
        let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat, frameCapacity: UInt32(file.length))!
        try! file.read(into: buffer)
        return buffer
    }()
}
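One pattern that is sometimes suggested for this kind of rewiring (a sketch only, not verified on every OS version) is to explicitly tear down the old connections before making new ones, pausing the engine around the change. Note this does briefly interrupt rendering, so it only partially answers the "without pausing" requirement; it assumes the same engine/player/effectNode/file properties as the sample above:

```swift
// Sketch: disconnect explicitly before reconnecting, with the engine paused.
// Whether the pause is strictly required may depend on the OS version.
@objc func toggleBypass() {
    let bypassed = effectNode.numberOfInputs == 0
    engine.pause()                          // stop pulling audio while rewiring
    engine.disconnectNodeOutput(player)     // drop player -> (effect | mixer)
    engine.disconnectNodeOutput(effectNode) // drop effect -> mixer, if present
    if bypassed {
        engine.connect(player, to: effectNode, format: file.processingFormat)
        engine.connect(effectNode, to: engine.mainMixerNode, format: file.processingFormat)
    } else {
        engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
    }
    try? engine.start()                     // resume rendering
}
```

Leaving the now-unused effectNode attached but fully disconnected (rather than connected to the mixer with zero inputs) is the main difference from the original code.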
I am running into a runtime crash while trying to process microphone audio input using AVAudioEngine. Here's the code to reproduce:
import AVFoundation

let engine = AVAudioEngine()

let bufferSize: AVAudioFrameCount = 2016
let mySampleRate: Double = 8000
let audioFormat = AVAudioFormat(standardFormatWithSampleRate: mySampleRate, channels: 1)

try! AVAudioSession.sharedInstance().setPreferredSampleRate(mySampleRate)
assert(AVAudioSession.sharedInstance().sampleRate == mySampleRate)

engine.inputNode.installTap(onBus: 0, bufferSize: bufferSize, format: audioFormat) { _, _ in
}
try! engine.start()
This crashes at runtime with this error:
Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: format.sampleRate == hwFormat.sampleRate'
AVAudioSession.sharedInstance().sampleRate is 8000 as per my assert, so that's also what the hardware sample rate should be.
It seems like AVAudioSession.sharedInstance().setPreferredSampleRate(mySampleRate) has no effect.
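A commonly cited workaround (a sketch, under the assumption that the hardware cannot actually be driven at 8 kHz) is to install the tap in the input node's native hardware format and downsample yourself with AVAudioConverter:

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode

// The tap format must match the hardware sample rate, which is why
// forcing 8 kHz on the tap crashes. Tap in the hardware format instead.
let hwFormat = input.inputFormat(forBus: 0)
let targetFormat = AVAudioFormat(standardFormatWithSampleRate: 8000, channels: 1)!
let converter = AVAudioConverter(from: hwFormat, to: targetFormat)!

input.installTap(onBus: 0, bufferSize: 2016, format: hwFormat) { buffer, _ in
    // Size the output buffer for the rate change (e.g. 48000 -> 8000).
    let ratio = targetFormat.sampleRate / hwFormat.sampleRate
    let capacity = AVAudioFrameCount(Double(buffer.frameLength) * ratio)
    guard let converted = AVAudioPCMBuffer(pcmFormat: targetFormat,
                                           frameCapacity: capacity) else { return }
    var error: NSError?
    converter.convert(to: converted, error: &error) { _, outStatus in
        outStatus.pointee = .haveData
        return buffer
    }
    // `converted` now holds 8 kHz mono samples for further processing.
}
try! engine.start()
```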
According to Xcode's debug gauge, my app is using about 23% CPU. This seems consistent with its CPU usage as shown in Activity Monitor's process list.
Now if you look at the bottom section of the Activity Monitor screenshot, you'll see it's indicating about 5% User CPU usage, i.e. "The percentage of CPU capability that’s being used by apps you opened, or by the processes opened by those apps."
This looks incoherent. If the app is taking up 23% CPU, why is User CPU usage 5%?
Hello,
My app runs fine on Monterey, but on Ventura we get a crash upon launch with the following Termination Reason:
Termination Reason: Namespace DYLD, Code 1 Library missing
Library not loaded: @rpath/libmylibrary-sdk.dylib
Referenced from: <B2C025A7-2A52-3D29-93CF-EB0AE93077AA> /Applications/MyApp.app/Contents/MacOS/MyApp
Reason: tried: '/usr/lib/swift/libmylibrary-sdk.dylib' (no such file, not in dyld cache), '/System/Volumes/Preboot/Cryptexes/OS/usr/lib/swift/libmylibrary-sdk.dylib' (no such file), '/usr/lib/swift/libmylibrary-sdk.dylib' (no such file, not in dyld cache), '/System/Volumes/Preboot/Cryptexes/OS/usr/lib/swift/libmylibrary-sdk.dylib' (no such file), '/System/Volumes/Preboot/Cryptexes/OS@rpath/libmylibrary-sdk.dylib' (no such file), '/usr/lib/swift/libmylibrary-sdk.dylib' (no such file, not in dyld cache), '/System/Volumes/Preboot/Cryptexes/OS/usr/lib/swift/libmylibrary-sdk.dylib' (no such file), '/usr/lib/swift/libmylibrary-sdk.dylib' (no such file, not in dyld cache), '/System/Volumes/Preboot/Cryptexes/OS/usr/lib/swift/libmylibrary-sdk.dylib' (no such file), '/usr/lib/libmylibrary-sdk.dylib' (no such file, not in dyld cache)
(terminated at launch; ignore backtrace)
My app depends on libmylibrary-sdk.dylib, which I've included in the target using SPM. Again, this setup produces an app that runs fine on Monterey.
This is very odd, because dyld appears to be looking for libmylibrary-sdk.dylib in /usr/lib/swift/ and /usr/lib/?! If I look at the app bundle, the dylib is indeed in its Frameworks directory. I've also added @executable_path/../Frameworks to my Runpath Search Paths, but that didn't help.
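If it helps to narrow this down, the following macOS diagnostic commands (paths taken from the crash report; adjust for the real bundle layout) show which rpaths the binary actually records and which install name the dylib was linked with. The /usr/lib/... candidates in the error suggest the app is referencing the dylib by an absolute install name rather than @rpath, which is a common cause of this class of dyld failure:

```
# Inspect the LC_RPATH entries embedded in the app binary
otool -l /Applications/MyApp.app/Contents/MacOS/MyApp | grep -A2 LC_RPATH

# Check how the app references the dylib (ideally @rpath/libmylibrary-sdk.dylib)
otool -L /Applications/MyApp.app/Contents/MacOS/MyApp | grep mylibrary

# Check the dylib's own install name inside the bundle
otool -D /Applications/MyApp.app/Contents/Frameworks/libmylibrary-sdk.dylib

# If the install name is wrong, it can be rewritten (then re-sign the dylib):
install_name_tool -id @rpath/libmylibrary-sdk.dylib \
    /Applications/MyApp.app/Contents/Frameworks/libmylibrary-sdk.dylib
codesign --force --sign - /Applications/MyApp.app/Contents/Frameworks/libmylibrary-sdk.dylib
```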