Addenda:
What actually bothers me much more is the general inconsistency of the entire AVAudioEngine implementation, even after all these years, and the many missing but necessary API functions inside the framework.
In the case of the aforementioned MIDI processors:
Why do I get a forced hard crash when I try to connect a music effect unit (which understands MIDI messages perfectly) to a MIDI processor unit? This is illogical and contradicts what the API documentation describes. It should, and indeed must, work that way: music effects expect MIDI messages in order to function at all, otherwise they are useless.
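For illustration, here is a minimal sketch of the kind of graph I mean. The component descriptions are placeholders (the 'demo' subtype and manufacturer codes are hypothetical; substitute whatever MIDI processor and music effect you actually have installed); connectMIDI(_:to:format:block:) is the documented way to route MIDI between attached nodes:

```swift
import AVFoundation

let engine = AVAudioEngine()

// Hypothetical component descriptions — substitute the subtype and
// manufacturer of the MIDI processor / music effect AUs you have installed.
let midiProcessorDesc = AudioComponentDescription(
    componentType: kAudioUnitType_MIDIProcessor,
    componentSubType: 0x64656D6F,       // placeholder 'demo'
    componentManufacturer: 0x64656D6F,  // placeholder 'demo'
    componentFlags: 0, componentFlagsMask: 0)

let musicEffectDesc = AudioComponentDescription(
    componentType: kAudioUnitType_MusicEffect,
    componentSubType: 0x64656D6F,       // placeholder 'demo'
    componentManufacturer: 0x64656D6F,  // placeholder 'demo'
    componentFlags: 0, componentFlagsMask: 0)

AVAudioUnit.instantiate(with: midiProcessorDesc, options: []) { processor, _ in
    guard let processor = processor else { return }
    AVAudioUnit.instantiate(with: musicEffectDesc, options: []) { effect, _ in
        guard let effect = effect else { return }
        engine.attach(processor)
        engine.attach(effect)

        // Audio path: music effect -> main mixer.
        engine.connect(effect, to: engine.mainMixerNode, format: nil)

        // MIDI path: MIDI processor -> music effect. By the documentation
        // this ought to be a legal connection (a music effect consumes
        // MIDI), yet in my tests exactly this hard-crashes.
        engine.connectMIDI(processor, to: effect, format: nil, block: nil)
    }
}
```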
This, together with the entire s**tload of crash reports from this "concert conductor", full of endless, ridiculously annoying, cryptic error messages without any real meaning, drives me crazy.
Why, for instance, is there no function to query the currently connected nodes of an AVAudioNode instance, and similar very basic things required for proper programming?
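The closest thing I know of lives on the engine, not the node: AVAudioEngine exposes outputConnectionPoints(for:outputBus:) and inputConnectionPoint(for:inputBus:). A sketch of wrapping that into the query the node itself arguably should offer (the extension and its method name are my own invention):

```swift
import AVFoundation

extension AVAudioEngine {
    /// All nodes currently receiving output from `node`, across all of its
    /// output buses. Note that the connection information is held by the
    /// engine, so `node` must already be attached for this to return anything.
    func nodesConnected(to node: AVAudioNode) -> [AVAudioNode] {
        var connected: [AVAudioNode] = []
        for bus in 0..<node.numberOfOutputs {
            for point in outputConnectionPoints(for: node, outputBus: bus) {
                if let destination = point.node {
                    connected.append(destination)
                }
            }
        }
        return connected
    }
}

// Usage: with a player attached and connected to the main mixer,
// nodesConnected(to:) reports the mixer as its downstream node.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)
print(engine.nodesConnected(to: player))  // [mainMixerNode]
```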
This entire AVFoundation business is a highly inconsistent, opaque puzzle game that unfortunately does not allow you to code complex, professional-grade audio apps, on any of Apple's platforms, by the way. If you want to do that, you probably have to build on something completely different, from the ground up.
As soon as a project becomes more complex than connecting a simple audio player or sampler with some additional playback effects, AVAudioEngine turns into a completely dysfunctional nightmare of randomly acting audio black boxes led by a conductor "gone crazy", giving the impression that all of this was merely made as a joke.
The opaqueness and the limitations of this framework (like many other Apple frameworks, too) are nearly unacceptable. But an alternative Apple framework for audio apps in Swift is simply not available, none is planned, and AVFoundation is not even marked deprecated.
All of this is basically a waste of precious development time, as I unfortunately had to discover lately …