It's just ridiculous that this issue still exists.
No one has actually solved this. It still exists.
This issue still exists.
This is what Apple says:
"It is an internal thing, we cannot do anything about that."
UIWindow + NSWindow = UINSWindow
If you’ve done some iOS development, you’re familiar with UIWindow. On iOS, windows generally do not correspond to anything the user sees as a window; they are used to separate different view hierarchies. NSWindow is a similar concept that has existed on the Mac since the beginning. Windows are native to the Mac and are a much more natural abstraction there than on iOS.
To bridge UIWindow to NSWindow under Mac Catalyst and let you present your scenes as familiar Mac windows, Apple created a separate class, called UINSWindow, that is used for Catalyst only. This is a private class, so you cannot use it directly; even if you try, your app will get rejected during App Store review.
A UINSWindow instance conveniently inherits from NSWindow, so it supports all the functionality that a standard AppKit window supports. This suggests that it is possible to mix the UIKit and AppKit layers in one view hierarchy.
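If you want to see that inheritance for yourself, a debug-only sketch like the following walks the Objective-C superclass chain of a class by name. It only reads class names and calls no private methods; the exact chain printed is an assumption and may include intermediate classes.

```swift
import Foundation
import ObjectiveC

/// Debug-only: print the superclass chain of a class looked up by name.
/// Under Mac Catalyst, passing "UINSWindow" typically prints something like
/// UINSWindow -> NSWindow -> NSResponder -> NSObject.
func dumpClassHierarchy(named className: String) {
    guard var cls: AnyClass = NSClassFromString(className) else {
        print("\(className) is not present in this process")
        return
    }
    var chain = [NSStringFromClass(cls)]
    while let superclass = class_getSuperclass(cls) {
        chain.append(NSStringFromClass(superclass))
        cls = superclass
    }
    print(chain.joined(separator: " -> "))
}

// dumpClassHierarchy(named: "UINSWindow")
```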
Is this annoying problem solved yet, Apple???
Any AudioUnit version 3 with a user interface in Swift will endlessly produce this kind of non-descriptive logging junk.
Addenda:
What actually bothers me much more is the general inconsistency of the entire AVAudioEngine implementation after all these years, and the many missing but necessary API functions in the framework.
In the case of the aforementioned MIDI processors:
Why do I get a forced hard crash if I try to connect a music effect unit (which understands MIDI messages perfectly) with a MIDI processor unit???
This is illogical and is also described very differently in the API documentation. It should and must work that way, because music effects expect MIDI messages in order to work; otherwise they are useless.
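To make that concrete, here is a minimal sketch of the wiring being described. The helper name is just for illustration; midiProcessor ('aumi') and musicEffect ('aumf') stand for already instantiated and attached units, and according to the report above the connectMIDI call is where the hard crash happens.

```swift
import AVFoundation

// Sketch only: both units are assumed to be already instantiated and attached.
func wireProcessorToMusicEffect(engine: AVAudioEngine,
                                midiProcessor: AVAudioUnit,
                                musicEffect: AVAudioUnit) {
    // The music effect sits in the audio graph ...
    engine.connect(musicEffect, to: engine.mainMixerNode, format: nil)
    // ... and should receive MIDI from the processor. Per the report above,
    // this connectMIDI call currently hard-crashes, although music effects
    // do consume MIDI.
    engine.connectMIDI(midiProcessor, to: musicEffect, format: nil, block: nil)
}
```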
This, and the entire s**tload of crash reports from this "concert conducting", with endless, ridiculously annoying, cryptic error messages without any real meaning, drives me crazy.
Why, for instance, is there no function to query the currently connected nodes of an AVAudioNode instance, and similar very basic things that are required for proper programming?
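The closest thing currently available is engine-level rather than node-level: AVAudioEngine can report the connection points for a given node and bus. A minimal sketch of such a query, assuming an already built graph:

```swift
import AVFoundation

// Inspect what a node is wired to, via the engine
// (AVAudioNode itself exposes no such query).
func dumpConnections(of node: AVAudioNode, in engine: AVAudioEngine) {
    for bus in 0..<node.numberOfOutputs {
        for point in engine.outputConnectionPoints(for: node, outputBus: bus) {
            print("output bus \(bus) -> \(String(describing: point.node)) bus \(point.bus)")
        }
    }
    for bus in 0..<node.numberOfInputs {
        if let point = engine.inputConnectionPoint(for: node, inputBus: bus) {
            print("input bus \(bus) <- \(String(describing: point.node)) bus \(point.bus)")
        }
    }
}
```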
This whole AVFoundation thing is a highly inconsistent, opaque puzzle game that unfortunately does not allow you to code complex, professional-grade audio apps, on any of Apple's platforms, by the way. If you want to do so, you probably have to build on something completely different, from the ground up.
As soon as a project becomes more complex than just connecting a simple audio player or sampler with some additional effects for playback, it turns into a completely dysfunctional nightmare of randomly behaving audio black boxes with a conductor "gone crazy", giving the impression that all of this was merely made as a joke.
The opaqueness and the limitations of this framework (like many other Apple frameworks too) are nearly unacceptable. But an alternative Apple framework for audio apps in Swift is simply not available, not planned, and this one is not marked deprecated either.
All of this is basically a waste of precious development time, as I unfortunately had to discover lately …
.
A MIDI processor node in AVAudioEngine needs an active outgoing MIDI connection, rather than merely the usual audio connection as with other generator or effect nodes. So this may be the issue for many developers here. This also seems logical: it does not make any sense to call the render block of a MIDI processor if the unit is not actively connected to something via MIDI, as it merely generates MIDI events. If it isn't connected this way, it does not need to be called in any render cycle. ^^
So connecting it with connect(...) alone probably does not work; it must be connected with connectMIDI(...) to something that is actually part of an active audio signal flow, as in the sketch below.
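A minimal sketch of that setup, assuming the component description points at an installed AUv3 MIDI processor (the subtype and manufacturer below are placeholders) and with error handling trimmed:

```swift
import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()   // the MIDI destination that actually renders audio

let processorDescription = AudioComponentDescription(
    componentType: kAudioUnitType_MIDIProcessor,  // 'aumi'
    componentSubType: 0,                          // placeholder: your unit's subtype
    componentManufacturer: 0,                     // placeholder: your unit's manufacturer
    componentFlags: 0,
    componentFlagsMask: 0
)

AVAudioUnit.instantiate(with: processorDescription, options: []) { unit, error in
    guard let midiProcessor = unit else {
        print("Could not instantiate MIDI processor: \(String(describing: error))")
        return
    }

    // The processor only needs to be attached; no connect() for its output bus.
    engine.attach(midiProcessor)
    engine.attach(sampler)

    // Only the destination has to sit in the audio graph.
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)

    // This MIDI connection is what gets the processor pulled in the render cycle.
    engine.connectMIDI(midiProcessor, to: sampler, format: nil, block: nil)

    do { try engine.start() } catch { print("Engine start failed: \(error)") }
}
```

Note that the processor's output bus is never passed to connect(); attaching it plus the connectMIDI(...) call is all the graph needs, which matches the observation below.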
And more:
I tested (or rather discovered by accident) that a MIDI processor node does not even need any real audio connection (via connect()); it merely has to be attached to the engine. So in fact, the output bus of that unit does not need to be connected to the engine at all!
But as soon as you make a MIDI connection to an active receiver unit that is currently rendering, the processor starts getting called as part of that render cycle.
I assume this is just the way it is intended to work …
And I also assume that it is internally attached to the MIDI destination node's render cycle and busses (i.e. for sharing the correct sample rate, which is required for correct MIDI event processing).
By the way, this addition to AVAudioEngine with multiple connection points does not make much sense either. In practice you won't do it that way; you want multiple MIDI destinations only together with sensible filtering and routing mechanisms. Therefore this multi-connection feature is basically half-baked nonsense. However, it does make sense if single connections are required.
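For reference, the multi-destination call being criticized looks like this. This is a sketch reusing the engine, midiProcessor, and sampler names from the earlier sketch; secondSampler is a hypothetical additional destination.

```swift
// Hypothetical second destination, just to show the fan-out call.
let secondSampler = AVAudioUnitSampler()
engine.attach(secondSampler)
engine.connect(secondSampler, to: engine.mainMixerNode, format: nil)

// One MIDI source fanned out to several destinations in a single call.
engine.connectMIDI(midiProcessor, to: [sampler, secondSampler], format: nil, block: nil)
```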
.