I have an app that uses AVAudioEngine and connects an AVAudioPlayerNode to the engine's outputNode. The app works and the audio plays fine.
Wanting to do something more complex, I changed the code to use the engine's mainMixerNode instead of its outputNode property. With this one simple change, the audio no longer plays. Instead, the app crashes with:
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'player started when in a disconnected state'
Can someone explain the cause of this error and the right way to use the mainMixerNode? Does it have to be configured or initialized in some way? Do I have to attach it to the engine, or connect it to the outputNode?
The documentation says: "When the property is first accessed the audio engine constructs a singleton main mixer and connects it to the outputNode on demand. You can then connect additional audio nodes to the mixer." The part I find confusing is "on demand". What does it mean to connect it "on demand"? Do I have to make this demand somehow, and if so, how?
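For reference, the setup that is usually expected looks roughly like the sketch below. This is an assumption based on the general AVAudioEngine workflow, not your actual code; the file name `sound.caf` and the variable names are placeholders:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

// A player node must be attached to the engine before it can be connected.
engine.attach(player)

// Placeholder audio file; substitute your own resource.
let url = Bundle.main.url(forResource: "sound", withExtension: "caf")!
let file = try AVAudioFile(forReading: url)

// Merely accessing mainMixerNode here is the "demand": the engine lazily
// creates the mixer and wires it to outputNode at this point. You still
// have to connect your own nodes to the mixer yourself.
engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

// Start the engine before starting the player. Calling player.play() while
// the node is unattached/unconnected or the engine is not running raises
// "player started when in a disconnected state".
try engine.start()
player.scheduleFile(file, at: nil)
player.play()
```

If any of these steps (attach, connect, start the engine) happens after `player.play()`, or not at all, the disconnected-state exception is the typical result.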
If you paste the relevant code, then maybe I can help.
I also get a C++ exception simply by using mainMixerNode (or any other AVAudioMixerNode instance) instead of outputNode, but this seems to be "normal".
Your error message, however, seems to indicate a different problem.