I keep getting kAudioUnitErr_TooManyFramesToProcess : inFramesToProcess=3072, mMaxFramesPerSlice=1120
when playing via Bluetooth headphones.
It works fine when playing through the Mac speakers or HDMI output to a TV.
I am using AVAudioEngine with an AVAudioPlayerNode and scheduling raw audio buffers on it.
Everything works except when Bluetooth headphones are connected: then I get this error as soon as the AVAudioEngine is started, even before any buffers have been scheduled to play.
I have experimented with kAudioUnitProperty_MaximumFramesPerSlice and kAudioDevicePropertyBufferFrameSize.
With certain values I can make the error 'go away', but then the scheduled buffers don't play, I never receive a scheduled-buffer completion callback, and no audio comes through the headphones.
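For reference, this is roughly how I'm reading the device's buffer frame size in Swift. It's a sketch that assumes the default output device; kAudioObjectPropertyElementMain needs macOS 12+ (older SDKs use kAudioObjectPropertyElementMaster):

```swift
import CoreAudio

// Find the default output device via the CoreAudio HAL.
var deviceID = AudioDeviceID(0)
var size = UInt32(MemoryLayout<AudioDeviceID>.size)
var addr = AudioObjectPropertyAddress(
    mSelector: kAudioHardwarePropertyDefaultOutputDevice,
    mScope: kAudioObjectPropertyScopeGlobal,
    mElement: kAudioObjectPropertyElementMain)
var status = AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                        &addr, 0, nil, &size, &deviceID)

// Query kAudioDevicePropertyBufferFrameSize on that device.
var bufferFrames = UInt32(0)
size = UInt32(MemoryLayout<UInt32>.size)
addr.mSelector = kAudioDevicePropertyBufferFrameSize
status = AudioObjectGetPropertyData(deviceID, &addr, 0, nil, &size, &bufferFrames)
if status == noErr {
    print("Device buffer frame size: \(bufferFrames)")
}
```

With the Bluetooth headphones connected, this reports a different (smaller) buffer size than the built-in output, which seems consistent with the mMaxFramesPerSlice=1120 in the error.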
I am stuck on how to proceed and what to try next.
I am using Swift.
Many examples I see suggest setting kAudioUnitProperty_MaximumFramesPerSlice to the value returned by kAudioDevicePropertyBufferFrameSize.
But those examples are in Objective-C, so I am struggling to find a Swift equivalent.
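For what it's worth, this is the closest Swift translation I could come up with for the output node, assuming its underlying AudioUnit is reachable via engine.outputNode.audioUnit (AVAudioIONode exposes it); the 4096 value is just a placeholder:

```swift
import AVFoundation
import AudioToolbox

let engine = AVAudioEngine()

// AVAudioIONode (outputNode / inputNode) exposes the raw AudioUnit.
if let outputUnit = engine.outputNode.audioUnit {
    var maxFPS: UInt32 = 4096  // placeholder; ideally the device buffer size
    let status = AudioUnitSetProperty(outputUnit,
                                      kAudioUnitProperty_MaximumFramesPerSlice,
                                      kAudioUnitScope_Global,
                                      0,
                                      &maxFPS,
                                      UInt32(MemoryLayout<UInt32>.size))
    if status != noErr {
        print("AudioUnitSetProperty failed: \(status)")
    }
}
```

This compiles and runs without an error status, but it only covers the output node.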
Though I can set the maximum frames per slice for the output node, I don't know how to set it for the other nodes in the 'tree'.
For example, I am using the mixer to control volume, and the examples show this:
AudioUnitSetProperty(mMixer, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maxFPS, sizeof(maxFPS));
However, mMixer here (audio_engine.mainMixerNode in my case) is an AVAudioMixerNode, not a raw AudioUnit, so it is not an acceptable parameter to Swift's AudioUnitSetProperty() API.
So I am not sure I am configuring the nodes properly anyway.
The engine has a number of nodes as follows:
AVAudioPlayerNode->AVAudioUnitVarispeed->mainMixerNode->outputNode
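One possible Swift route for the whole chain above (untested against the Bluetooth problem, so treat it as a sketch): every AVAudioNode, including the mixer, exposes an AUAudioUnit wrapper through its auAudioUnit property (macOS 10.13+ / iOS 11+), and that wrapper has a settable maximumFramesToRender. The 4096 value is again an assumption:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let varispeed = AVAudioUnitVarispeed()
engine.attach(player)
engine.attach(varispeed)

let maxFrames: AUAudioFrameCount = 4096  // assumed value

// maximumFramesToRender can only be changed before render resources
// are allocated, i.e. before engine.start().
for node in [player, varispeed, engine.mainMixerNode, engine.outputNode] as [AVAudioNode] {
    node.auAudioUnit.maximumFramesToRender = maxFrames
}
```

This at least gives a way to reach the mixer node, which has no public audioUnit property, but I don't know if it is the right fix here.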
I have removed the varispeed and mixer from the connections to simplify the setup, leaving
AVAudioPlayerNode->outputNode
But still no luck with the Bluetooth headphones; it works fine through the Mac speakers etc.
I'm totally stuck and hope someone can shed some light on what I am doing wrong.
Thanks.