I've been unable to get AVAudioEngine connect:to:format: to work when using an AVAudioFormat created with initWithCommonFormat: and AVAudioPCMFormatInt16. The call always fails with a kAudioUnitErr_FormatNotSupported error:

ERROR: AVAudioNode.mm:521: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'
What do I need to do to play a sound buffer in AVAudioPCMFormatInt16 format using AVAudioEngine?
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];
AVAudioFormat *format = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatInt16
                                                          sampleRate:22050.0
                                                            channels:(AVAudioChannelCount)1
                                                         interleaved:NO];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:format
                                                          frameCapacity:(AVAudioFrameCount)1024];
buffer.frameLength = buffer.frameCapacity;
memset(buffer.int16ChannelData[0], 0, buffer.frameLength * format.streamDescription->mBytesPerFrame); // zero fill
AVAudioMixerNode *mainMixer = [engine mainMixerNode];
// The following line results in a kAudioUnitErr_FormatNotSupported -10868 error
[engine connect:player to:mainMixer format:buffer.format];
[player scheduleBuffer:buffer completionHandler:nil];
NSError *error = nil;
if (![engine startAndReturnError:&error]) {
    NSLog(@"Failed to start engine: %@", error);
}
[player play];
As background, my app needs to queue audio buffers generated by third-party software and play them sequentially. The buffers play fine individually using AVAudioPlayer. The AVAudioFormat values in the code above come from inspecting the settings property of an AVAudioPlayer while it played one of the generated buffers (a sketch of that step follows). I am new to Core Audio and AVAudioEngine.
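For reference, this is roughly how that inspection looked; generatedAudioData is a placeholder for one of the third-party buffers wrapped in an NSData, and the logged values are where the 22050 Hz / mono / 16-bit parameters above come from:

NSError *probeError = nil;
// AVAudioPlayer exposes the format of its data through the read-only settings dictionary.
AVAudioPlayer *probe = [[AVAudioPlayer alloc] initWithData:generatedAudioData error:&probeError];
if (probe == nil) {
    NSLog(@"Could not create probe player: %@", probeError);
} else {
    // Keys of interest: AVFormatIDKey, AVSampleRateKey, AVNumberOfChannelsKey, AVLinearPCMBitDepthKey.
    NSLog(@"settings: %@", probe.settings);
}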