AVAudioEngine connect:to:format: fails with error -10868 when using AVAudioPCMFormatInt16

I've been unable to get AVAudioEngine connect:to:format: to work when using AVAudioFormat initWithCommonFormat:AVAudioPCMFormatInt16. The method always produces a kAudioUnitErr_FormatNotSupported error:

    ERROR: AVAudioNode.mm:521: AUSetFormat: error -10868

    *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'


What do I need to do to play a sound buffer in AVAudioPCMFormatInt16 format using AVAudioEngine?


    AVAudioEngine *engine = [[AVAudioEngine alloc] init];
    AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
    [engine attachNode:player];

    AVAudioFormat *format = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatInt16
                                                             sampleRate:(double)22050.
                                                               channels:(AVAudioChannelCount)1
                                                            interleaved:NO];
    AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:format
                                                             frameCapacity:(AVAudioFrameCount)1024];
    buffer.frameLength = (AVAudioFrameCount)buffer.frameCapacity;
    memset(buffer.int16ChannelData[0], 0, buffer.frameLength * format.streamDescription->mBytesPerFrame); // zero fill
    AVAudioMixerNode *mainMixer = [engine mainMixerNode];

    // The following line results in a kAudioUnitErr_FormatNotSupported -10868 error
    [engine connect:player to:mainMixer format:buffer.format];

    [player scheduleBuffer:buffer completionHandler:nil];
    NSError *error;
    [engine startAndReturnError:&error];
    [player play];


As background, my app needs to queue audio buffers generated by third-party software so that they play sequentially. The audio buffers play fine (individually) using AVAudioPlayer. The AVAudioFormat settings in the code above come from inspecting the AVAudioPlayer settings property while playing one of the generated buffers. I am new to Core Audio and AVAudioEngine.

Replies

I've used the same sample code as posted above and I'm getting exactly the same issue: I can't connect the `AVAudioPlayerNode` to the audio engine with the `AVAudioPCMFormatInt16` format, and sadly there is no clue why. The documentation doesn't mention that it isn't possible...


Me too. I'm new to Core Audio, so I might be skipping an important step here.


Maybe you've found the solution since then, @risingtide?


PS: As soon as I've found it, I'll post the solution here.

The AVAudioPCMFormatInt16 format (pcmFormatInt16 in Swift) seems to be unsupported.

I succeeded in Swift by converting the input to the pcmFormatFloat32 format, i.e. dividing each Int16 sample by 32768 to obtain a float between -1.0 and 1.0.
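
A minimal sketch of that conversion, assuming a mono, non-interleaved Int16 source buffer (the helper name makeFloatBuffer is mine, not an API):

    import AVFoundation

    // Sketch: copy a mono Int16 buffer into a Float32 buffer by scaling each
    // sample into the -1.0...1.0 range. Assumes non-interleaved data and the
    // same sample rate on both sides.
    func makeFloatBuffer(from int16Buffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
        guard let int16Data = int16Buffer.int16ChannelData,
              let floatFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                              sampleRate: int16Buffer.format.sampleRate,
                                              channels: 1,
                                              interleaved: false),
              let floatBuffer = AVAudioPCMBuffer(pcmFormat: floatFormat,
                                                 frameCapacity: int16Buffer.frameLength),
              let floatData = floatBuffer.floatChannelData else {
            return nil
        }
        floatBuffer.frameLength = int16Buffer.frameLength
        for frame in 0..<Int(int16Buffer.frameLength) {
            // Int16 ranges from -32768 to 32767, so dividing by 32768 stays within -1.0..<1.0
            floatData[0][frame] = Float(int16Data[0][frame]) / 32768.0
        }
        return floatBuffer
    }

The resulting Float32 buffer can then be scheduled on an AVAudioPlayerNode that was connected using the float format.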

I've run into the same issue. Can you show the method in detail?

You can use the signed 16-bit audio format, but you have to convert it first.

    // Set up your own input format
    let inputFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                    sampleRate: 44100,
                                    channels: AVAudioChannelCount(2),
                                    interleaved: true)!
    let engine = AVAudioEngine()
    // Use the system format as the output format
    let outputFormat = engine.mainMixerNode.outputFormat(forBus: 0)
    self.converter = AVAudioConverter(from: inputFormat, to: outputFormat)!
    self.playerNode = AVAudioPlayerNode()
    engine.attach(playerNode)
    engine.connect(playerNode, to: engine.mainMixerNode, format: nil)
    ...
    // Prepare the input and output buffers
    let inputBuffer = AVAudioPCMBuffer(pcmFormat: inputFormat, frameCapacity: maxSamplesPerBuffer)!
    let outputBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: maxSamplesPerBuffer)!
    // Once you've filled the Int16 input buffer with data, hand it to the converter
    self.converter.convert(to: outputBuffer, error: nil) { inNumPackets, outStatus in
        outStatus.pointee = .haveData
        return inputBuffer
    }
    // outputBuffer now holds the sound in the system format, and we can play it
    self.playerNode.scheduleBuffer(outputBuffer)
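
One caveat, not from the original reply: the input block above always reports .haveData, so if the converter asks for more input (for example when the mixer's output runs at a different sample rate) it will be handed the same inputBuffer again. A rough sketch of a per-buffer wrapper that feeds each buffer to the converter exactly once (the helper name convertOnce and its parameters are mine):

    import AVFoundation

    // Sketch: convert a single Int16 buffer to the output format, handing it
    // to the converter exactly once and then reporting .noDataNow.
    func convertOnce(_ inputBuffer: AVAudioPCMBuffer,
                     using converter: AVAudioConverter,
                     to outputFormat: AVAudioFormat) -> AVAudioPCMBuffer? {
        // Size the output to allow for a possible sample-rate change
        let ratio = outputFormat.sampleRate / inputBuffer.format.sampleRate
        let capacity = AVAudioFrameCount(Double(inputBuffer.frameLength) * ratio) + 1
        guard let outputBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat,
                                                  frameCapacity: capacity) else {
            return nil
        }
        var consumed = false
        var error: NSError?
        let status = converter.convert(to: outputBuffer, error: &error) { _, outStatus in
            if consumed {
                outStatus.pointee = .noDataNow
                return nil
            }
            consumed = true
            outStatus.pointee = .haveData
            return inputBuffer
        }
        return (status == .error || error != nil) ? nil : outputBuffer
    }

Each converted buffer can then be scheduled on the player node exactly as in the last line above.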