AVAudioEngine and Multiroute

Some developers have asked how to set up multiroute channel mapping with AVAudioEngine when using multichannel output hardware.


Currently, to get multirouting working with AVAudioEngine, the application needs to create and set a custom channel map on the outputNode's Audio Unit via the C API.
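
Concretely, this comes down to a single AudioUnitSetProperty() call on the outputNode's underlying audio unit (shown in full context in the sample below):

// outputChannelMap holds one SInt32 per hardware output channel: a source
// channel index, or -1 to leave that hardware channel unmapped.
OSStatus err = AudioUnitSetProperty(_output.audioUnit, kAudioOutputUnitProperty_ChannelMap, kAudioUnitScope_Global, 0, outputChannelMap, (UInt32)sizeof(outputChannelMap));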


We've provided some sample code below demonstrating how an application can accomplish this task. Doing so allows you to route specific channels to various destinations.

For more information, see WWDC 2012 Session 505, "Audio Session and Multiroute Audio in iOS".


For example, a standard channel map would look like this:

AURemoteIO Client Format 2 Channels: 0, 1
                                     |  |
Output Audio Unit Channel Map      : 0, 1, -1, -1
                                     |  |
AURemoteIO Output Format 4 Channels: 0, 1,  2,  3
                                     L  R HDMI1 HDMI2


A custom channel map routing audio to HDMI 1 & HDMI 2 would look like this:

AURemoteIO Client Format 2 Channels: 0, 1
                                     |  |_______
                                     |_______   |
                                             |  |
Output Audio Unit Channel Map      : -1, -1, 0, 1
                                             |  |
AURemoteIO Output Format 4 Channels: 0,  1,  2, 3
                                     L   R HDMI1 HDMI2
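
In code, the custom map above is just the SInt32 array passed to kAudioOutputUnitProperty_ChannelMap: one entry per hardware output channel, where each entry holds the client-format source channel to pull from, or -1 to leave that hardware channel silent:

SInt32 outputChannelMap[4] = { -1, -1, 0, 1 }; // L and R silent; HDMI 1 gets source 0, HDMI 2 gets source 1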


Sample custom channel map code:


#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h> // for AudioUnitSetProperty()

//  configureAVAudioEngine: assumes the inChannelNames array contains user-
//  specified channel names used by getOutputChannelMapIndices() to create
//  the channel map indices.
//
// Example:
//
// NSArray *inChannelNames = [NSArray arrayWithObjects:
//                              @"Headphones Left",
//                              @"Headphones Right",
//                              @"HDMI 2",
//                              nil];

NSMutableArray* getOutputChannelMapIndices(NSArray *inChannelNames)
{
    AVAudioSession *session = [AVAudioSession sharedInstance];
    AVAudioSessionRouteDescription *route = [session currentRoute];
    NSArray *outputPorts = [route outputs];

    NSMutableArray *channelMapIndices = [NSMutableArray array];
    int chIndex = 0;

    // For each requested channel name, scan every channel of every output
    // port in the current route; chIndex runs across the whole route, so a
    // match's index corresponds to a hardware output channel number.
    for (NSString *inChName in inChannelNames) {
        chIndex = 0;
        for (AVAudioSessionPortDescription *outputPort in outputPorts) {
            for (AVAudioSessionChannelDescription *channel in outputPort.channels) {
                if ([channel.channelName isEqualToString:inChName]) {
                    if ([inChannelNames count] > [channelMapIndices count]) {
                        [channelMapIndices addObject:[NSNumber numberWithInt:chIndex]];
                    }
                }
                chIndex++;
            }
        }
    }

    return channelMapIndices;
}
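
// Example: with the inChannelNames from the comment above and a
// (hypothetical) route whose channels are reported in the order
// "Headphones Left", "Headphones Right", "HDMI 1", "HDMI 2",
// this function returns @[@0, @1, @3].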

- (void)configureAVAudioEngine:(NSArray *)inChannelNames
{
    // ------------------------------------------------------------------
    // AVAudioSession setup
    // ------------------------------------------------------------------

    AVAudioSession *sessionInstance = [AVAudioSession sharedInstance];
    NSError *error = nil;

    // set the session category
    BOOL success = [sessionInstance setCategory:AVAudioSessionCategoryMultiRoute withOptions:AVAudioSessionCategoryOptionMixWithOthers error:&error];
    NSAssert(success, @"Error setting AVAudioSession category! %@", [error localizedDescription]);

    // activate the audio session
    success = [sessionInstance setActive:YES error:&error];
    NSAssert(success, @"Error setting AVAudioSession active! %@", [error localizedDescription]);

    // Get channel map indices based on user specified channelNames
    NSMutableArray *channelMapIndices = getOutputChannelMapIndices(inChannelNames);
    NSAssert(channelMapIndices && channelMapIndices.count > 0, @"Error getting indices for user specified channel names!");

    // ------------------------------------------------------------------
    // AVAudioEngine setup
    // ------------------------------------------------------------------

    _engine = [[AVAudioEngine alloc] init];
    _output = _engine.outputNode;
    _mixer = _engine.mainMixerNode;

    _player = [[AVAudioPlayerNode alloc] init];
    [_engine attachNode:_player];

    // open the file to play
    AVAudioFile *file = [self getFileToPlay:@"lpcm_multichannel" ofType:@"caf"]; // multichannel audio file

    // create output channel map
    UInt32 outputNumChannels = [_output outputFormatForBus:0].channelCount;
    NSAssert(outputNumChannels > 0 && outputNumChannels <= 512, @"Error: invalid number of output channels!"); // reasonable bounds

    SInt32 outputChannelMap[outputNumChannels];
    memset(outputChannelMap, -1, sizeof(outputChannelMap)); // fill every byte with 0xFF, leaving each SInt32 entry -1 (unmapped)

    SInt32 sourceNumChannels = (SInt32)file.processingFormat.channelCount;
    SInt32 sourceChIndex = 0;
    for (NSNumber *chIndex in channelMapIndices) {
        int chIndexVal = [chIndex intValue];

        if (chIndexVal < outputNumChannels && sourceChIndex < sourceNumChannels) {
            outputChannelMap[chIndexVal] = sourceChIndex;
            sourceChIndex++;
        }
    }

    // set channel map on outputNode AU
    UInt32 propSize = (UInt32)sizeof(outputChannelMap);
    OSStatus err = AudioUnitSetProperty(_output.audioUnit, kAudioOutputUnitProperty_ChannelMap, kAudioUnitScope_Global, 0, outputChannelMap, propSize);
    NSAssert(noErr == err, @"Error setting channel map! %d", (int)err);

    // make connections
    AVAudioChannelLayout *channelLayout = [[AVAudioChannelLayout alloc]
                                            initWithLayoutTag:kAudioChannelLayoutTag_DiscreteInOrder | (UInt32)sourceNumChannels];
    AVAudioFormat *format = [[AVAudioFormat alloc]
                              initWithStreamDescription:file.processingFormat.streamDescription
                              channelLayout:channelLayout];

    [_engine connect:_player to:_mixer format:format];
    [_engine connect:_mixer to:_output format:format];

    // schedule the file on player
    [_player scheduleFile:file atTime:nil completionHandler:nil];

    // start engine and player
    success = [_engine startAndReturnError:&error];
    NSAssert(success, @"Error starting engine! %@", [error localizedDescription]);

    [_player play];
}
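
For completeness, the getFileToPlay:ofType: helper isn't part of the sample; a minimal sketch, assuming the file ships in the app bundle, might look like this:

- (AVAudioFile *)getFileToPlay:(NSString *)name ofType:(NSString *)ext
{
    // Hypothetical helper assumed by the sample above: open a bundled
    // audio file for reading.
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:ext];
    NSAssert(url, @"Audio file %@.%@ not found in bundle!", name, ext);

    NSError *error = nil;
    AVAudioFile *file = [[AVAudioFile alloc] initForReading:url error:&error];
    NSAssert(file, @"Error opening audio file! %@", [error localizedDescription]);

    return file;
}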

That is exactly what I was looking for for days! Thank you so much!!

I tried the example above, but I couldn't get multiroute working as I expected. In my case, what I need to achieve is to be able to either:

  1. Play a multichannel audio file with 4 channels, where 2 channels should be played through the headphones while the other two channels are forwarded to the HDMI/USB output, or
  2. Play two stereo audio files, where one file should be played through the headphones while the other is played through the HDMI/USB output.


I tried to implement both cases, but it always ends up playing through either the headphones or HDMI/USB, never both simultaneously. I made sure to check the audio session category before starting playback, and it is always AVAudioSessionCategoryMultiRoute. In addition, the channel mapping is done correctly, and all the output channels are active:

Output Channel Map: [0, 1, 2, 3]


I followed WWDC 2012 Session 505, and the demo they showed is almost exactly what I want to achieve, but unfortunately the sample code for that demo was never published!


My question is: is it possible to achieve either of the scenarios I mentioned above using the AVAudioEngine APIs?


Note:

- I implemented the above example in Swift 1.2 and made sure it works for a simple playback scenario.

- I used SDK version 8.4 + Xcode 6.4

- Tested with an iPad Air 2.

For anyone who runs into the same situation: the code example by @theanalogkid above WORKS! It didn't for me at first, for two reasons:


1- I was trying to play an AAC-encoded audio file (file format "m4a"), which made the code raise an error without crashing the app, so I overlooked it among the debug output (also, the error "AVAudioFile.mm:86: AVAudioFileImpl: error 1718449215" is quite cryptic, and it took me a few minutes of searching the web to find out that it means kAudioFormatUnsupportedDataFormatError).
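
(For reference, these OSStatus values are four-character codes; a little helper along these lines, not from the original sample, will decode them:)

static NSString *fourCCString(OSStatus status)
{
    // Reinterpret the 32-bit status as four ASCII characters, MSB first;
    // fourCCString(1718449215) gives @"fmt?", i.e.
    // kAudioFormatUnsupportedDataFormatError.
    UInt32 code = CFSwapInt32HostToBig((UInt32)status);
    return [[NSString alloc] initWithBytes:&code length:4 encoding:NSASCIIStringEncoding];
}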


2- The audio file was sampled at 48 kHz, but the HDMI hardware I was using does not support that sample rate, so I couldn't get anything out through the HDMI route. My HDMI hardware is a combination of Apple's Lightning-to-HDMI adapter and an HDMI port switcher with an audio output, but I didn't have time to investigate which of them doesn't support a 48 kHz sample rate.


So, when I play a raw audio file ("caf") sampled at 44.1 kHz, the code above works as expected 🙂
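
A quick way to catch this kind of mismatch up front is to compare the file's sample rate with the session's hardware rate before starting the engine; a sketch (assuming the AVAudioFile is already open as file):

AVAudioSession *session = [AVAudioSession sharedInstance];
if (file.processingFormat.sampleRate != session.sampleRate) {
    // Ask the hardware to switch to the file's rate. This is only a
    // preference and may be ignored by hardware that can't support it.
    NSError *error = nil;
    if (![session setPreferredSampleRate:file.processingFormat.sampleRate error:&error]) {
        NSLog(@"Could not set preferred sample rate: %@", error);
    }
}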


Thank you @theanalogkid for the code, and I'd appreciate any ideas on how I can make it work with an AAC-encoded audio file 😉

Hi, I used the code, but it doesn't work. Could you help me? Thank you very much!

I am struggling to get this to work. Instead of the outputNode, I want to send each channel to an individual AVAudioMixerNode. The type of audio unit returned from myMixer is an AUAudioUnit instead of an AudioUnit. Is there another way to get around this?


I also tried setting auAudioUnit.channelMap to [-1, 1], but audio still came out of both speakers, as if the map were [0, 1].
