iOS 10 and AudioSession MultiRoute category

Hello,

I'm having trouble using the MultiRoute category on iOS 10, while it was working well on iOS 9. I couldn't pin down exactly what the problem is, so I'll list the symptoms. (For my tests I used an iPad Air 1 and an iPad Pro, both on iOS 10. None of the issues described below occur on iOS 9.)


- Plugging and unplugging headphones and/or a USB sound card on my iPad MAY (it's actually quite random) switch the audio session category to AVAudioSessionCategorySoloAmbient (while I was in MultiRoute).


- If I'm not yet in AVAudioSessionCategorySoloAmbient: when I plug both a USB sound card AND headphones into my iPad, the outputNumberOfChannels property does not count all the available channels (my guess is that it doesn't count the headphone channels), but maximumOutputNumberOfChannels is the actual number of channels (headphones + USB sound card). Thus I can't set the channel map correctly, and I can't play any sound on the headphones. However, the current route information shows both output ports (USB and headphones) with the right number of channels.


- Only on the iPad Pro: I can't get any sound from the iPad's built-in speakers while in MultiRoute. At all.



Below is the sample code I used to reproduce these problems. The same code works just fine on iOS 9.

Thank you!


Fesongs


SAMPLE CODE

#import "ViewController.h"
@import AudioToolbox;
@import Accelerate;
@import AVFoundation;

typedef struct soundStructArrayType {
    AudioUnit                           ioUnit;
    AudioStreamBasicDescription         hwsf;
    int                                 index;
} soundStructArrayType;

@interface ViewController ()
     @property (nonatomic, assign)   soundStructArrayType    soundStructArray;
     @property (nonatomic, assign)   AudioUnit               ioUnit;
     @property (nonatomic, strong)   NSString*               category;

     //Display AudioSession information on iPad
     @property (weak, nonatomic)     IBOutlet UITextView     *logTextView;
     @property (nonatomic, strong)   NSMutableString         *logHistory;
     @property (nonatomic, assign)   BOOL                    logNeedupdate;
     @property (nonatomic, strong)   NSTimer                 *logTimer;
@end

@implementation ViewController
static OSStatus   renderCallback (
                                  void                        *inRefCon,
                                  AudioUnitRenderActionFlags  *ioActionFlags,
                                  const AudioTimeStamp        *inTimeStamp,
                                  UInt32                      inBusNumber,
                                  UInt32                      inNumberFrames,
                                  AudioBufferList             *ioData
                                  )
{
    soundStructArrayType *context = (soundStructArrayType *)inRefCon;
    // Pull the input (bus 1) into the output buffers; bail out on failure.
    OSStatus status = AudioUnitRender(context->ioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, ioData);
    if (status != noErr) return status;
    // Duplicate channel 0 into channel 1 (buffers are non-interleaved); guard the buffer count.
    if (ioData->mNumberBuffers > 1) {
        memcpy(ioData->mBuffers[1].mData, ioData->mBuffers[0].mData, ioData->mBuffers[0].mDataByteSize);
    }
    return noErr;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    Float32 sampleRate = 44100.;
    //StreamFormat
    UInt32 bytesPerSample = sizeof(float);
    AudioStreamBasicDescription hardwareFormat = {0};
    hardwareFormat.mFormatID = kAudioFormatLinearPCM;
    hardwareFormat.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked | kAudioFormatFlagIsNonInterleaved | kAudioFormatFlagsNativeEndian;
    hardwareFormat.mBitsPerChannel = 8 * bytesPerSample;
    hardwareFormat.mFramesPerPacket = 1;
    hardwareFormat.mChannelsPerFrame = 2;
    hardwareFormat.mBytesPerPacket = bytesPerSample;
    hardwareFormat.mBytesPerFrame = bytesPerSample;
    hardwareFormat.mSampleRate = sampleRate;

    NSError *error = nil;
    OSStatus result = noErr;

    //Obtain a reference to the singleton audio session object for your application.
    AVAudioSession* mySession = [AVAudioSession sharedInstance];
    //Request a hardware sample rate. The system may or may not be able to grant the request, depending on other audio activity on the device.
    [mySession setPreferredSampleRate:sampleRate error:&error];
    //Request the audio session category you want. MultiRoute, specified here, supports routing audio to multiple output ports simultaneously.
    self.category = AVAudioSessionCategoryMultiRoute;
    [mySession setCategory:self.category error:&error];
    //Request activation of your audio session.
    [mySession setActive:YES error:&error];

    Float32 ioBufferDuration = 0.005;
    _soundStructArray.index = 0;
    [mySession setPreferredIOBufferDuration:ioBufferDuration error:&error];

    //Specify the AudioUnits you want
    AudioComponentDescription ioUnitDescription;
    ioUnitDescription.componentType = kAudioUnitType_Output;
    ioUnitDescription.componentSubType = kAudioUnitSubType_RemoteIO;
    ioUnitDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
    ioUnitDescription.componentFlags = 0;
    ioUnitDescription.componentFlagsMask = 0;

    //Build an audio processing graph
    AUGraph processingGraph;
    NewAUGraph(&processingGraph);
    AUNode ioNode;
    AUGraphAddNode(processingGraph, &ioUnitDescription, &ioNode);

    //Open Graph to initiate the audio Units
    AUGraphOpen(processingGraph);
    AUGraphNodeInfo(processingGraph, ioNode, NULL, &(_ioUnit));

    //Now the ioUnit variable holds a reference to the audio unit instance in the graph
    //Configure the AudioUnits. No need to configure the output, it is enabled by default.
    UInt32 one = 1;
    UInt32 maxFrPrSl = 2048;
    AudioUnitSetProperty(_ioUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1, &one, sizeof(one));
    AudioUnitSetProperty(_ioUnit, kAudioUnitProperty_MaximumFramesPerSlice, kAudioUnitScope_Global, 0, &maxFrPrSl, sizeof(maxFrPrSl));

    //Attaching a render callback in a thread-safe manner
    AURenderCallbackStruct callbackStruct;
    _soundStructArray.ioUnit = _ioUnit;
    callbackStruct.inputProc    = &renderCallback;
    callbackStruct.inputProcRefCon = &_soundStructArray;
    AUGraphSetNodeInputCallback(processingGraph, ioNode, 0, &callbackStruct);

    result = AudioUnitSetProperty(_ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Input, 0, &hardwareFormat, sizeof(hardwareFormat));
    AudioUnitSetProperty(_ioUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, 1, &hardwareFormat, sizeof(hardwareFormat));
    AUGraphInitialize(processingGraph);
    _soundStructArray.hwsf = hardwareFormat;
    AUGraphStart(processingGraph);
    CAShow(processingGraph);

    self.logHistory = [[NSMutableString alloc] init];
    self.logTimer = [NSTimer scheduledTimerWithTimeInterval:0.05 target:self selector:@selector(updateLog) userInfo:nil repeats:YES];

    [self registerForNotifications:mySession];
    [self displayAudioSessionInformations];
    [self multirouteAudioMappingOnAllAvailableOutput];
}

- (void)registerForNotifications:(AVAudioSession*)sessionInstance {
    NSNotificationCenter* notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(handleRouteChange:) name:AVAudioSessionRouteChangeNotification object:sessionInstance];
}

- (void)handleRouteChange:(NSNotification*)notification {
    AVAudioSession *sessionInstance = [AVAudioSession sharedInstance];
    NSString *message;
    // If the category changed (compare string contents, not pointers)
    if (![self.category isEqualToString:[sessionInstance category]]) {
        message = [NSString stringWithFormat:@"AUDIO SESSION CATEGORY CHANGED. New AudioSession Category : %@", [sessionInstance category]];
        [self audioSessionLog:message];
        self.category = [sessionInstance category];
    }
    [self multirouteAudioMappingOnAllAvailableOutput];
    [self displayAudioSessionInformations];
}

- (void)multirouteAudioMappingOnAllAvailableOutput {
    AVAudioSession *sessionInstance = [AVAudioSession sharedInstance];
    UInt32 outputNbChannels = (UInt32)sessionInstance.outputNumberOfChannels;
    if (outputNbChannels != 0) {
        UInt32* outputChannelMap = (UInt32*)calloc(outputNbChannels, sizeof(UInt32));

        //Mapping input on all outputs available.
        for (int i = 0; i<outputNbChannels; ++i) {
            outputChannelMap[i] = i%2;
        }
        OSStatus result = AudioUnitSetProperty(_ioUnit, kAudioOutputUnitProperty_ChannelMap, kAudioUnitScope_Output, 0, outputChannelMap, outputNbChannels*sizeof(UInt32));
        if (result) [self audioSessionLog:@"Error in setting output channel map"];
    }
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}


/////////////////// Display information
- (void)updateLog {
    if (!self.logNeedupdate) return;
    dispatch_async(dispatch_get_main_queue(), ^{
        self.logTextView.scrollEnabled = NO;
        self.logTextView.text = self.logHistory;
        self.logTextView.scrollEnabled = YES;
        if (self.logTextView.text.length > 0) {
            [self.logTextView scrollRangeToVisible:NSMakeRange(self.logTextView.text.length - 1, 1)];
        }
        self.logNeedupdate = NO;
    });
}
- (void)audioSessionLog:(NSString*)message {
    dispatch_async(dispatch_get_main_queue(), ^{
        [self.logHistory appendString:@"\n"];
        [self.logHistory appendString:message];
        self.logNeedupdate = YES;
    });
}
- (void) displayAudioSessionInformations {
    AVAudioSession *sessionInstance = [AVAudioSession sharedInstance];
    NSString *message;
    message = @"\n*************************************************************************************************************************************************************";
    [self audioSessionLog:message];
    message = @"Audio Session Information :\n";
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"AudioSession category : %@", [sessionInstance category]];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"AudioSession mode : %@", [sessionInstance mode]];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"AudioSession preferred sample rate : %f, sample rate : %f", sessionInstance.preferredSampleRate, sessionInstance.sampleRate];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"OutputVolume : %f, InputGain : %f, isInputGainSettable : %d", [sessionInstance outputVolume], [sessionInstance inputGain], (int)[sessionInstance isInputGainSettable]];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"IOBufferDuration : %f, InputLatency : %f, OutputLatency : %f", [sessionInstance IOBufferDuration], [sessionInstance inputLatency], [sessionInstance outputLatency]];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"Num Channels Input : %d, max : %d, preferred : %d", (int)[sessionInstance inputNumberOfChannels], (int)[sessionInstance maximumInputNumberOfChannels], (int)[sessionInstance preferredInputNumberOfChannels]];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"Num Channels Output : %d, max : %d, preferred : %d", (int)[sessionInstance outputNumberOfChannels], (int)[sessionInstance maximumOutputNumberOfChannels], (int)[sessionInstance preferredOutputNumberOfChannels]];
    [self audioSessionLog:message];
    message = @" ";
    [self audioSessionLog:message];
    NSArray *inputsArray        = [sessionInstance availableInputs];
    NSArray *inputSourcesArray  = [sessionInstance inputDataSources];
    NSArray *outputSourcesArray = [sessionInstance outputDataSources];
    message = [NSString stringWithFormat:@"Is input available? : %d", [sessionInstance isInputAvailable]];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"***** Number of input ports : %lu", (unsigned long)inputsArray.count];
    [self audioSessionLog:message];
    for (NSUInteger i=0; i<inputsArray.count; i++) {
        AVAudioSessionPortDescription *portDesc = [inputsArray objectAtIndex:i];
        message = [NSString stringWithFormat:@"     INPUT PORT %lu", (unsigned long)i];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          portName : %@", portDesc.portName];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          portType : %@", portDesc.portType];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          UID : %@", portDesc.UID];
        [self audioSessionLog:message];
        NSArray *channelsArray = portDesc.channels;
        message = [NSString stringWithFormat:@"          ***** Number of port channels : %lu", (unsigned long)channelsArray.count];
        [self audioSessionLog:message];
        for (NSUInteger j=0; j<channelsArray.count; j++) {
            AVAudioSessionChannelDescription *channel = [channelsArray objectAtIndex:j];
            message = [NSString stringWithFormat:@"               CHANNEL %lu", (unsigned long)j];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"               channelName : %@", channel.channelName];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"               channelNumber : %lu", (unsigned long)channel.channelNumber];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"               owningPortUID : %@", channel.owningPortUID];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"               channelLabel : %d", (unsigned int)channel.channelLabel];
            [self audioSessionLog:message];
        }
        NSArray *portSources = portDesc.dataSources;
        message = [NSString stringWithFormat:@"          ***** Number of port sources : %lu", (unsigned long)portSources.count];
        [self audioSessionLog:message];
        for (NSUInteger j=0; j<portSources.count; j++) {
            AVAudioSessionDataSourceDescription *source = [portSources objectAtIndex:j];
            message = [NSString stringWithFormat:@"               PORT DATASOURCE %lu", (unsigned long)j];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    dataSourceID : %@", source.dataSourceID];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    dataSourceName : %@", source.dataSourceName];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    location : %@", source.location];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    orientation : %@", source.orientation];
            [self audioSessionLog:message];
        }
    }
    message = @" ";
    [self audioSessionLog:message];
    AVAudioSessionPortDescription *portDesc = [sessionInstance preferredInput];
    message = @"***** Preferred input port";
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     portName : %@", portDesc.portName];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     portType : %@", portDesc.portType];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     UID : %@", portDesc.UID];
    [self audioSessionLog:message];
    message = @" ";
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"***** Number of input datasources : %lu", (unsigned long)inputSourcesArray.count];
    [self audioSessionLog:message];
    for (NSUInteger i=0; i<inputSourcesArray.count; i++) {
        AVAudioSessionDataSourceDescription *source = [inputSourcesArray objectAtIndex:i];
        message = [NSString stringWithFormat:@"     INPUT DATASOURCE %lu", (unsigned long)i];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          dataSourceID : %@", source.dataSourceID];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          dataSourceName : %@", source.dataSourceName];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          location : %@", source.location];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          orientation : %@", source.orientation];
        [self audioSessionLog:message];
    }
    AVAudioSessionDataSourceDescription *inputSource = [sessionInstance inputDataSource];
    message = @"***** Selected Input Datasource";
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     dataSourceID : %@", inputSource.dataSourceID];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     dataSourceName : %@", inputSource.dataSourceName];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     location : %@", inputSource.location];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     orientation : %@", inputSource.orientation];
    [self audioSessionLog:message];
    message = @" ";
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"***** Number of output sources : %lu", (unsigned long)outputSourcesArray.count];
    [self audioSessionLog:message];
    for (NSUInteger i=0; i<outputSourcesArray.count; i++) {
        AVAudioSessionDataSourceDescription *source = [outputSourcesArray objectAtIndex:i];
        message = [NSString stringWithFormat:@"     DATASOURCE %lu", (unsigned long)i];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"     dataSourceID : %@", source.dataSourceID];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"     dataSourceName : %@", source.dataSourceName];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"     location : %@", source.location];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"     orientation : %@", source.orientation];
        [self audioSessionLog:message];
    }
    AVAudioSessionDataSourceDescription *outputSource = [sessionInstance outputDataSource];
    message = @"***** Selected Output Datasource";
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     dataSourceID : %@", outputSource.dataSourceID];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     dataSourceName : %@", outputSource.dataSourceName];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     location : %@", outputSource.location];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"     orientation : %@", outputSource.orientation];
    [self audioSessionLog:message];
    message = @"\n";
    [self audioSessionLog:message];
    AVAudioSessionRouteDescription *route = [sessionInstance currentRoute];
    NSArray *currentInputsArray  = route.inputs;
    NSArray *currentOutputsArray = route.outputs;
    message = [NSString stringWithFormat:@"********** CURRENT ROUTE **********"];
    [self audioSessionLog:message];
    message = [NSString stringWithFormat:@"***** Number of input ports : %lu", (unsigned long)currentInputsArray.count];
    [self audioSessionLog:message];
    for (NSUInteger i=0; i<currentInputsArray.count; i++) {
        AVAudioSessionPortDescription *portDesc = [currentInputsArray objectAtIndex:i];
        message = [NSString stringWithFormat:@"     INPUT PORT %lu", (unsigned long)i];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          portName : %@", portDesc.portName];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          portType : %@", portDesc.portType];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          UID : %@", portDesc.UID];
        [self audioSessionLog:message];
        NSArray *channelsArray = portDesc.channels;
        message = [NSString stringWithFormat:@"          ***** Number of port channels : %lu", (unsigned long)channelsArray.count];
        [self audioSessionLog:message];
        for (NSUInteger j=0; j<channelsArray.count; j++) {
            AVAudioSessionChannelDescription *channel = [channelsArray objectAtIndex:j];
            message = [NSString stringWithFormat:@"               CHANNEL %lu", (unsigned long)j];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    channelName : %@", channel.channelName];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    channelNumber : %lu", (unsigned long)channel.channelNumber];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    owningPortUID : %@", channel.owningPortUID];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    channelLabel : %d", (int)channel.channelLabel];
            [self audioSessionLog:message];
        }
        NSArray *portSources = portDesc.dataSources;
        message = [NSString stringWithFormat:@"          ***** Number of port sources : %lu", (unsigned long)portSources.count];
        [self audioSessionLog:message];
        for (NSUInteger j=0; j<portSources.count; j++) {
            AVAudioSessionDataSourceDescription *source = [portSources objectAtIndex:j];
            message = [NSString stringWithFormat:@"               PORT DATASOURCE %lu", (unsigned long)j];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    dataSourceID : %@", source.dataSourceID];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    dataSourceName : %@", source.dataSourceName];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    location : %@", source.location];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    orientation : %@", source.orientation];
            [self audioSessionLog:message];
        }
        AVAudioSessionDataSourceDescription *source = [portDesc selectedDataSource];
        message = [NSString stringWithFormat:@"          ***** PORT SELECTED DATASOURCE"];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"               dataSourceID : %@", source.dataSourceID];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"               dataSourceName : %@", source.dataSourceName];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"               location : %@", source.location];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"               orientation : %@", source.orientation];
        [self audioSessionLog:message];
    }
    message = [NSString stringWithFormat:@"***** Number of output ports : %lu", (unsigned long)currentOutputsArray.count];
    [self audioSessionLog:message];
    for (NSUInteger i=0; i<currentOutputsArray.count; i++) {
        AVAudioSessionPortDescription *portDesc = [currentOutputsArray objectAtIndex:i];
        message = [NSString stringWithFormat:@"     OUTPUT PORT %lu", (unsigned long)i];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          portName : %@", portDesc.portName];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          portType : %@", portDesc.portType];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"          UID : %@", portDesc.UID];
        [self audioSessionLog:message];
        NSArray *channelsArray = portDesc.channels;
        message = [NSString stringWithFormat:@"          ***** Number of port channels : %lu", (unsigned long)channelsArray.count];
        [self audioSessionLog:message];
        for (NSUInteger j=0; j<channelsArray.count; j++) {
            AVAudioSessionChannelDescription *channel = [channelsArray objectAtIndex:j];
            message = [NSString stringWithFormat:@"               CHANNEL %lu", (unsigned long)j];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    channelName : %@", channel.channelName];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    channelNumber : %lu", (unsigned long)channel.channelNumber];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    owningPortUID : %@", channel.owningPortUID];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    channelLabel : %d", (unsigned int)channel.channelLabel];
            [self audioSessionLog:message];
        }
        NSArray *portSources = portDesc.dataSources;
        message = [NSString stringWithFormat:@"          ***** Number of port sources : %lu", (unsigned long)portSources.count];
        [self audioSessionLog:message];
        for (NSUInteger j=0; j<portSources.count; j++) {
            AVAudioSessionDataSourceDescription *source = [portSources objectAtIndex:j];
            message = [NSString stringWithFormat:@"               PORT DATASOURCE %lu", (unsigned long)j];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    dataSourceID : %@", source.dataSourceID];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    dataSourceName : %@", source.dataSourceName];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    location : %@", source.location];
            [self audioSessionLog:message];
            message = [NSString stringWithFormat:@"                    orientation : %@", source.orientation];
            [self audioSessionLog:message];
        }
        AVAudioSessionDataSourceDescription *source = [portDesc selectedDataSource];
        message = [NSString stringWithFormat:@"          ***** PORT SELECTED DATASOURCE"];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"               dataSourceID : %@", source.dataSourceID];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"               dataSourceName : %@", source.dataSourceName];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"               location : %@", source.location];
        [self audioSessionLog:message];
        message = [NSString stringWithFormat:@"               orientation : %@", source.orientation];
        [self audioSessionLog:message];
    }
    message = [NSString stringWithFormat:@"********** END CURRENT ROUTE **********"];
    [self audioSessionLog:message];
    message = @"\n*************************************************************************************************************************************************************\n";
    [self audioSessionLog:message];
}

@end
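The channel-map logic in multirouteAudioMappingOnAllAvailableOutput is the crux of MultiRoute routing, so here it is isolated as a plain-C sketch (the function name is mine, not an API): output channel i is fed from input channel i % 2, which repeats a stereo source across every available output pair.

```c
#include <assert.h>
#include <stdlib.h>

// Sketch of the channel map built in the sample (illustrative, not an API):
// entry i names the source channel feeding output channel i, so a stereo
// source is repeated across every pair of output channels.
static unsigned *makeStereoRepeatMap(unsigned outputChannels) {
    unsigned *map = calloc(outputChannels, sizeof(unsigned));
    if (!map) return NULL;
    for (unsigned i = 0; i < outputChannels; ++i) {
        map[i] = i % 2; // even outputs get left (0), odd outputs get right (1)
    }
    return map;
}
```

With a 2-channel USB interface plus stereo headphones (4 output channels) the map is {0, 1, 0, 1}; in the sample it is then handed to the remote I/O unit via kAudioOutputUnitProperty_ChannelMap.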

Replies

If the audio session category is changing out from under you, a first guess would be that media services are being reset. Are you listening for the AVAudioSessionMediaServicesWereResetNotification notification? It would be interesting to know whether that's the case.


Also, I noticed you're using the AVAudioSessionCategoryOptionAllowBluetooth option. When choosing MultiRoute, this option is ignored, as it only applies to AVAudioSessionCategoryPlayAndRecord and AVAudioSessionCategoryRecord. This shouldn't cause any problems with what you're doing since it is being ignored (there are no HFP Bluetooth routes in MultiRoute), but it's something to note.

Hello analogkid,


Thanks for the answer.

I'm working with Fesongs on this problem; we actually have huge issues with iOS 10 across all our apps using MultiRoute. The AllowBluetooth option was a mistake in this sample code; it's not present for MultiRoute in our real code (it has been corrected here).


Basically:

1. We can't access the headphones once a USB audio device has been plugged in, even though that is the main point of the MultiRoute category. As Fesongs said, the maximumOutputNumberOfChannels property is fine (it equals the number of outputs of the USB device + 2), but we never managed to raise the output number of channels to this maximum. outputNumberOfChannels gets stuck at the number of outputs of the USB device.

2. On an iPad Pro under iOS 10, when you set the audio session's category to MultiRoute, no sound comes out of the internal speaker, under any circumstances. If you build the code above on an iPad Pro under iOS 10, you won't get any sound without headphones or a USB audio device. If you build it on another iPad, the internal speaker works. Careful though: since this code routes input directly to output, you'll get a feedback loop (Larsen effect) between the internal microphones and the speaker telling you that it's working when no headphones are plugged in 😉


NB: All of this was working perfectly fine on iOS 8 and 9 (except for the MultiRoute category, which was not working on iOS 9.0 with the iPad Pro).


Thanks for your help!

Actually, we never managed to get setPreferredOutputNumberOfChannels: to work (on either iOS 9 or iOS 10) in MultiRoute, while it worked fine in other categories. But we never really had to. Now, however, outputNumberOfChannels differs from maximumOutputNumberOfChannels, and we can't set it as we want.


Here is a simple code showing how I use it.

    NSError *error = nil;
    AVAudioSession *sessionInstance = [AVAudioSession sharedInstance];
    self.category = AVAudioSessionCategoryMultiRoute;
    [sessionInstance setCategory:self.category error:&error];
    [sessionInstance setMode:AVAudioSessionModeDefault error:&error];
    [sessionInstance setPreferredIOBufferDuration:.005 error:&error];
    [sessionInstance setPreferredSampleRate:44100. error:&error];
    [sessionInstance setActive:YES error:&error];

    error = nil;
    if (![sessionInstance setPreferredOutputNumberOfChannels:1 error:&error]) {
        NSLog(@"Couldn't set preferredNumberOfChannels. %@", error);
    } else {
        NSLog(@"Preferred output number of channels : %li", (long)[sessionInstance preferredOutputNumberOfChannels]);
    }


In the MultiRoute category, I get the following output:

2016-09-15 17:06:46.048511 testAudioSetupV3[1865:491198] Couldn't set preferredNumberOfChannels. Error Domain=NSOSStatusErrorDomain Code=-50 "(null)"


In PlayAndRecord, SoloAmbient, or Playback, I get:

2016-09-15 17:09:30.420319 testAudioSetupV3[1869:491841] Preferred output number of channels : 1


Even if I try to set preferredOutputNumberOfChannels to the actual current number of channels ([mySession outputNumberOfChannels]), I get an error in MultiRoute.
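For what it's worth, the -50 in that NSError is the classic Core Audio parameter error (paramErr, a.k.a. kAudio_ParamError): the session rejected one of the arguments outright. A tiny helper (mine, purely illustrative, not part of any framework) makes such status codes readable in logs:

```c
#include <assert.h>
#include <string.h>

// Map a few common Core Audio OSStatus values to readable names.
// Values are from Apple's headers; the helper itself is illustrative.
static const char *describeStatus(int status) {
    switch (status) {
        case 0:   return "noErr";
        case -50: return "paramErr / kAudio_ParamError (bad argument)";
        default:  return "unrecognized OSStatus";
    }
}
```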


Concerning media services, I tried observing AVAudioSessionMediaServicesWereResetNotification as you suggested, and indeed the category change occurs when media services are reset. But given the issues we encountered with outputNumberOfChannels, I think the problem lies at another level.


Thanks again for your help

Fesongs

Hi Thomas, can you file issues 1 and 2 as separate bugs so we can take a look? Please include your test app as something we can build and run. For the USB issue, if it happens with any USB device, great; if it requires a specific device or a specific hardware configuration, please include that information in the bug as well.


Regarding the second point: it's normal for the internal speaker to be unavailable when other outputs are present, but you're reporting no audio on one device even when no other outputs are available, while other devices behave as expected, which is completely not what we would expect. The "no audio" issue would be considered high priority, so please file bugs as soon as you can.


Finally, if you are seeing media services being reset, file that as a separate issue. The problem should generate some logging information; logs plus the code that causes the resets would be great. Again, these reset issues would be considered high priority, so we'd like to take a look right away.


Post the bug IDs in this thread and I'll make sure they are routed correctly. Thanks!

Hello,


Thank you for your answer.


Yes, the USB issues happen with all the USB audio devices we have.


I know that the internal speaker works only when there is no other audio output available. The problem here is that the internal speaker doesn't work AT ALL with the MultiRoute audio session under iOS 10 on numerous devices, although it works fine under iOS 8 and 9.


I just filed 3 bug reports for all the problems we have with iOS 10: 28361060, 28363097 and 28363203.


Plus, I filed another bug report for another audio-related problem: 28363333.


Just so you know, we have several commercial music apps that use the MultiRoute audio session (in order to support a USB sound card and headphones at the same time), and we received tons of mail from our customers because their app suddenly stopped making sound from the internal speaker after the upgrade to iOS 10. We had to temporarily downgrade some functionality to support iOS 10; this is kind of a critical situation.


Thank you for routing all these bug reports so that they can be treated as fast as possible.

Thanks Thomas, I commented on and routed them earlier today. I'm tracking the issues, so if there's any other information we need, or any workarounds you can implement sooner rather than later, I'll let you know.

I'm seeing something a little different. I'm running on an iPad Air 2 with iOS 10.0.1 and compiling with the iOS 10 SDK. For me, sound comes from the internal speaker if I specifically route it to the speaker port. It used to come out of the speaker if the session was MultiRoute but no channel map was assigned. From the user's perspective, this means the app used to automatically use the internal speakers if a USB interface was not connected, but now users have to explicitly select the internal speakers in the app settings.


By the way, my app has an option to use either AVAudioEngine or AVAudioPlayer, and this problem occurs with both.
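For reference, the explicit routing mentioned above can be sketched like this. This is a sketch under assumptions: `engine` is a hypothetical running AVAudioEngine instance, and the map follows the usual kAudioOutputUnitProperty_ChannelMap convention of one entry per hardware output channel, with -1 meaning silence.

```objc
// Sketch: find the built-in speaker's first hardware channel in the current
// route, then map app channels 0/1 onto it via the output unit's channel map.
AVAudioSession *session = [AVAudioSession sharedInstance];
NSInteger speakerChannel = -1, offset = 0;
for (AVAudioSessionPortDescription *port in session.currentRoute.outputs) {
    if ([port.portType isEqualToString:AVAudioSessionPortBuiltInSpeaker]) {
        speakerChannel = offset; // first channel of the speaker port
        break;
    }
    offset += (NSInteger)port.channels.count;
}
NSInteger hwChannels = [session maximumOutputNumberOfChannels];
if (speakerChannel >= 0 && hwChannels > 0) {
    SInt32 *map = calloc((size_t)hwChannels, sizeof(SInt32));
    for (NSInteger i = 0; i < hwChannels; i++) map[i] = -1;  // -1 = silence
    map[speakerChannel] = 0;                                 // app L -> speaker
    if (speakerChannel + 1 < hwChannels) map[speakerChannel + 1] = 1; // app R
    AudioUnitSetProperty(engine.outputNode.audioUnit,
                         kAudioOutputUnitProperty_ChannelMap,
                         kAudioUnitScope_Output, 0,
                         map, (UInt32)(hwChannels * sizeof(SInt32)));
    free(map);
}
```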

Actually this seems inconsistent now. Sound sometimes comes out the speakers with multi-route on, but I haven't been able to find a setup or sequence of events that makes it consistently work or not work. Hopefully Apple's engineers will be able to see what the problem is.

I have a similar problem using AVAudioSessionCategoryMultiRoute on iOS 10. Works perfectly on iOS 9 and 8, but no sound can be heard playing out of the speaker on iOS 10. rdar://28408849

Hi thomas,


Some news for you regarding the MultiRoute issues:

  • We did identify a bug in the implementation of outputNumberOfChannels in the context of MultiRoute. If the application's category is set to MultiRoute and the session is active, the expectation is that outputNumberOfChannels and maximumOutputNumberOfChannels should be the same value, and this is not the case. Modifying your code to use maximumOutputNumberOfChannels as a workaround should get you going in the interim until outputNumberOfChannels is fixed to correctly report its value in MultiRoute.
  • We have been able to reproduce the media server crashing issue quite easily and are investigating further for a future fix. Our recommendation is that you adopt the use of the AVAudioSessionMediaServicesWereResetNotification as discussed in Q&A1749. Not a complete solution, but your app should respond to this notification anyway and it will provide a recovery mechanism.
  • We have also been able to reproduce the "no sound on internal speaker" issue. But this one seems to be in the plumbing and is also being investigated further for a future fix.
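A minimal sketch of the suggested workaround (assuming the session is already active in the MultiRoute category):

```objc
// Workaround sketch: in MultiRoute on iOS 10, outputNumberOfChannels
// under-reports, so size the channel map from maximumOutputNumberOfChannels.
AVAudioSession *session = [AVAudioSession sharedInstance];
NSInteger channelCount = [session outputNumberOfChannels];
if ([session.category isEqualToString:AVAudioSessionCategoryMultiRoute]) {
    channelCount = [session maximumOutputNumberOfChannels];
}
// Use channelCount when allocating the channel map for the I/O unit.
```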


thanks!

It sounds like Apple have been able to reproduce these bugs and are (hopefully) on the case, but I'll add my experience:


Hardware: iPhone 5

OS: 10.0.2

Audio Session Type: AVAudioSessionCategoryMultiRoute


Expected: When no headphones or external audio devices are attached, audio should output via internal speaker (as in iOS 8 and 9).

Actual: No audio from internal speaker.


I tried AVAudioSessionCategoryOptionDefaultToSpeaker but that makes no difference, as expected (docs say it's only for AVAudioSessionCategoryPlayAndRecord).


Internal speakers work fine if audio category is AVAudioSessionCategoryPlayAndRecord.
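For anyone comparing categories, the working PlayAndRecord setup can be sketched as follows (a minimal sketch; DefaultToSpeaker is documented to apply only to PlayAndRecord):

```objc
// Minimal sketch: PlayAndRecord with DefaultToSpeaker reaches the
// built-in speaker, unlike MultiRoute on iOS 10.
NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
if (![session setCategory:AVAudioSessionCategoryPlayAndRecord
              withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
                    error:&error]) {
    NSLog(@"setCategory failed: %@", error);
}
[session setActive:YES error:&error];
```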

I have the same issue here.

I recently changed my app to use maximumOutputNumberOfChannels instead of outputNumberOfChannels.

In the MultiRoute category, depending on the USB audio device, either only the headphones work or only the USB device works, but never both together.
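When diagnosing which outputs the session actually exposes, dumping the current route can help; a sketch:

```objc
// Diagnostic sketch: log every output port and channel in the current route.
for (AVAudioSessionPortDescription *port in
     [AVAudioSession sharedInstance].currentRoute.outputs) {
    NSLog(@"port %@ (%@):", port.portName, port.portType);
    for (AVAudioSessionChannelDescription *ch in port.channels) {
        NSLog(@"  %@ (hw channel %lu)", ch.channelName,
              (unsigned long)ch.channelNumber);
    }
}
```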

Dear Apple Staff,


I just checked whether there was anything new about these bugs in iOS 10.1.1. I noticed that:

- The "no sound on internal speaker" bug seems to have been solved (not sure in which version though, maybe a previous one).

- The other MultiRoute bugs are still present, especially the impossibility of using the headphone jack and USB audio at the same time, which is basically the one and only purpose of MultiRoute.


Any chance we'll get back a functioning MultiRoute anytime soon?


Best


Thomas

Correction: the "no sound on internal speaker" bug is not solved on every device. It is still present at least on the iPhone 6.

iOS 10.2 beta 3? I see your bugs for "no sound on internal speaker" and the "USB + headphone" issue were being held for the beta seed that we just released today. I assume they will be sent back to you to verify shortly.


Testing with the latest seed and updating the bugs accordingly would be helpful.