Select the audio device for AVAudioInputNode

I'm using AVAudioEngine for a very simple mixing application, but I'd like to be able to use arbitrary audio devices for input (like USB devices) and not just the system-selected device. Is there any simple way to do this?


Ideally I'd like to be able to mix multiple hardware sources at the same time but I'd be happy at this point to just be able to change the device behind the one AVAudioInputNode instance.


Thanks for any suggestions.

Replies

On iOS the AVAudioInputNode and AVAudioOutputNode provide the devices appropriate to the app's AVAudioSession category configuration for input and output. On macOS, these nodes communicate with the system's default input and output. (AVAudioIONode.h)

Thanks for the reply. I should have specified that this is for macOS.


I am aware that AVAudioInputNode is fixed to the system's default input, and that's the frustration behind my question. Is there any way to change the device that AVAudioInputNode is connected to? It appears to be an aggregate audio unit, but my knowledge of audio units isn't strong enough to know what to do with that information.


I also tried to use AVAudioUnit with an audio unit connected to another input device, but I wasn't able to make that work either.

[central] 54: ERROR: >avae> AVAudioEngineGraph.mm:1235: AddNode: required condition is false: inImpl != nil && !IsIONode(inAVNode)


Should it be possible to use AVAudioUnit in this way?

There's no API at the AVFoundation level to change the system default device, but there is at the AudioObject level: AudioObjectSetPropertyData(). In AudioHardware.h, look under the AudioSystemObject properties, where there are properties listed for DefaultInput, SystemOutput, and DefaultOutput.
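As a minimal sketch of that AudioObject-level API (my own illustration, not code from this thread), reading the current default input device looks like this; setting it works symmetrically via AudioObjectSetPropertyData() with the same property address:

```swift
import CoreAudio

// Hedged sketch: query the system object for its current default input
// device using the AudioObject-level API mentioned above.
func defaultInputDeviceID() -> AudioDeviceID? {
    var deviceID = kAudioObjectUnknown
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMaster)
    let status = AudioObjectGetPropertyData(
        AudioObjectID(kAudioObjectSystemObject), &address, 0, nil,
        &size, &deviceID)
    return status == noErr ? deviceID : nil
}
```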


Regarding your question about AVAudioUnit, I don't really know what you're asking or attempting to do. An AVAudioUnit is an object that can wrap an AudioUnit component. It's also used as the base class for other nodes: AVAudioUnitEffect, for example, inherits from AVAudioUnit and represents audio units of type kAudioUnitType_Effect.

The error appears to be caused by an attempt to add a node that is an I/O node. You don't add I/O nodes to the engine; you get them from properties. For example, when you first get the mainMixerNode, an output node is implicitly created for you by the engine and connected to the main mixer. To get the input node you call inputNode. You don't then have to add these nodes to the engine again.
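A minimal sketch of that pattern (my own illustration, not from the sample code):

```swift
import AVFoundation

// I/O nodes come from engine properties; you never attach them yourself.
let engine = AVAudioEngine()

// Accessing these lazily creates the underlying I/O units; getting
// mainMixerNode also implicitly creates and connects the output node.
let input = engine.inputNode
let mixer = engine.mainMixerNode

// Wire the default input into the main mixer. (Guarding on channel
// count avoids a zero-channel format on machines with no input device.)
let format = input.inputFormat(forBus: 0)
if format.channelCount > 0 {
    engine.connect(input, to: mixer, format: format)
}

// try engine.start()  // start once the graph is wired up
```

Only nodes you create yourself (players, effects, extra mixers) go through engine.attach(_:).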


I suggest watching the WWDC sessions from 2014/15 that discuss AVAudioEngine and then seeing how the AVAEMixerSample works. That will give you a start.

I think the simplest way to explain what I'm trying to do is with a simple example. Assume there are two USB mics attached to the machine. I'm trying to get them both into the mixer on separate busses at the same time.


I was hoping I could instantiate the HAL audio unit with AVAudioUnit, set the device ID, configure it for input, and use it as one would use AVAudioInputNode. Does that make sense? I would imagine that AVAudioInputNode must be doing something similar.

The best thing to do is to file an enhancement request for better engine input support, explaining what you're attempting to do and what capabilities you expect from the input node. AVAudioInputNode does wrap kAudioUnitSubType_HALOutput, but you cannot wrap an I/O unit yourself and add it to the engine.

Iagnat, were you able to find the answer to your question? I too am unable to choose the channel from my external audio device that is connected to AVAudioEngine's inputNode.

I came across the same problem and found the solution on StackOverflow:

https://stackoverflow.com/questions/28781283/set-avaudioengine-input-and-output-devices

You will have to go one level deeper into Core Audio.


Translated to Swift, the code that worked for me was:


import AVFoundation
import AudioToolbox

var engine = AVAudioEngine()
let inputNode: AVAudioInputNode = engine.inputNode
// get the low-level input audio unit from the engine:
guard let inputUnit: AudioUnit = inputNode.audioUnit else { return }
// use a low-level Core Audio call to set the input device:
var inputDeviceID: AudioDeviceID = 219  // replace with an actual, dynamic value
let status = AudioUnitSetProperty(inputUnit, kAudioOutputUnitProperty_CurrentDevice,
                                  kAudioUnitScope_Global, 0, &inputDeviceID,
                                  UInt32(MemoryLayout<AudioDeviceID>.size))
// status should be noErr (0) on success
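To avoid hard-coding a device ID like the 219 above, a small helper (hypothetical, my own; not part of the Stack Overflow answer) can enumerate every device the HAL knows about:

```swift
import CoreAudio

// Hypothetical helper: list all AudioDeviceIDs currently known to the
// HAL, so the input device can be chosen at runtime rather than
// hard-coded.
func allAudioDeviceIDs() -> [AudioDeviceID] {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDevices,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMaster)

    // First ask for the size of the device list, then fetch it.
    var dataSize: UInt32 = 0
    var status = AudioObjectGetPropertyDataSize(
        AudioObjectID(kAudioObjectSystemObject), &address, 0, nil, &dataSize)
    guard status == noErr else { return [] }

    let count = Int(dataSize) / MemoryLayout<AudioDeviceID>.size
    var deviceIDs = [AudioDeviceID](repeating: 0, count: count)
    status = AudioObjectGetPropertyData(
        AudioObjectID(kAudioObjectSystemObject), &address, 0, nil,
        &dataSize, &deviceIDs)
    guard status == noErr else { return [] }
    return deviceIDs
}
```

You could then inspect each ID's kAudioDevicePropertyDeviceUID or its input streams to pick the device you want.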
I'm having a similar issue; I solved it by creating an aggregate device using the following code (AudioDevice is a custom struct).
   func createAggregateDevice(with devices: [AudioDevice]) throws -> AudioDeviceID {
     guard let firstDevice = devices.first else {
       throw MacOSAudioDevicesManagerError.unableToCreateAggregateDevice
     }

     let subDevicesList: [[String: Any]] = devices.map { device in
       [kAudioSubDeviceUIDKey: device.uid as CFString]
     }

     let desc: [String: Any] = [
       kAudioAggregateDeviceNameKey: "YourDeviceName",
       kAudioAggregateDeviceUIDKey: "YourDeviceUID",
       kAudioAggregateDeviceSubDeviceListKey: subDevicesList,
       kAudioAggregateDeviceMasterSubDeviceKey: firstDevice.uid,
       kAudioAggregateDeviceClockDeviceKey: firstDevice.uid
     ]

     var aggregateDeviceId: AudioDeviceID = 0
     let status = AudioHardwareCreateAggregateDevice(desc as CFDictionary, &aggregateDeviceId)
     guard status == noErr else {
       throw MacOSAudioDevicesManagerError.unableToCreateAggregateDevice
     }
     // aggregateDeviceId is the id of the new aggregate device
     return aggregateDeviceId
   }

Then I'm trying to set the default input device to this new device and use AVAudioEngine to stream the microphone input.
      var deviceId = device.id
      var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMaster)
      let statusCode = AudioObjectSetPropertyData(
        AudioObjectID(kAudioObjectSystemObject),
        &address,
        0,
        nil,
        UInt32(MemoryLayout<AudioDeviceID>.size),
        &deviceId
      )
      // statusCode should be noErr (0) on success