Low-level audio recording on macOS. Please help!

I am struggling to see why the following low-level audio recording function fails to work under macOS. It is based on TN2091, Device input using the HAL Output Audio Unit (a great article, by the way, although a bit dated; it would be wonderful if it were updated to use Swift and non-deprecated APIs at some point):

func createMicUnit() -> AUAudioUnit {
    let compDesc = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_HALOutput, // I am on macOS, so this is good
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0, componentFlagsMask: 0)
    return try! AUAudioUnit(componentDescription: compDesc, options: [])
}

func startMic() {
    // mic permission is already granted at this point, but let's check
    let status = AVCaptureDevice.authorizationStatus(for: AVMediaType.audio)
    precondition(status == .authorized)     // yes, all good
    let unit = createMicUnit()
    unit.isInputEnabled = true
    unit.isOutputEnabled = false
    precondition(!unit.canPerformInput)     // can't record yet - and here's why:
    print(deviceName(unit.deviceID))        // "MacBook Pro Speakers" - this is why
    let micDeviceID = defaultInputDeviceID
    print(deviceName(micDeviceID))          // "MacBook Pro Microphone" - this is better
    try! unit.setDeviceID(micDeviceID)      // let's switch device to mic
    precondition(unit.canPerformInput)      // now we can record
    print("\(String(describing: unit.channelMap))") // channel map is "nil" by default
    unit.channelMap = [0]                   // not sure if this helps or not
    let sampleRate = deviceActualFrameRate(micDeviceID)
    print(sampleRate)                       // 48000.0
    let format = AVAudioFormat(
        commonFormat: .pcmFormatFloat32, sampleRate: sampleRate,
        channels: 1, interleaved: false)!
    try! unit.outputBusses[1].setFormat(format)

    unit.inputHandler = { flags, timeStamp, frameCount, bus in
        fatalError("never gets here")       // now the weird part - this is never called!
    }
    try! unit.allocateRenderResources()
    try! unit.startHardware()               // let's go!
    print("mic should be working now... why it doesn't?")
    // from now on the (UI) app continues its normal run loop
}
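
For reference, deviceName, defaultInputDeviceID and deviceActualFrameRate are just thin wrappers around HAL property queries, roughly along these lines (a sketch with error handling omitted; the real deviceActualFrameRate might query kAudioDevicePropertyActualSampleRate rather than the nominal rate):

import CoreAudio

var defaultInputDeviceID: AudioDeviceID {
    var deviceID = AudioDeviceID(0)
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain) // kAudioObjectPropertyElementMaster before macOS 12
    AudioObjectGetPropertyData(AudioObjectID(kAudioObjectSystemObject), &address, 0, nil, &size, &deviceID)
    return deviceID
}

func deviceName(_ deviceID: AudioDeviceID) -> String {
    var name: Unmanaged<CFString>?
    var size = UInt32(MemoryLayout<Unmanaged<CFString>?>.size)
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioObjectPropertyName,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &name)
    if let cfName = name?.takeRetainedValue() { return cfName as String }
    return "unknown"
}

func deviceActualFrameRate(_ deviceID: AudioDeviceID) -> Double {
    var rate = Float64(0)
    var size = UInt32(MemoryLayout<Float64>.size)
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyNominalSampleRate,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &rate)
    return rate
}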

All sanity checks pass with flying colors, but the unit's inputHandler is never called. Any idea why?
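
Side note: as I understand it, with the newer API the input handler only signals that input is available; the samples themselves are pulled through the unit's renderBlock. Once the handler fires, the tail of startMic() would look roughly like this (a sketch reusing unit and format from above):

try! unit.allocateRenderResources()
let renderBlock = unit.renderBlock          // grab it once render resources exist
unit.inputHandler = { flags, timeStamp, frameCount, bus in
    // In real code this buffer would be preallocated outside the handler.
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return }
    buffer.frameLength = frameCount
    // Pull the freshly captured input from bus 1 into the buffer.
    let err = renderBlock(flags, timeStamp, frameCount, bus, buffer.mutableAudioBufferList, nil)
    if err == noErr {
        // buffer.floatChannelData![0] now holds frameCount mono Float32 samples.
    }
}
try! unit.startHardware()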

Thank you!

Looks like an Apple audio bug to me.

The first version uses the older API and works OK (in that it calls the callback):

func startMic1(_ mic: AudioDeviceID) {
    var desc = AudioComponentDescription(componentType: kAudioUnitType_Output, componentSubType: kAudioUnitSubType_HALOutput,
        componentManufacturer: kAudioUnitManufacturer_Apple, componentFlags: 0, componentFlagsMask: 0)
    let comp = AudioComponentFindNext(nil, &desc)
    var unit: AudioUnit!
    AudioComponentInstanceNew(comp!, &unit)
    var enable: UInt32 = 1
    AudioUnitSetProperty(unit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1, &enable, 4)
    var disable: UInt32 = 0
    AudioUnitSetProperty(unit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, 0, &disable, 4)
    var device = mic
    AudioUnitSetProperty(unit, kAudioOutputUnitProperty_CurrentDevice, kAudioUnitScope_Global, 0, &device, 4) // AudioDeviceID is 4 bytes
    var cb = AURenderCallbackStruct(inputProc: { _, _, _, _, _, _ in
        fatalError("callback")
    }, inputProcRefCon: nil)
    AudioUnitSetProperty(unit, kAudioOutputUnitProperty_SetInputCallback, kAudioUnitScope_Global, 1, &cb, 16)
    AudioUnitInitialize(unit)
    AudioOutputUnitStart(unit)
}
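
The fatalError in the callback above is only there to prove that the callback fires. In a real app the input proc would fetch the captured samples with AudioUnitRender, roughly like this (a sketch; the gInputUnit global is hypothetical, just so the C-convention callback can reach the unit without capturing anything):

import AudioToolbox

var gInputUnit: AudioUnit?   // hypothetical: set this to `unit` in startMic1 before starting

let inputProc: AURenderCallback = { _, ioActionFlags, inTimeStamp, _, inNumberFrames, _ in
    guard let unit = gInputUnit else { return noErr }
    // One mono buffer; mData == nil lets the output unit supply its own storage.
    var bufferList = AudioBufferList(
        mNumberBuffers: 1,
        mBuffers: AudioBuffer(mNumberChannels: 1,
                              mDataByteSize: inNumberFrames * 4,
                              mData: nil))
    // Pull the input just captured on element 1.
    let err = AudioUnitRender(unit, ioActionFlags, inTimeStamp, 1, inNumberFrames, &bufferList)
    // On success, bufferList.mBuffers.mData points at the samples.
    return err
}
// ...and in startMic1: var cb = AURenderCallbackStruct(inputProc: inputProc, inputProcRefCon: nil)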

The second version uses the newer API and doesn't work (the callback is never called):

func startMic2(_ mic: AudioDeviceID) {
    let desc = AudioComponentDescription(componentType: kAudioUnitType_Output, componentSubType: kAudioUnitSubType_HALOutput,
        componentManufacturer: kAudioUnitManufacturer_Apple, componentFlags: 0, componentFlagsMask: 0)
    let unit = try! AUAudioUnit(componentDescription: desc, options: [])
    unit.isInputEnabled = true
    unit.isOutputEnabled = false
    try! unit.setDeviceID(mic)
    unit.inputHandler = { _, _, _, _ in
        fatalError("callback")
    }
    try! unit.allocateRenderResources()
    try! unit.startHardware()
}

Error handling removed for brevity.

Same issue under iOS.

The older API works (it calls the callback):

func startMic1() {
    let session = AVAudioSession.sharedInstance()
    try! session.setCategory(.playAndRecord, mode: .default)
    try! session.setPreferredSampleRate(44100)
    try! session.setPreferredInputNumberOfChannels(1)
    try! session.setPreferredIOBufferDuration(0.1)
    try! session.setActive(true)
    
    var desc = AudioComponentDescription(componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_RemoteIO,
        componentManufacturer: kAudioUnitManufacturer_Apple, componentFlags: 0, componentFlagsMask: 0)
    let comp = AudioComponentFindNext(nil, &desc)
    var unit: AudioUnit!
    var err = AudioComponentInstanceNew(comp!, &unit)
    precondition(err == noErr)
    var enable: UInt32 = 1
    err = AudioUnitSetProperty(unit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, 1, &enable, 4)
    precondition(err == noErr)
    var disable: UInt32 = 0
    err = AudioUnitSetProperty(unit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Output, 0, &disable, 4)
    precondition(err == noErr)
    var cb = AURenderCallbackStruct(inputProc: { _, _, _, _, _, _ in
        fatalError("callback")
    }, inputProcRefCon: nil)
    err = AudioUnitSetProperty(unit, kAudioOutputUnitProperty_SetInputCallback, kAudioUnitScope_Global, 1, &cb, 16)
    precondition(err == noErr)
    err = AudioUnitInitialize(unit)
    precondition(err == noErr)
    err = AudioOutputUnitStart(unit)
    precondition(err == noErr)
    // from now on the (UI) app continues its normal run loop
}

The newer API doesn't work (the callback is never called):

func startMic2() {
    let session = AVAudioSession.sharedInstance()
    try! session.setCategory(.playAndRecord, mode: .default)
    try! session.setPreferredSampleRate(44100)
    try! session.setPreferredInputNumberOfChannels(1)
    try! session.setPreferredIOBufferDuration(0.1)
    try! session.setActive(true)
    
    let compDesc = AudioComponentDescription(componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_RemoteIO,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0, componentFlagsMask: 0)
    let unit = try! AUAudioUnit(componentDescription: compDesc, options: [])
    
    unit.isInputEnabled = true
    unit.isOutputEnabled = false
    precondition(unit.canPerformInput)
    unit.inputHandler = { flags, timeStamp, frameCount, bus in
        fatalError("never gets here")       // now the weird part - this is never called!
    }
    try! unit.allocateRenderResources()
    try! unit.startHardware()               // let's go!
    print("mic should be working now... why it doesn't?")
    // from now on the (UI) app continues its normal run loop
}

Hahaha, I found a stupid mistake above; closing the issue.
