I’m trying to use AVCaptureSession on macOS to capture audio from a microphone and supply it to a custom function via the AVCaptureAudioDataOutputSampleBufferDelegate callback. When I run the code with my AirPods selected as the default microphone, the captured audio is just a buzzing noise, and any audio playing to the AirPods turns to static. I’ve tried various settings in the output’s audioSettings property, including changing the AVSampleRateKey value to 16000, 44100, and 48000, and see no change in the problem.
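For reference, this is the shape of the settings dictionary I was varying (only the AVSampleRateKey value changes between runs; the rest matches the full repro below):

```swift
// One of the audioSettings variants tried; only AVSampleRateKey changes between runs.
let audioOutputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVSampleRateKey: 16000,          // also tried 44100 and 48000
    AVNumberOfChannelsKey: 1,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsFloatKey: true,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsNonInterleaved: true,
]
```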
What’s more, the problem persists with the microphone even after my code exits. I’m able to get the audio output back to normal, but the microphone keeps producing only buzzing static. The only fix seems to be disconnecting and reconnecting the AirPods.
This problem does not happen with the built-in microphone and speaker, an external microphone and headset, or a USB microphone. It only happens with AirPods and other Bluetooth devices.
Here’s some sample code I’m using to reproduce the problem. If you are playing audio to AirPods on macOS and start this code, the audio turns to static.
```swift
import Foundation
import AVFoundation
import CoreMediaIO
import CoreAudio

class AudioCaptureTest: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {

    private let capture_session = AVCaptureSession()

    override init() {
        super.init()

        AVCaptureDevice.requestAccess(for: .audio) { granted in
            if granted {
                print("Granted access")
            }
        }

        // Use the default audio capture device (the AirPods when they are selected as input).
        let capture_device = AVCaptureDevice.default(for: AVMediaType.audio)
        guard capture_device != nil else {
            print("Error: Can't get AVCaptureDevice")
            exit(1)
        }
        print("opened device \(capture_device?.localizedName ?? "unknown") at position \(capture_device?.position.rawValue ?? -1)")

        let device_input: AVCaptureDeviceInput?
        do {
            device_input = try AVCaptureDeviceInput(device: capture_device!)
        } catch {
            print("Error: Can't create AVCaptureDeviceInput")
            exit(1)
        }

        // Input
        capture_session.beginConfiguration()
        guard capture_session.canAddInput(device_input!) else {
            print("Error: Can't add input for AVCaptureSession")
            exit(1)
        }
        capture_session.addInput(device_input!)

        // Output
        let capture_output = AVCaptureAudioDataOutput()
        let capture_queue = DispatchQueue(label: "device_queue", attributes: [])
        capture_output.setSampleBufferDelegate(self, queue: capture_queue)

        // Request mono 32-bit float linear PCM; the sample rate is the value I've been varying.
        let audioOutputSettings: [String: Any] = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 44100,
            AVNumberOfChannelsKey: 1,
            AVLinearPCMIsBigEndianKey: 0,
            AVLinearPCMIsFloatKey: true,
            AVLinearPCMBitDepthKey: 32,
            AVLinearPCMIsNonInterleaved: true,
        ]
        capture_output.audioSettings = audioOutputSettings

        guard capture_session.canAddOutput(capture_output) else {
            print("Error: Can't add output for AVCaptureSession")
            exit(1)
        }
        capture_session.addOutput(capture_output)

        capture_session.commitConfiguration()
    }

    func start() {
        print("Starting capture session")
        capture_session.startRunning()
    }

    func stop() {
        print("Stopping capture session")
        capture_session.stopRunning()
    }

    // Delegate callback; intentionally left empty, the problem occurs even without touching the buffers.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
    }
}

var capture_test = AudioCaptureTest()
capture_test.start()

// Keep running until Ctrl-C, then stop the session cleanly.
signal(SIGINT, SIG_IGN)
let sig_src = DispatchSource.makeSignalSource(signal: SIGINT)
sig_src.setEventHandler {
    print("Stopping AudioCaptureTest")
    capture_test.stop()
    exit(0)
}
sig_src.resume()
RunLoop.main.run()
```
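In case it helps anyone reproducing this, here’s a minimal sketch (not part of my original test) of logging that could go inside the captureOutput callback to check what format is actually being delivered, using CMSampleBufferGetFormatDescription and CMAudioFormatDescriptionGetStreamBasicDescription:

```swift
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Sketch: print the delivered format so it can be compared against
    // the requested audioSettings (sample rate, channels, bit depth).
    if let desc = CMSampleBufferGetFormatDescription(sampleBuffer),
       let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(desc)?.pointee {
        print("delivered: \(asbd.mSampleRate) Hz, \(asbd.mChannelsPerFrame) ch, " +
              "\(asbd.mBitsPerChannel)-bit, \(CMSampleBufferGetNumSamples(sampleBuffer)) frames")
    }
}
```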