I'm having the same issue. The docs say setAccessoryButtonEventsEnabled should be enabled by default, but when on a PTT channel neither AirPods nor a Bluetooth speaker responds to the play/pause actions. I can't find any example, and the tutorials just say to enable CoreBluetooth, which I've done. Do you have any guidance on how to get the expected behaviour of triggering mute/unmute from Bluetooth device actions?
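For context, this is roughly how I'm enabling it after joining the channel (a sketch; `channelManager` and `channelUUID` stand in for my actual PTChannelManager instance and the UUID I joined with):

```swift
import PushToTalk

// Sketch: explicitly enabling accessory button events for the joined
// channel, even though the docs say this should be on by default.
channelManager.setAccessoryButtonEventsEnabled(true, channelUUID: channelUUID) { error in
    if let error = error {
        print("Failed to enable accessory button events: \(error)")
    }
}
```

Even with this call succeeding (no error in the completion handler), the accessory buttons don't trigger begin/end transmission.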
A bit more detail on the problem: I'm using LiveKit, and it looks like the SDK configures parts of the audio session itself. Currently the Bluetooth accessories behave as they would on an audio call (mute/unmute) rather than on a PTT channel (where I would expect transmit/stop transmitting).
Here is the default configuration the LiveKit SDK applies:
https://github.com/livekit/client-sdk-swift/blob/1f5959f787805a4b364f228ccfb413c1c4944748/Sources/LiveKit/Track/AudioManager.swift#L153
DispatchQueue.webRTC.async { [weak self] in
    guard let self = self else { return }
    // prepare config
    let configuration = RTCAudioSessionConfiguration.webRTC()
    var categoryOptions: AVAudioSession.CategoryOptions = []

    if newState.trackState == .remoteOnly && newState.preferSpeakerOutput {
        configuration.category = AVAudioSession.Category.playback.rawValue
        configuration.mode = AVAudioSession.Mode.spokenAudio.rawValue
    } else if [.localOnly, .localAndRemote].contains(newState.trackState) ||
        (newState.trackState == .remoteOnly && !newState.preferSpeakerOutput) {
        configuration.category = AVAudioSession.Category.playAndRecord.rawValue
        if newState.preferSpeakerOutput {
            // use .videoChat if speakerOutput is preferred
            configuration.mode = AVAudioSession.Mode.videoChat.rawValue
        } else {
            // use .voiceChat if speakerOutput is not preferred
            configuration.mode = AVAudioSession.Mode.voiceChat.rawValue
        }
        categoryOptions = [.allowBluetooth, .allowBluetoothA2DP]
    } else {
        configuration.category = AVAudioSession.Category.soloAmbient.rawValue
        configuration.mode = AVAudioSession.Mode.default.rawValue
    }
    configuration.categoryOptions = categoryOptions

    var setActive: Bool?
    if newState.trackState != .none, oldState.trackState == .none {
        // activate audio session when there is any local/remote audio track
        setActive = true
    } else if newState.trackState == .none, oldState.trackState != .none {
        // deactivate audio session when there are no more local/remote audio tracks
        setActive = false
    }

    // configure session
    let session = RTCAudioSession.sharedInstance()
    session.lockForConfiguration()
    // always unlock
    defer { session.unlockForConfiguration() }

    do {
        self.log("configuring audio session category: \(configuration.category), mode: \(configuration.mode), setActive: \(String(describing: setActive))")
        if let setActive = setActive {
            try session.setConfiguration(configuration, active: setActive)
        } else {
            try session.setConfiguration(configuration)
        }
    } catch let error {
        self.log("Failed to configure audio session with error: \(error)", .error)
    }
}
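For what it's worth, this is the kind of override I would attempt after connecting (a sketch only; since AVAudioSession.Mode has no push-to-talk case, I'm guessing that .voiceChat with .allowBluetooth is the closest combination, and I don't know if it's what the PTT framework expects):

```swift
import AVFoundation

// Sketch of reconfiguring the audio session after LiveKit's setup.
// AVAudioSession.Mode has no push-to-talk case, so .voiceChat with
// .allowBluetooth is my best guess at the closest combination.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord,
                            mode: .voiceChat,
                            options: [.allowBluetooth])
    try session.setActive(true)
} catch {
    print("Audio session override failed: \(error)")
}
```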
It's possible to override this configuration, but there is no enum case for push to talk. Can I get any help? It has been very hard to get to a working example of what's claimed in the documentation: "accessory button events map to begin and end transmission actions".
https://developer.apple.com/documentation/pushtotalk/ptchannelmanager/setaccessorybuttoneventsenabled%28_:channeluuid:completionhandler:%29?language=objc
Thank you