I'm working on an app that uses CoreHaptics to play a synchronised pattern of vibrations and audio. The problem is that the audio only plays through the iPhone's speakers (and only if the mute switch is off). As soon as I connect my AirPods to the phone, the audio stops playing, but the haptics continue.

My code looks something like this:

```swift
let engine = try CHHapticEngine()
...
var events = [CHHapticEvent]()
...
let volume: Float = 1
let decay: Float = 0.5
let sustained: Float = 0.5
let audioParameters = [
    CHHapticEventParameter(parameterID: .audioVolume, value: volume),
    CHHapticEventParameter(parameterID: .decayTime, value: decay),
    CHHapticEventParameter(parameterID: .sustained, value: sustained)
]

let breathingTimes = pacer.breathingTimeInSeconds
let combinedTimes = breathingTimes.inhale + breathingTimes.exhale
let audioEvent = CHHapticEvent(
    audioResourceID: selectedAudio,
    parameters: audioParameters,
    relativeTime: 0,
    duration: combinedTimes
)
events.append(audioEvent)
...
let pattern = try CHHapticPattern(events: events, parameterCurves: [])
let player = try engine.makeAdvancedPlayer(with: pattern)
...
try player.start(atTime: CHHapticTimeImmediate)
```

My idea of activating an audio session before the player starts, to signal to the system that audio is being played, also didn't change the outcome:

```swift
try AVAudioSession.sharedInstance().setActive(true)
```

Is there a way to route the audio from CoreHaptics to an output other than the integrated speakers?
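For context, here's a minimal, untested sketch of what I imagine a fix might look like: configuring the shared session with the `.playback` category before activating it (rather than activating the default category), then creating the engine with `CHHapticEngine(audioSession:)` (iOS 13+). Whether the engine's audio events actually follow that session's route to AirPods is exactly what I'm unsure about:

```swift
import AVFoundation
import CoreHaptics

// Configure the shared audio session explicitly instead of
// activating it with its default category.
let session = AVAudioSession.sharedInstance()

// .playback follows the current output route (e.g. AirPods) and
// keeps playing even when the mute switch is on.
try session.setCategory(.playback, mode: .default, options: [])
try session.setActive(true)

// Attach the haptic engine to this session; the assumption is that
// the pattern's audio events then use the session's output route.
let engine = try CHHapticEngine(audioSession: session)
try engine.start()
```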