Basic audio synchronization

I have an iOS music game (-ish) app that requires some basic synchronization between audio playback and other non-video events.

To keep it simple, let's say I have some sound effects in files, and I want to show a dot on the screen, with a high degree of accuracy, at the moment the user hears a sound effect. Playing through the built-in speaker is fine, but users will often be using AirPods, and the added latency is too noticeable.

I don't need this to be super-low latency, I just need to know what the latency is so that I can show the dot exactly when the user hears the sound.

I've mostly been experimenting in the AVAudioEngine/AVPlayerNode space. Those provide a variety of latency/ioBufferDuration properties (see below), but unfortunately they seem pretty inaccurate. Are they supposed to be?

Is there a better way to sync non-video things with audio playback?

```swift
// audioSession, engine, and playerNode are instance properties set up elsewhere.
func printLatencyInfo() {
    print("audioSession.inputLatency: \(audioSession.inputLatency)")
    print("audioSession.outputLatency: \(audioSession.outputLatency)")
    print("audioSession.ioBufferDuration: \(audioSession.ioBufferDuration)")
    print("engine.mainMixerNode.auAudioUnit.latency: \(engine.mainMixerNode.auAudioUnit.latency)")
    print("engine.inputNode.auAudioUnit.latency: \(engine.inputNode.auAudioUnit.latency)")
    print("engine.outputNode.auAudioUnit.latency: \(engine.outputNode.auAudioUnit.latency)")
    print("engine.mainMixerNode.outputPresentationLatency: \(engine.mainMixerNode.outputPresentationLatency)")
    print("engine.inputNode.outputPresentationLatency: \(engine.inputNode.outputPresentationLatency)")
    print("engine.outputNode.outputPresentationLatency: \(engine.outputNode.outputPresentationLatency)")
    print("playerNode.auAudioUnit.latency: \(playerNode.auAudioUnit.latency)")
}
```
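For context, here's a minimal sketch of the compensation approach I've been trying. It schedules the effect a fixed interval in the future via `AVAudioPlayerNode.scheduleBuffer(_:at:)`, then delays the dot by that interval plus `AVAudioSession.outputLatency` — assuming that property is accurate, which is exactly what I'm unsure about. (`showDot()` is a hypothetical UI call; `engine`, `playerNode`, and `buffer` are assumed to be set up and connected already.)

```swift
import AVFoundation

// Sketch: schedule the sound at a known future point on the player's
// timeline, then show the dot when the audio should actually be audible.
func playEffectAndShowDot(playerNode: AVAudioPlayerNode,
                          buffer: AVAudioPCMBuffer) {
    // Anchor the start time ~100 ms ahead so scheduling isn't racing the render thread.
    guard let renderTime = playerNode.lastRenderTime,
          let playerTime = playerNode.playerTime(forNodeTime: renderTime) else { return }
    let leadTime = 0.1
    let sampleRate = playerTime.sampleRate
    let startSample = playerTime.sampleTime + AVAudioFramePosition(leadTime * sampleRate)
    let startTime = AVAudioTime(sampleTime: startSample, atRate: sampleRate)

    playerNode.scheduleBuffer(buffer, at: startTime)

    // The samples reach the ear roughly outputLatency seconds after render,
    // so delay the visual by leadTime + outputLatency.
    let delayUntilAudible = leadTime + AVAudioSession.sharedInstance().outputLatency
    DispatchQueue.main.asyncAfter(deadline: .now() + delayUntilAudible) {
        // showDot() — hypothetical UI call
    }
}
```

On the built-in speaker this lines up reasonably well, but with AirPods the dot still appears noticeably before the sound, which is why I'm questioning the reported values.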

