Help Needed: How to Make iOS Timer More Stable?

I’m developing an iOS metronome app that uses DispatchSourceTimer for timing. The interval is very small, around 50 milliseconds, and I use CFAbsoluteTimeGetCurrent to measure elapsed time so that each beat plays within a ±3 ms margin.
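Roughly what my current setup looks like (simplified from the real app; the class and property names here are just for illustration):

```swift
import Foundation

// Simplified sketch of the timer-driven approach described above: a DispatchSourceTimer
// firing every ~50 ms, with CFAbsoluteTimeGetCurrent used to decide when the next beat is due.
final class TimerMetronome {
    private let queue = DispatchQueue(label: "metronome.timer", qos: .userInteractive)
    private var timer: DispatchSourceTimer?
    private var nextBeatTime: CFAbsoluteTime = 0
    private let beatInterval: CFTimeInterval = 0.5   // e.g. 120 BPM

    func start(playTick: @escaping () -> Void) {
        nextBeatTime = CFAbsoluteTimeGetCurrent() + beatInterval
        let t = DispatchSource.makeTimerSource(queue: queue)
        t.schedule(deadline: .now(), repeating: .milliseconds(50), leeway: .milliseconds(1))
        t.setEventHandler { [weak self] in
            guard let self = self else { return }
            // Play the tick once the scheduled beat time has (almost) arrived.
            if CFAbsoluteTimeGetCurrent() >= self.nextBeatTime - 0.003 {
                playTick()
                self.nextBeatTime += self.beatInterval
            }
        }
        t.resume()
        timer = t
    }

    func stop() {
        timer?.cancel()
        timer = nil
    }
}
```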

The problem is that once the app goes to the background, the timing becomes unstable—it slows down noticeably, then recovers after 1–2 seconds.

When coming back to the foreground, it suddenly speeds up, and again, it takes 1–2 seconds to return to normal. It feels like the app is randomly “powering off” and then “overclocking.” It’s super frustrating.

I’ve noticed that some metronome apps in the App Store have similar issues, but there’s one called “Professional Metronome” that’s rock solid with no such problems. What kind of magic are they using? Any experts out there who can help? Thanks in advance!

P.S. I’ve already enabled background audio (the audio entry in UIBackgroundModes).
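For reference, this is roughly what I mean by that: the "audio" background mode is declared in Info.plist, and the audio session is configured for playback before the metronome starts.

```swift
import AVFoundation

// Background audio setup: Info.plist contains UIBackgroundModes = ["audio"],
// and the session is configured for playback before any sound is produced.
func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}
```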

The professional metronome that has no such issues: https://apps.apple.com/cn/app/pro-metronome-%E4%B8%93%E4%B8%9A%E8%8A%82%E6%8B%8D%E5%99%A8/id477960671

When dealing with audio, where precise timing is vital, you need to lean in to the real-time support provided by the audio subsystem. So, rather than setting a general purpose timer and playing audio when it fires, you set up the audio subsystem to call you in a real-time context and then you render the tick if it’s the right time.

I’m not an expert in the audio subsystem, so I’ve retagged your question to see if it can attract the attention of such an expert.

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"

Hello @GOVO, thank you for your post. As Quinn pointed out, using a real-time audio thread is arguably a better alternative to timers for metronome apps.

There are different technologies you can use on iOS for this. AVAudioSourceNode is a high-level API that allows you to provide audio data to an AVAudioEngine. A lower-level option is to install a render callback on an AudioUnit using kAudioUnitProperty_SetRenderCallback.
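As a rough illustration (not a complete implementation; the class, buffer, and variable names here are just for the example), an AVAudioSourceNode can derive beat positions from the sample clock by counting frames in its render block and writing a precomputed click at each beat boundary:

```swift
import AVFoundation

// Sketch: an AVAudioSourceNode that renders a short click every beat by counting
// sample frames. The click samples are precomputed before rendering starts, and the
// render block only reads plain value types (no allocation, locks, or file I/O).
final class EngineMetronome {
    private let engine = AVAudioEngine()
    private var sourceNode: AVAudioSourceNode?

    func start(bpm: Double) throws {
        let sampleRate = engine.outputNode.outputFormat(forBus: 0).sampleRate
        let framesPerBeat = Int(sampleRate * 60.0 / bpm)

        // Precompute a ~5 ms click: a 1 kHz sine burst with a linear fade-out.
        let clickFrames = Int(sampleRate * 0.005)
        let click: [Float] = (0..<clickFrames).map { i in
            let t = Float(i) / Float(sampleRate)
            return sinf(2 * .pi * 1000 * t) * (1 - Float(i) / Float(clickFrames))
        }

        let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)!
        var frameCount = 0
        let node = AVAudioSourceNode(format: format) { _, _, frameCountArg, audioBufferList in
            let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
            for frame in 0..<Int(frameCountArg) {
                // Position within the current beat decides whether we are inside the click.
                let positionInBeat = frameCount % framesPerBeat
                let sample: Float = positionInBeat < clickFrames ? click[positionInBeat] : 0
                for buffer in buffers {
                    buffer.mData!.assumingMemoryBound(to: Float.self)[frame] = sample
                }
                frameCount += 1
            }
            return noErr
        }

        engine.attach(node)
        engine.connect(node, to: engine.mainMixerNode, format: format)
        try engine.start()
        sourceNode = node
    }
}
```

Because the beat positions are derived from the audio sample clock rather than a general-purpose timer, the spacing between clicks stays exact across foreground and background transitions, as long as the session keeps running.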

Please note that when running an engine or audio unit in real time, processing must occur in a real-time safe context to ensure glitch-free performance. Don't allocate memory, perform file I/O, take locks, or interact with the Swift or Objective-C runtimes inside the render callback.
