Need help fixing the beat visual in a metronome app

Hi, I have referred to the HelloMetronome sample source code from Apple and customized the app to my requirements, but I need help fixing the beat visuals. For example, when playing at a low BPM (less than 100), the beat visual appears first and the sound follows; there is a delay between the beat visual and the sound from the app. I need the beat visual and the sound to trigger at the same time. I have attached my current code below; any help is appreciated.


// Schedule the next beats on the player, topping the queue up as buffers complete.
func scheduleBeats() {
    if !isOn { return }

    while beatsScheduled < beatsToScheduleAhead {
        // Schedule the beat.
        let secondsPerBeat = 60.0 / Float32(tempoBPM)
        let samplesPerBeat = Float(secondsPerBeat * Float(bufferSampleRate))
        let beatSampleTime = AVAudioFramePosition(nextBeatSampleTime)
        // This time is relative to the player's start time.
        let playerBeatTime = AVAudioTime(sampleTime: beatSampleTime, atRate: bufferSampleRate)

        // Capture the beat index now for the visual update later;
        // self.bufferNumber will have advanced by the time the dispatch fires.
        let currentVisualBeat = self.bufferNumber
        print("Scheduling beat \(currentVisualBeat)")

        player.scheduleBuffer(soundBuffer[self.bufferNumber]!, at: playerBeatTime, options: []) {
            self.syncQueue!.sync {
                self.beatsScheduled -= 1

                // Advance the beat, wrapping around at the time signature.
                if self.bufferNumber + 1 < self.timeSignature {
                    self.bufferNumber += 1
                } else {
                    self.bufferNumber = 0
                }

                // Once this buffer completes, schedule more beats.
                self.scheduleBeats()
            }
        }

        beatsScheduled += 1

        if !playerStarted {
            // Start the player on the first scheduled beat.
            player.play()
            playerStarted = true
        }

        // Animate the beat in the view controller: convert the player time to
        // node time, add the output latency, and dispatch the visual update then.
        let nodeBeatTime = player.nodeTime(forPlayerTime: playerBeatTime)!
        let output: AVAudioIONode = engine.outputNode
        let latencyHostTicks = AVAudioTime.hostTime(forSeconds: output.presentationLatency)
        let dispatchTime = DispatchTime(uptimeNanoseconds: nodeBeatTime.hostTime + latencyHostTicks)

        DispatchQueue.global(qos: .userInitiated).asyncAfter(deadline: dispatchTime) {
            if self.isOn {
                // Pass the captured beat, not self.bufferNumber, which may
                // already point at the next beat.
                self.delegate?.works(self, beat: currentVisualBeat)
            }
        }

        nextBeatSampleTime += Float64(samplesPerBeat)
    }
}

Replies

To synchronize audio output and display visuals, one needs to know the latency of each (or the difference between the two) to physical output (air-pressure deltas and photons emitted).
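On iOS the audio side of that latency can be queried at runtime. A minimal sketch, assuming an active AVAudioSession and the engine instance from the code above:

    import AVFoundation

    // The session's estimate of the driver + hardware output delay.
    let audioLatency = AVAudioSession.sharedInstance().outputLatency

    // The engine's output node reports a per-node figure; the question's
    // code already uses this as output.presentationLatency.
    let nodeLatency = engine.outputNode.presentationLatency

    print("Audio output latency: \(audioLatency * 1000) ms, node: \(nodeLatency * 1000) ms")

The display side has no equivalent API, which is why it usually has to be estimated, as described next.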


Note that the device's visual display updates are quantized in time to a frame rate, and the GPU graphics pipeline (via Core Graphics or Metal) has a certain number of frames of latency: 3 frames at 60 Hz would be 50 ms of added latency after any call to your UIView's draw(_:) callback. Add to that possibly up to 1 frame of jitter between the dispatchTime of your queue dispatch to the UI thread and the actual display time, again due to frame-rate quantization.
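A rough compensation sketch, assuming the 3-frame / ~50 ms figure above (an assumption; the real figure varies by device and rendering path) and reusing nodeBeatTime and latencyHostTicks from the code in the question:

    // Assumed display-pipeline latency: 3 frames at 60 Hz = 3/60 s ≈ 50 ms.
    let assumedDisplayLatency: TimeInterval = 3.0 / 60.0
    let displayLatencyTicks = AVAudioTime.hostTime(forSeconds: assumedDisplayLatency)

    // Fire the visual update earlier by the display latency so photons and
    // air pressure land together. Assumes the beat is far enough in the
    // future that the subtraction cannot underflow.
    let visualDeadline = DispatchTime(
        uptimeNanoseconds: nodeBeatTime.hostTime + latencyHostTicks - displayLatencyTicks)
    DispatchQueue.main.asyncAfter(deadline: visualDeadline) {
        // update the beat view here
    }

Even with this correction the dispatch still lands on an arbitrary point within a frame, so the jitter remains; the approach below avoids that.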


Rather than trying to dispatch at a time computed by the audio-scheduling logic, another common solution is to use a periodic display-frame callback, such as CADisplayLink, to monitor the time of the next upcoming audio tick, and to call setNeedsDisplay (or make the appropriate Metal draw calls) a sufficient number of frames in advance so that the change reaches the display together with the sound.
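A minimal sketch of that approach, assuming the host-time scheme from the code above; BeatVisualDriver, nextBeatHostTime, and onBeat are hypothetical names, not part of the sample:

    import UIKit
    import AVFoundation

    // A per-frame monitor: fires the visual callback on the display frame
    // whose presentation time covers the next scheduled audio beat.
    final class BeatVisualDriver {
        private var displayLink: CADisplayLink?
        // Host time (mach ticks) of the next audible beat, set by the scheduler.
        var nextBeatHostTime: UInt64 = 0
        var onBeat: (() -> Void)?

        func start() {
            let link = CADisplayLink(target: self, selector: #selector(tick(_:)))
            link.add(to: .main, forMode: .common)
            displayLink = link
        }

        func stop() {
            displayLink?.invalidate()
            displayLink = nil
        }

        @objc private func tick(_ link: CADisplayLink) {
            guard nextBeatHostTime != 0 else { return }
            // targetTimestamp is when the frame now being prepared will reach
            // the display; CADisplayLink shares the mach_absolute_time base,
            // so it can be compared against AVAudioTime host ticks directly.
            let frameTicks = AVAudioTime.hostTime(forSeconds: link.targetTimestamp)
            if frameTicks >= nextBeatHostTime {
                onBeat?()              // e.g. setNeedsDisplay on the beat view
                nextBeatHostTime = 0   // consumed; the scheduler sets the next one
            }
        }
    }

Each time the scheduler queues a beat it would set nextBeatHostTime to nodeBeatTime.hostTime + latencyHostTicks from the code above; the display link then draws the beat on the frame whose presentation time covers that moment, instead of relying on a one-shot dispatch that can land anywhere within a frame.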