I am using AVAudioEngine to analyse some audio tracks, and quite often the `while engine.manualRenderingSampleTime < sourceFile.length {` loop takes a moment before it starts receiving audio data. Looking at the resulting waveform, the input appears to be simply delayed. This wouldn't be a problem if the length of the output varied to account for that latency, but the length unfortunately always stays the same, so the final part of the track is lost. I took the code from the official tutorial, and this happens regardless of whether I add an EQ effect or not. In fact, two analyses (with and without EQ) run one after the other return the same anomaly.

let format: AVAudioFormat = sourceFile.processingFormat
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
if compress {
    let eq = AVAudioUnitEQ(numberOfBands: 2)
    engine.attach(eq)

    let lowPass = eq.bands[0]
    lowPass.filterType = .lowPass
    lowPass.frequency = 150.0
    lowPass.bypass = false

    let highPass = eq.bands[1]
    highPass.filterType = .highPass
    highPass.frequency = 100.0
    highPass.bypass = false

    engine.connect(player, to: eq, format: format)
    engine.connect(eq, to: engine.mainMixerNode, format: format)
} else {
    engine.connect(player, to: engine.mainMixerNode, format: format)
}
do {
    let maxNumberOfFrames: AVAudioFrameCount = 4096
    try engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: maxNumberOfFrames)
} catch {
    fatalError("Could not enable manual rendering mode, \(error)")
}
player.scheduleFile(sourceFile, at: nil)
do {
    try engine.start()
    player.play()
} catch {
    fatalError("Could not start engine, \(error)")
}
// buffer to which the engine will render the processed data
let buffer: AVAudioPCMBuffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat, frameCapacity: engine.manualRenderingMaximumFrameCount)!
// index of the next point to write into `points`
var pi = 0
while engine.manualRenderingSampleTime < sourceFile.length {
    do {
        let framesToRender = min(buffer.frameCapacity, AVAudioFrameCount(sourceFile.length - engine.manualRenderingSampleTime))
        let status = try engine.renderOffline(framesToRender, to: buffer)
        switch status {
        case .success:
            // data rendered successfully
            let flength = Int(buffer.frameLength)
            points.reserveCapacity(pi + flength)
            if let chans = buffer.floatChannelData, buffer.format.channelCount >= 2 {
                // floatChannelData points to one sample buffer per channel:
                // chans[0] is the left channel, chans[1] the right.
                // (Note: `.pointee` followed by `advanced(by: 1)` would point
                // at the second sample of the left channel, not at the right
                // channel.)
                let left = chans[0]
                let right = chans[1]
                for b in 0..<flength {
                    let v: Float = max(abs(left[b]), abs(right[b]))
                    points.append(v)
                }
            }
            pi += flength
        case .insufficientDataFromInputNode:
            // applicable only when using the input node as one of the sources
            break
        case .cannotDoInCurrentContext:
            // the engine could not render in this call, retry in the next iteration
            break
        case .error:
            // an error occurred while rendering
            fatalError("render failed")
        @unknown default:
            break
        }
    } catch {
        fatalError("render failed, \(error)")
    }
}
player.stop()
engine.stop()

I thought it was perhaps a simulator issue, but it also happens on the device. Am I doing anything wrong? Thanks!
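To make the delay concrete, here is a small sketch of how the leading silence can be quantified from the rendered peak values. The helper name, the threshold, and the default sample rate are my own assumptions (in practice you would pass `format.sampleRate`); `points` is the per-frame peak array built in the loop above:

```swift
// Count how many leading samples stay below a small amplitude
// threshold, and convert that count to seconds.
// `sampleRate` defaulting to 44_100 is an assumption; use the
// track's actual `format.sampleRate` instead.
func leadingSilence(in points: [Float],
                    threshold: Float = 1e-4,
                    sampleRate: Double = 44_100) -> (frames: Int, seconds: Double) {
    // Take samples from the front while they are effectively silent.
    let frames = points.prefix(while: { abs($0) < threshold }).count
    return (frames, Double(frames) / sampleRate)
}
```

If this reports, say, 0.02 s of leading silence while the rendered output is still exactly `sourceFile.length` frames long, then the tail has been truncated by the same amount.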