Hi, I am developing an application in Objective-C in which, at a given moment, I have to play samples (each about 1 second long) when the user hits a pad on their instrument (which has several pads).
I receive the notes through a MIDI interface (via a CoreMIDI wrapper available on GitHub), and to play the samples I do the following:
Beforehand, I instantiate an AVAudioEngine and attach 16 AVAudioPlayerNode instances to it, each connected to its mixer node (AVAudioMixerNode).
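For reference, the setup is roughly as follows (this is only a simplified sketch, not my exact code: the array type, error handling, and the assumption that PCMbuffer_note is an already-loaded AVAudioPCMBuffer are illustrative):

#import <AVFoundation/AVFoundation.h>

// Simplified sketch: one engine, 16 player nodes attached and
// connected to the main mixer, all started up front.
static const int kNumPlayerNodes = 16;

AVAudioEngine *engine = [[AVAudioEngine alloc] init];
NSMutableArray<AVAudioPlayerNode *> *aAVAudioPlayerNode =
    [NSMutableArray arrayWithCapacity:kNumPlayerNodes];

for (int i = 0; i < kNumPlayerNodes; i++) {
    AVAudioPlayerNode *node = [[AVAudioPlayerNode alloc] init];
    [engine attachNode:node];
    // Connect each player node to the engine's main mixer using the
    // format of the sample buffer (PCMbuffer_note).
    [engine connect:node to:engine.mainMixerNode format:PCMbuffer_note.format];
    [aAVAudioPlayerNode addObject:node];
}

NSError *error = nil;
if (![engine startAndReturnError:&error]) {
    NSLog(@"Could not start AVAudioEngine: %@", error);
}

// Each node is started once; buffers scheduled later play as soon as they are queued.
for (AVAudioPlayerNode *node in aAVAudioPlayerNode) {
    [node play];
}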
I keep an integer (iCurrentNode, ranging from 0 to 15) that points to the current AVAudioPlayerNode. When the user hits a pad I do:
// Schedule the sample on the current node for immediate playback.
[aAVAudioPlayerNode[iCurrentNode] scheduleBuffer:PCMbuffer_note atTime:nil options:0 completionHandler:nil];
// Advance round-robin to the next of the 16 player nodes.
iCurrentNode++;
if (iCurrentNode == 16) iCurrentNode = 0;
The reason for the 16 player nodes is that scheduleBuffer does not overlap sounds on a single node: if the user hits a pad and hits it again before the first sample has finished playing, the second buffer is queued after the first instead of overlapping it.
This implementation gives me too much latency on an iPad Pro 10.5" (about half a second). I think I have taken the wrong approach to the problem...
Could somebody help me? Thanks.