I have sort of asked this before, but I've hit another wall this time around. Here goes!
While there is plenty of information on how to use a circular buffer with the RemoteIO Audio Unit via its callbacks, I don't see any obvious way to do the same with an AUAudioUnit subclass. Has anyone succeeded at this?
I see that AUAudioUnit exposes an inputHandler and an outputProvider. But I've had no luck simply implementing them so that they copy the data into a circular buffer.
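To be concrete, the buffering part by itself is what I mean here: a minimal single-reader/single-writer ring buffer (plain C sketch with names of my own; in a real AU a lock-free, memory-barrier-correct implementation such as TPCircularBuffer is the usual recommendation, since the render thread is real-time):

```c
#include <stdint.h>

/* Minimal single-reader/single-writer ring buffer for float samples.
 * Everything here is illustrative, not any real Core Audio API. */
#define RB_CAPACITY 4096u  /* must be a power of two for the masking below */

typedef struct {
    float    data[RB_CAPACITY];
    uint32_t head;  /* total samples written */
    uint32_t tail;  /* total samples read */
} RingBuffer;

static uint32_t rb_available(const RingBuffer *rb) {
    return rb->head - rb->tail;  /* unsigned wraparound keeps this correct */
}

/* Writer side: e.g. called from the input handler with incoming samples. */
static void rb_write(RingBuffer *rb, const float *src, uint32_t n) {
    for (uint32_t i = 0; i < n; i++)
        rb->data[(rb->head + i) & (RB_CAPACITY - 1)] = src[i];
    rb->head += n;
}

/* Reader side: e.g. called from the render block once enough samples exist. */
static void rb_read(RingBuffer *rb, float *dst, uint32_t n) {
    for (uint32_t i = 0; i < n; i++)
        dst[i] = rb->data[(rb->tail + i) & (RB_CAPACITY - 1)];
    rb->tail += n;
}
```

That part I understand; what I can't work out is where to hook it up in the AUAudioUnit lifecycle.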
My intent is to accumulate a minimum number of samples/frames in the input buffer before the render block runs, so that I can perform an FFT on a reasonable amount of sample data, and so that overlap-add is possible.
I find myself restricted to the render block, which (apparently; correct me if I'm wrong) insists that I render exactly the number of samples requested per call.
What I'd like to avoid is offsetting the output, i.e. 1) saving the previous frame (starting from zeros), 2) pulling input, and 3) outputting a frame that lies between the previous and current frames via overlap-add. I'd much rather introduce some latency in the plugin instead, by waiting to render until enough data has arrived. Currently I can only see a way to do the former. The latter is trickier, and I find myself somewhat restricted by the API and missing documentation and/or code examples for doing this.
The goal is to accumulate more samples than the render block asks for before rendering, then perform window/FFT/filter/IFFT and overlap-add.
Any advice is much appreciated. I find the docs far too brief these days; they don't explain things in the detail they used to, especially the Audio Unit sections. I'm not a fan of the new format Apple has adopted for them.