Are AVFoundation and AUv3 mutually exclusive tech stacks?

Given that many AVFoundation audio components are built on AudioUnits, it occurred to me that it might be "nice" to leverage some of these components as part of an AUv3 implementation and let them do some of the heavy lifting, i.e. populating buffers and so on.

A good example is perhaps (my obsession) AVAudioUnitSampler, all the more tempting given that the internalRenderBlock for the underlying AUAudioUnit was exposed not so long ago (iOS 11, maybe?).

Two literal questions really I guess...

1) Is trying to use AVFoundation "stuff" as part of an AUv3 implementation like this a "bad idea"?

2) Alternatively, if this could work "in theory", the thing that's totally unclear to me is how to properly integrate AVAudioEngine, given that (please correct me if I'm wrong) AVAudioUnitSampler needs a running AVAudioEngine to do its thing. During some hack tests I was able to get a sampler-based AU instantiated and making noise, but couldn't seem to route its output correctly to the host. There were also errors around setting maximumFramesToRender that made me suspect the answer to question 1 is "yes" (the usual host-side ordering is sketched below the questions).

And finally, most likely a rhetorical question...

3) Is it just me, or do the audio side of AVFoundation and the current "push" towards AUv3 seem at odds?
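
Re the maximumFramesToRender errors in question 2: for reference, here's a minimal sketch of the usual host-side ordering for any AUAudioUnit (the frame count is just an example value). The property can only be changed while render resources are not allocated, which is my guess at why setting it on the sampler's unit complained.

import AudioToolbox

// Minimal sketch: maximumFramesToRender must be set before
// allocateRenderResources(); changing it afterwards raises an error.
func prepareForRendering(_ unit: AUAudioUnit) throws {
    unit.maximumFramesToRender = 4096            // example value only
    try unit.allocateRenderResources()
    let render = unit.renderBlock                // what the host calls each render cycle
    _ = render
}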


Quick update on what I tried as a super simple "proof of concept".

I create an AVAudioUnitSampler and return its auAudioUnit property as my AUAudioUnit from the AudioUnitViewController's createAudioUnit() func.

I initially tried to do this in the generated template code, but didn't get anywhere with that. So just to be clear: I'm not using the template's kernel and DSP stuff at all - I'm hoping the AUAudioUnit inside AVAudioUnitSampler already knows how to do that.
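
Roughly what that looks like (just a sketch of what I described above; it assumes the stock AUv3 extension template, and that keeping a strong reference to the sampler keeps its AUAudioUnit alive):

import CoreAudioKit
import AVFoundation

public class AudioUnitViewController: AUViewController, AUAudioUnitFactory {
    // Keep the sampler alive; its auAudioUnit is what the host gets.
    private var sampler: AVAudioUnitSampler?

    public func createAudioUnit(with componentDescription: AudioComponentDescription) throws -> AUAudioUnit {
        // Instead of the template's kernel-backed AUAudioUnit subclass, hand the
        // host the AUAudioUnit that AVAudioUnitSampler wraps. componentDescription
        // is ignored in this sketch.
        let sampler = AVAudioUnitSampler()
        self.sampler = sampler
        return sampler.auAudioUnit
    }
}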

Anyway, this seems to instantiate OK in AUM without error. However...

As I mentioned above, the internal AUAudioUnit doesn't seem to be in a usable state and throws this exception when you try calling any of the AVAudioUnitSampler funcs:

[AVAudioNodeImpl.h:253:AVAECheckNodeHasEngine: (engine != nil)]

And I guess that's the point where it all starts to feel a bit "wrong". So basically, I'd love a simple yes / no answer:

Is it possible to expose the underlying AUAudioUnits in an AVFoundation-based app as AUv3s? Yes or no.

If the answer is yes, how / where / when do I set up the AVAudioEngine shared instance?
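
For what it's worth, the only fix I can think of for that AVAECheckNodeHasEngine assertion is to host a private AVAudioEngine inside the extension and attach the sampler to it before handing out its auAudioUnit. This is an untested sketch, and whether doing this inside an AUv3 extension is sane is exactly what I'm asking:

import AVFoundation

// Untested sketch: give the sampler a private engine inside the extension so
// AVAudioUnitSampler calls (loadInstrument(at:) etc.) stop tripping the
// AVAECheckNodeHasEngine assertion. The engine reference has to be kept alive.
final class SamplerHost {
    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()

    init() {
        engine.attach(sampler)
        // Attaching/connecting gives the node an engine context; whether the
        // engine also needs to be started (and when) is the open question.
        engine.connect(sampler, to: engine.mainMixerNode, format: nil)
    }
}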


Just to add some more missing detail. I think I know my way around AVFoundation OK. I have several apps based on the AVFoundation audio players, samplers, mixers, effects and so on. They seem to be working just fine. They even support IAA. Ahem...

So what I'd basically like to do is expose some of the nodes in my "graph" as AUv3 audio units. Some of the hooks seem to be there, as noted above. Normally you'd stop and start your AVAudioEngine instance and tear down and rebuild your graph, for example on an IAA connection.

I'm trying to do something similar, but as an AUv3 Audio Unit. Anyone... anyone?

It's all well and good saying "IAA is deprecated - use AUv3" - but how do I make the switch in the case of my apps, when they're all built on AVFoundation?
Accepted Answer (music4sport)
I think I've got this working now by hooking into the template / generated code. I'm not happy with how / where the AVAudioEngine gets started, and there's the frame-setting issue which I'm hoping I can "safely" ignore...

Anyhoo - it's kinda working, so I'm unblocked. For now :-)