Audio Unit Event API not available on iOS?

The Audio Unit Event API (AudioToolbox/AudioUnitUtilities.h) seems to be macOS only.

Does iOS provide a mechanism for AUv3 hosts to listen to an Audio Unit's parameter changes, aside from polling?

Replies

Yes, it does; check out the observer methods here: https://developer.apple.com/documentation/audiotoolbox/auparameternode
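
For example, a minimal sketch from the host side, assuming an already-instantiated AUAudioUnit (the function and its names are illustrative):

    import AudioToolbox

    // Observe every parameter of an AUv3 unit without polling. Keep the
    // returned token around so the observer can be removed later via
    // unit.parameterTree?.removeParameterObserver(token).
    func addParameterObserver(to unit: AUAudioUnit) -> AUParameterObserverToken? {
        return unit.parameterTree?.token(byAddingParameterObserver: { address, value in
            // Note: delivered on a non-main, non-realtime thread.
            print("parameter \(address) changed to \(value)")
        })
    }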

Ah OK, thanks very much! Do you know if there's a way of doing this in C++? I've got an AudioUnit object, not an AUAudioUnit.

Good question. AFAICT AUParameterTree is an Obj-C API; I don't think there's a corresponding C API.

Sadly it doesn't seem possible to extract a C++ AudioUnit pointer from an AUAudioUnit, so I'd have to rewrite my otherwise fully functional host in a different language just to listen to parameter changes.
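
(For what it's worth, one workaround short of a full rewrite might be a thin shim that forwards the tree's observations to a plain C callback supplied by the C++ host. A hedged sketch in Swift - every name here is hypothetical:)

    import AudioToolbox

    // C-compatible callback type that a C/C++ host can implement.
    typealias ParamChangeCallback =
        @convention(c) (AUParameterAddress, AUValue, UnsafeMutableRawPointer?) -> Void

    // Bridges AUParameterTree observations out to a C function pointer.
    final class ParameterBridge {
        private let tree: AUParameterTree
        private var token: AUParameterObserverToken?

        init?(unit: AUAudioUnit,
              callback: @escaping ParamChangeCallback,
              context: UnsafeMutableRawPointer?) {
            guard let tree = unit.parameterTree else { return nil }
            self.tree = tree
            token = tree.token(byAddingParameterObserver: { address, value in
                callback(address, value, context)  // non-realtime thread
            })
        }

        deinit {
            if let token = token { tree.removeParameterObserver(token) }
        }
    }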


My (probably selfish) preference would be that time had been spent on documenting the C++ interface rather than adding yet more undocumented APIs in two different languages, both unsuited to audio applications and possibly unfamiliar to the target DSP programmer audience. But what do I know!


Thanks once again for your help.

Didn't they say, in this year's WWDC Core Audio What's New session, not to use Objective-C (or Swift) in the audio context?


How, then, to get anything into or out of AUParameters on the audio side? Use ANSI C struct-access-through-a-pointer to reach inside the memory of an Objective-C object? Or is there some other way?
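
(For what it's worth, the hooks the framework appears to provide for exactly this are AUParameterNode's implementorValueObserver and implementorValueProvider blocks: the Obj-C side shuttles values into and out of plain memory owned by the DSP code, and scheduled changes reach the render block as plain C AUParameterEvent structs in its event list. A minimal sketch, with the parameter and its storage purely illustrative:)

    import AudioToolbox

    // Plugin-side setup (non-realtime): build a one-parameter tree.
    let cutoffParam = AUParameterTree.createParameter(
        withIdentifier: "cutoff", name: "Cutoff", address: 0,
        min: 20, max: 20_000, unit: .hertz, unitName: nil,
        flags: [], valueStrings: nil, dependentParameters: nil)
    let parameterTree = AUParameterTree.createTree(withChildren: [cutoffParam])

    // Plain memory the render code reads directly; stands in for a field
    // inside a C/C++ DSP kernel.
    let cutoffStorage = UnsafeMutablePointer<AUValue>.allocate(capacity: 1)
    cutoffStorage.pointee = 440

    // The non-realtime side pushes new values into that memory...
    parameterTree.implementorValueObserver = { _, value in
        cutoffStorage.pointee = value  // production code would use an atomic store
    }
    // ...and reads them back out the same way, so the render thread never
    // has to touch an Obj-C object.
    parameterTree.implementorValueProvider = { _ in
        cutoffStorage.pointee
    }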

I haven't seen that session, and I probably should - it might help me understand the direction the API is taking, because right now that's unclear to me.


When looking for AUv3 documentation I couldn't help but observe that the AUv3 host/plugin sample code (which basically IS the bulk of the documentation) is only available in Swift.


I can only guess that the principle is that all the 'offline' setup is performed via a realtime-unsafe language, and then the rest is done with something fit for purpose (C/C++/ASM/intrinsics). If there are still cases where Obj-C/Swift can pollute the audio context, then this hasn't been thought through.


I appreciate that the C/C++ interface has never been particularly pretty (or type-safe), but it does work, and compared to the effort required to build a production-quality Core Audio/DSP implementation, some 'ugly' Get/SetProperty() calls are neither here nor there. We're DSP programmers - we can take it!
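
To illustrate, one of those 'ugly but working' calls through the v2 C interface, shown here from Swift with the AudioUnit handle assumed valid:

    import AudioToolbox

    // Read kAudioUnitProperty_MaximumFramesPerSlice the old-fashioned way.
    func maximumFramesPerSlice(of audioUnit: AudioUnit) -> UInt32 {
        var maxFrames: UInt32 = 0
        var size = UInt32(MemoryLayout<UInt32>.size)
        let status = AudioUnitGetProperty(audioUnit,
                                          kAudioUnitProperty_MaximumFramesPerSlice,
                                          kAudioUnitScope_Global,
                                          0,  // element
                                          &maxFrames,
                                          &size)
        assert(status == noErr)
        return maxFrames
    }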


What we can't take is days lost simply because a significant proportion of the API is documented as 'No overview available.'