Swift, AVFoundation and IAA

Hi Folks,


I hope this cross-post from an old Core Audio thread is OK; I think it deserves a new thread, as it's more of an AVFoundation question.


I'm hoping someone will be able to point me in the right direction, as I'm really struggling to find any working examples of supporting Inter-App Audio (IAA) in a Swift app.


With a bit of digging and looking at Obj-C examples, I've managed to get my generator AU published OK, so it's being recognized by a host app (AUM). The host can't launch my app, but it seems to connect to it if I launch it manually. However, the audio from my app isn't showing up in the host's mixer channel, although it's still audible via the shared audio session. I get the route change notification when the host app launches, and I guess I need to do *something* in there, but I'm not sure what... :-)
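For reference, the publish step looks something like the sketch below. It's only a rough outline: the subtype/manufacturer codes and the unit name are placeholders that would need to match the AudioComponents entry in Info.plist, and I'm assuming the engine's outputNode audio unit is the one to publish.

import AVFoundation
import AudioToolbox

// Hypothetical helper: build a FourCharCode from a 4-character string.
func fourCC(_ s: String) -> FourCharCode {
    return s.utf8.reduce(0) { ($0 << 8) | FourCharCode($1) }
}

func publishForIAA(engine: AVAudioEngine) {
    // The RemoteIO unit behind the engine's output node.
    guard let outputUnit = engine.outputNode.audioUnit else { return }

    // Placeholder codes; these must match the AudioComponents entry in Info.plist.
    var desc = AudioComponentDescription(componentType: kAudioUnitType_RemoteGenerator,
                                         componentSubType: fourCC("iasw"),
                                         componentManufacturer: fourCC("demo"),
                                         componentFlags: 0,
                                         componentFlags_Mask: 0)

    let status = AudioOutputUnitPublish(&desc, "My Generator" as CFString, 1, outputUnit)
    if status != noErr {
        print("AudioOutputUnitPublish failed: \(status)")
    }
}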


I create my audio graph using AVFoundation nodes and all that seems to be working fine. However, I can't figure out whether I can apply the same approach I've seen in the Core Audio IAA graph examples.
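For context, the graph itself is nothing exotic; it's roughly this shape (the player node here is just a stand-in for my actual source nodes):

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()   // stand-in for my actual source nodes

func startGraph() throws {
    engine.attach(player)
    // mainMixerNode is implicitly wired to outputNode (the RemoteIO unit).
    engine.connect(player, to: engine.mainMixerNode, format: nil)
    try engine.start()
    player.play()
}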


I also set up my audio session using the shared instance, playback category and mix with others. This also seems to be working fine.
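Concretely, the session setup is just the usual few lines (Swift 3-era API), something like:

import AVFoundation

func configureSession() throws {
    let session = AVAudioSession.sharedInstance()
    // Playback category, mixing with the host app's audio.
    try session.setCategory(AVAudioSessionCategoryPlayback, with: [.mixWithOthers])
    try session.setActive(true)
}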


If anyone can point me in the right direction, or even share a snippet of a working example, I would be very, very grateful. I've been on this for a couple of days now and I'm close to giving up / banging my head on the desk until I don't know what IAA stands for anymore :-)


Thanks, -Rob


PS. Here's the Core Audio thread; so far, I'm the only responder :-/ https://forums.developer.apple.com/message/132839#132839

Still banging my head... I'm guessing this just isn't possible. I keep trying, but getting nowhere.


On a positive note, if you Google "IAA Swift example", this very question is the top (and only relevant) hit.

I guess that means I should give up.

I am in the same situation, specifically: "I'm really struggling to find any working examples of supporting IAA in a Swift app." It's my first iOS app, and I'm using Swift 3. My app downloads audio files (rhythmic backing tracks) and plays them, and I want to act as an IAA host so that the audio can be more useful. I've really taken to Swift but have never been happy with Objective-C; I've followed the Objective-C tutorial, but I got stuck in the same place and my results are equivalent to yours. I must admit I was hoping that, since Apple supports Swift 3 and supports IAA, they'd support IAA with Swift 3, not just IAA with Objective-C. I am considering opening a TSI (Technical Support Incident) specifically for IAA with Swift 3. Is that a good idea?

All the best, Matt

Random thoughts: Inter-App Audio (IAA) currently seems to require an AUGraph. In the 2017 WWDC "What's New in Core Audio" session, Apple said they will be deprecating the AUGraph API. In that same session, Apple said not to use Swift or Objective-C code inside the audio context (meaning Audio Unit render callbacks?). This suggests that only C code (bounded-latency code that doesn't block and does no memory allocation/management, etc.) should be used in an AUGraph render callback, not Swift 3.


Perhaps the solution is to publish an AVAudioUnit subclass instead of trying to use IAA? Will that work?

Whoa... I totally missed this... if AUGraph is being deprecated, then does that mean the whole AVFoundation "nodes" architecture is going down too?

I got IAA working, by the way, by listening for the IAA connection property change and publishing my main mixer's output audio unit. Great that it's now deprecated.
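For anyone who lands here from that Google search, here's a rough sketch of the shape of it. I'm assuming the unit being published and queried is the engine's outputNode audio unit (the RemoteIO that the main mixer feeds into); the names are placeholders rather than a drop-in implementation.

import AVFoundation
import AudioToolbox

// Read the IAA connection state back from the published output unit.
func isHostConnected(_ unit: AudioUnit) -> Bool {
    var connected: UInt32 = 0
    var size = UInt32(MemoryLayout<UInt32>.size)
    AudioUnitGetProperty(unit, kAudioUnitProperty_IsInterAppConnected,
                         kAudioUnitScope_Global, 0, &connected, &size)
    return connected != 0
}

func observeIAAConnection(engine: AVAudioEngine) {
    guard let outputUnit = engine.outputNode.audioUnit else { return }

    // C-style callback (no captures); fires when a host connects or disconnects.
    AudioUnitAddPropertyListener(outputUnit, kAudioUnitProperty_IsInterAppConnected,
                                 { _, unit, _, _, _ in
                                     if isHostConnected(unit) {
                                         // Host connected: make sure the engine is running
                                         // so audio reaches the host's mixer channel.
                                     } else {
                                         // Host disconnected: stop or reconfigure as needed.
                                     }
                                 }, nil)
}

The publish call itself (AudioOutputUnitPublish, as in the earlier post) still has to happen at launch; the listener is just how I find out when the host actually hooks up.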