Spatial Audio Head Tracking With AVAudioEngine

I want to create a sort of soundscape in surround sound. Imagine something along the lines of: the user can place the sound of a waterfall to their front right, the sound of frogs croaking to their left, and so on.

I have an AVAudioEngine playing a number of AVAudioPlayerNodes, and I'm using an AVAudioEnvironmentNode to simulate their positioning. The positioning seems to work correctly. However, I'd like this to work with head tracking, so that if the user moves their head, the sounds from the players move accordingly.
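For context, the setup described above looks roughly like the following sketch. The file names and positions are made up for illustration; the key points are that each player node connects to the environment node with a mono-capable format (spatialization applies to mono sources) and gets a 3D position via the AVAudio3DMixing protocol.

```swift
import AVFoundation

// Sketch: an engine graph with an environment node spatializing player nodes.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
engine.attach(environment)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// Hypothetical helper: attach a looping source and place it in 3D space.
func addSource(named name: String, at position: AVAudio3DPoint) throws -> AVAudioPlayerNode {
    let player = AVAudioPlayerNode()
    engine.attach(player)
    let url = Bundle.main.url(forResource: name, withExtension: "caf")! // assumed asset
    let file = try AVAudioFile(forReading: url)
    engine.connect(player, to: environment, format: file.processingFormat)
    player.position = position // AVAudio3DMixing: x = right, y = up, -z = in front
    player.scheduleFile(file, at: nil)
    return player
}

// Waterfall to the front right, frogs to the left.
let waterfall = try addSource(named: "waterfall", at: AVAudio3DPoint(x: 2, y: 0, z: -2))
let frogs = try addSource(named: "frogs", at: AVAudio3DPoint(x: -3, y: 0, z: 0))

try engine.start()
waterfall.play()
frogs.play()
```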

I can't figure out how to do it or find any docs on the subject. Is it possible to make AVAudioEngine output surround sound, and if so, would head tracking just work automagically, the same way it does when playing surround sound content with AVPlayerItem? If not, is the only way to achieve this effect to use CMHeadphoneMotionManager and manually move the AVAudioEnvironmentNode listener around?

For that kind of soundscape, I think your best choice might be PHASE:

https://developer.apple.com/documentation/phase
https://developer.apple.com/videos/play/wwdc2021/10079/

For head tracking, CMHeadphoneMotionManager is the way to go.
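If you stay with AVAudioEngine rather than PHASE, the manual approach you guessed at is essentially it: feed the headphone attitude into the environment node's listener orientation. A minimal sketch, assuming `environment` is the AVAudioEnvironmentNode already in your graph (note that CMAttitude reports radians while `listenerAngularOrientation` expects degrees, and you may need to flip a sign or two depending on your coordinate conventions):

```swift
import AVFoundation
import CoreMotion

let motionManager = CMHeadphoneMotionManager()

// Hedged sketch: update the listener orientation from AirPods motion data.
func startHeadTracking(for environment: AVAudioEnvironmentNode) {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        let toDegrees = Float(180.0 / Double.pi) // CMAttitude is in radians
        environment.listenerAngularOrientation = AVAudio3DAngularOrientation(
            yaw: Float(attitude.yaw) * toDegrees,
            pitch: Float(attitude.pitch) * toDegrees,
            roll: Float(attitude.roll) * toDegrees)
    }
}
```

Remember that CMHeadphoneMotionManager requires the `NSMotionUsageDescription` key in your Info.plist, and updates only arrive while compatible headphones (e.g. AirPods Pro) are connected.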
