I am trying to use AVAudioEnvironmentNode with ARKit.
I am not using SceneKit (ARSCNView), just ARKit with a regular UIKit view.
The problem is that when I set AVAudioEnvironmentNode.listenerPosition and listenerVectorOrientation to match the ARFrame camera's position, forward, and up vectors in the ARSessionDelegate callback session(_ session: ARSession, didUpdate frame: ARFrame), the AVAudioEnvironmentNode does not seem to hold its position.
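For reference, this is roughly how I derive the listener vectors from the camera transform. This is a standalone sketch using plain structs in place of simd_float4x4 and AVAudio3DPoint so the math is visible; ARCamera.transform is column-major, with the camera looking down -Z:

```swift
// Plain stand-ins for simd_float4x4 / AVAudio3DPoint so this runs standalone.
struct Vec3 { var x, y, z: Float }
struct Mat4 { var columns: [[Float]] }  // column-major, like ARCamera.transform

// Pull out what AVAudioEnvironmentNode needs from a camera transform:
// position is the translation column, forward is the negated Z basis column
// (the camera looks down -Z), and up is the Y basis column.
func listenerVectors(from t: Mat4) -> (position: Vec3, forward: Vec3, up: Vec3) {
    let c = t.columns
    return (position: Vec3(x: c[3][0], y: c[3][1], z: c[3][2]),
            forward:  Vec3(x: -c[2][0], y: -c[2][1], z: -c[2][2]),
            up:       Vec3(x: c[1][0], y: c[1][1], z: c[1][2]))
}

// Identity rotation, translated to (1, 2, 3):
let t = Mat4(columns: [[1, 0, 0, 0],
                       [0, 1, 0, 0],
                       [0, 0, 1, 0],
                       [1, 2, 3, 1]])
let v = listenerVectors(from: t)
print(v.position.x, v.position.y, v.position.z)  // 1.0 2.0 3.0
print(v.forward.z)                               // -1.0
```

In the real delegate I assign these to listenerPosition and make an AVAudio3DVectorOrientation from the forward/up pair.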
Audibly, it sounds like the listener is jumping between the position I set and the origin (0, 0, 0), with the left and right channels cutting out sporadically.
If I print out listenerVectorOrientation and listenerPosition right after setting them, the orientation vectors match the camera's pretty closely (good), but listenerPosition never moves from (0, 0, 0) (bad).
Does anyone know what is going on here? How do I set AVAudioEnvironmentNode.listenerPosition so that it sticks, or otherwise stop this problem?
I did some more testing, and I think there may be a bug in AVAudioEnvironmentNode.listenerVectorOrientation.
If I set listenerVectorOrientation, the audio positioning goes haywire. But if I convert the vectors to Euler angles and use listenerAngularOrientation instead, everything works more or less correctly.
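The vector-to-Euler conversion I am using looks roughly like this. It is a sketch with plain tuples rather than AVAudio3DAngularOrientation; the sign conventions (forward = -Z at zero yaw, like the ARKit camera) are my assumption and may need flipping:

```swift
import Foundation

// Sketch of the vector -> Euler conversion for listenerAngularOrientation.
// AVAudio3DAngularOrientation takes degrees: yaw about Y, pitch about X,
// roll about Z. The conventions below assume the listener looks down -Z
// at zero yaw (as an ARKit camera does); flip signs if your setup differs.
func angularOrientation(forward f: (x: Double, y: Double, z: Double),
                        up u: (x: Double, y: Double, z: Double))
    -> (yaw: Double, pitch: Double, roll: Double) {
    let toDeg = 180.0 / Double.pi
    // right = forward x up, used to recover roll.
    let r = (x: f.y * u.z - f.z * u.y,
             y: f.z * u.x - f.x * u.z,
             z: f.x * u.y - f.y * u.x)
    let yaw   = atan2(-f.x, -f.z) * toDeg              // heading around Y
    let pitch = asin(max(-1.0, min(1.0, f.y))) * toDeg // looking up/down
    let roll  = atan2(r.y, u.y) * toDeg                // tilt around forward
    return (yaw, pitch, roll)
}

// Facing -X (a quarter turn) with no tilt: yaw 90, pitch 0, roll 0.
let a = angularOrientation(forward: (-1, 0, 0), up: (0, 1, 0))
print(Int(a.yaw.rounded()), Int(a.pitch.rounded()), Int(a.roll.rounded()))  // 90 0 0
```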
Similarly, is AVAudioEnvironmentNode fixed to a portrait orientation or something? The camera's yaw, pitch, and roll do not correspond to AVAudioEnvironmentNode's.
Currently I am having to map:
cameraPitch => audioYaw
cameraRoll => audioPitch
cameraYaw => audioRoll
and that really doesn't seem right.
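For what it's worth, the band-aid remapping is trivial (hypothetical helper name; the remapping itself is the thing that seems wrong). One detail worth double-checking: ARCamera.eulerAngles is in radians, while AVAudio3DAngularOrientation expects degrees, so a conversion is needed somewhere along the way:

```swift
// Hypothetical helper for the workaround above: feed the camera's
// (pitch, roll, yaw) into the audio (yaw, pitch, roll) slots.
// Angles here are degrees; ARCamera.eulerAngles is radians, so convert first.
func remapCameraAngles(pitch: Float, roll: Float, yaw: Float)
    -> (yaw: Float, pitch: Float, roll: Float) {
    return (yaw: pitch, pitch: roll, roll: yaw)
}

let o = remapCameraAngles(pitch: 10, roll: 20, yaw: 30)
print(o.yaw, o.pitch, o.roll)  // 10.0 20.0 30.0
```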