Support for Ambisonics in AirPods Spatial Audio renderer?
Hi, I am interested in decoding multichannel Higher-Order Ambisonics feeds to the Spatial Audio renderer discussed in the "Immerse your app in spatial audio" WWDC21 talk. However, I could not find any documentation about which multichannel audio formats the renderer actually supports, and a search for "Ambisonics" in the developer documentation only returns results pertaining to Audio DriverKit. Can someone please enlighten me? Thank you!
I am also struggling quite a bit with something very similar:
- We have an ambisonics renderer that requires 16 channels.
- Everything works fine using AVAudioFile to read a WAV file with AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_HOA_ACN_SN3D | 16) into an AVAudioPCMBuffer.
- I can't find any compressed format that iOS can read at 16 channels. While it seems possible to create an AAC file with up to 48 channels, iOS doesn't appear to decode anything with more than 8 channels (7.1), as far as I can tell.
- We considered using two 8-channel AAC files, reading them into two buffers and then joining them, but we have been unable to figure out how to merge them back into 16 channels with the correct layout (see the sketch after this list).
- I'm currently experimenting with Opus for the compression, but my attempts here have also been unsuccessful so far.
Any help on how we could get 16 channels of compressed audio out of a file (or files) and into an AVAudioPCMBuffer with the kAudioChannelLayoutTag_HOA_ACN_SN3D | 16 layout would be amazing.
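For the two-file idea, this is a minimal sketch of just the merge step, assuming both halves have already been decoded into deinterleaved Float32 buffers of equal length. The helper name mergeToHOA16 is made up, and one caveat: the AAC decoder's 7.1 channel ordering will almost certainly not match ACN ordering, so a remap table would be needed on top of the straight copy shown here.

import AVFoundation

// Hypothetical helper: copy two 8-channel deinterleaved Float32 buffers
// into one 16-channel buffer tagged as 3rd-order HOA (ACN/SN3D).
func mergeToHOA16(_ low: AVAudioPCMBuffer, _ high: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard low.frameLength == high.frameLength,
          let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_HOA_ACN_SN3D | 16) else {
        return nil
    }
    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                               sampleRate: low.format.sampleRate,
                               interleaved: false,
                               channelLayout: layout)
    guard let merged = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: low.frameLength) else {
        return nil
    }
    merged.frameLength = low.frameLength
    let byteCount = Int(low.frameLength) * MemoryLayout<Float>.size
    for ch in 0..<8 {
        // ACN channels 0-7 come from the first file, 8-15 from the second.
        memcpy(merged.floatChannelData![ch], low.floatChannelData![ch], byteCount)
        memcpy(merged.floatChannelData![ch + 8], high.floatChannelData![ch], byteCount)
    }
    return merged
}

Whether the system renderer then honours the HOA layout tag on a buffer built this way is exactly the open question in this thread.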
I am also attempting to get ambisonic audio playback working with AVFoundation / AVFAudio, using AVAudioEngine with an AVAudioEnvironmentNode. I believe I have gotten a .wav PCM Ambisonic B-Format (4-channel) audio file playing back properly, as shown in the code below. However, rotating the listener's head via listenerAngularOrientation.yaw doesn't affect the rotation of the ambisonic field.
private let ambiPlayerNode = AVAudioPlayerNode()

func playAmbiAudio() {
    guard let ambiFileURL = Bundle.main.url(forResource: "crowdAmbiOut", withExtension: "wav"),
          let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Ambisonic_B_Format) else {
        print("read ambi failed")
        return
    }
    do {
        let file = try AVAudioFile(forReading: ambiFileURL)
        // Tag the 4-channel signal as Ambisonic B-Format (WXYZ).
        let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                   sampleRate: file.processingFormat.sampleRate,
                                   interleaved: false,
                                   channelLayout: layout)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                            frameCapacity: AVAudioFrameCount(file.length)) else {
            print("buffer failed")
            return
        }
        try file.read(into: buffer)
        engine.attach(ambiPlayerNode)
        // Set the source mode before connecting so the environment node
        // renders the signal as an ambience bed rather than a point source.
        ambiPlayerNode.sourceMode = .ambienceBed
        engine.connect(ambiPlayerNode, to: environment, format: buffer.format)
        ambiPlayerNode.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
        ambiPlayerNode.play()
    } catch {
        print("ambi playback failed: \(error)")
    }
}
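For completeness, this is roughly how I am driving the rotation; headingDegrees is a placeholder for whatever head-tracking value is fed in:

// Updating the listener orientation on the environment node (yaw in degrees).
// headingDegrees is hypothetical, standing in for the head-tracking input.
environment.listenerAngularOrientation = AVAudio3DAngularOrientation(yaw: headingDegrees,
                                                                     pitch: 0,
                                                                     roll: 0)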
I haven't mastered the Apple Developer forums formatting. Here is a more readable link.
Hi All,
I have built a framework for exactly this purpose, supporting up to 7th-order playback using a custom ambisonic decoder and Opus-compressed files. Please get in touch if you'd like to beta test it.
Hi Oli,
That sounds amazing, I would love to take a look.
I'm currently working with quadraphonic audio for installations, but I'm starting to look into rendering ambisonics on iOS.
We now maintain a patched version of opusfile that supports decoding up to 16 channels, and I can confirm it works great on iOS.
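In case it helps anyone, here is a rough sketch of how decoded PCM can be pulled into an AVAudioPCMBuffer. It assumes the patched opusfile is exposed to Swift via a module map (COpusfile is a made-up module name); op_open_file, op_channel_count, op_pcm_total, op_read_float, and op_free are the stock opusfile C calls, and the patch simply lifts the 8-channel cap.

import AVFoundation
// import COpusfile  // assumed: a module map exposing the patched opusfile headers

// Hypothetical sketch: decode a 16-channel Opus file into a buffer
// tagged kAudioChannelLayoutTag_HOA_ACN_SN3D | 16.
func decodeOpusHOA16(at url: URL) -> AVAudioPCMBuffer? {
    var err: Int32 = 0
    guard let of = op_open_file(url.path, &err) else { return nil }
    defer { op_free(of) }

    let channels = Int(op_channel_count(of, -1))   // 16 with the patch applied
    let totalFrames = op_pcm_total(of, -1)         // total frames per channel
    guard channels == 16, totalFrames > 0,
          let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_HOA_ACN_SN3D | 16) else {
        return nil
    }
    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                               sampleRate: 48_000, // Opus always decodes at 48 kHz
                               interleaved: false,
                               channelLayout: layout)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: AVAudioFrameCount(totalFrames)) else {
        return nil
    }
    // op_read_float yields interleaved floats; 5760 is the largest possible
    // Opus frame count per channel (120 ms at 48 kHz).
    var interleaved = [Float](repeating: 0, count: 5760 * channels)
    var written = 0
    while true {
        let got = Int(op_read_float(of, &interleaved, Int32(interleaved.count), nil))
        if got <= 0 { break }   // 0 = end of stream, < 0 = decode error
        for ch in 0..<channels {
            for i in 0..<got {
                buffer.floatChannelData![ch][written + i] = interleaved[i * channels + ch]
            }
        }
        written += got
    }
    buffer.frameLength = AVAudioFrameCount(written)
    return buffer
}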