I can currently write, using AVAudioFile, to any of the file formats specified by Core Audio.
It can create files in every format (except one, see below) that iTunes, QuickTime and other apps can open and play back.
However, some formats appear to ignore values in the AVAudioFile settings dictionary.
• An MP4 or AAC file will save and write successfully at any sample rate, but any bit rate I attempt to specify is ignored.
• WAV files saved with floating-point data are always converted to Int32 even though I specify float, and even though the PCM buffers I’m using on both sides of the sample rate conversion are float. So AVAudioFile is taking Float input but converting it to Int for some reason I can’t fathom.
• The only crash/exception/failure I see is if I attempt to create an AVAudioFile as WAV/64-bit float … bang, AVAudioFile isn’t having that one!
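For concreteness, this is the shape of the settings dictionary I’m using for the float WAV case (the path and exact values here are placeholders, not my real code):

```swift
import AVFoundation

// Settings for a stereo 32-bit float WAV at 48 kHz.
// Despite AVLinearPCMIsFloatKey being true, the resulting
// file's fileFormat comes back as Int32 (see below).
let wavSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVSampleRateKey: 48_000.0,
    AVNumberOfChannelsKey: 2,
    AVLinearPCMBitDepthKey: 32,
    AVLinearPCMIsFloatKey: true,
    AVLinearPCMIsBigEndianKey: false,
    AVLinearPCMIsNonInterleaved: false
]

let url = URL(fileURLWithPath: "/tmp/test.wav")  // placeholder path
let file = try AVAudioFile(forWriting: url, settings: wavSettings)
print(file.fileFormat)        // reports Int32, not Float32
print(file.processingFormat)  // Float32, as expected
```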
The technique I’m using is:
• Create AVAudioFile for writing with a settings dictionary.
• Get processing and file format from AVAudioFile
• My client format is always 32-bit Float; AVAudioFile generally reports its processing format as a Float format of some other word size, at the sample rate and bit depth I specified in the settings dictionary.
• Create a converter to convert from client format to processing format.
• Process input data through the converter to the file using converter.convert(to:error:withInputFrom:)
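As a sketch, the steps above look roughly like this — `outputURL`, `settings` and `nextClientBuffer()` are placeholders standing in for my real code:

```swift
import AVFoundation

// 1. Client format: always 32-bit Float PCM.
let clientFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                 sampleRate: 48_000,
                                 channels: 2,
                                 interleaved: false)!

// 2. Create the file for writing with a settings dictionary,
//    then read back its processing format.
let file = try AVAudioFile(forWriting: outputURL, settings: settings)

// 3. Converter from client format to the file's processing format.
guard let converter = AVAudioConverter(from: clientFormat,
                                       to: file.processingFormat) else {
    fatalError("No converter available for this format pair")
}

// 4. Pull-convert client buffers into a processing-format buffer,
//    then write that buffer to the file.
let outBuffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                 frameCapacity: 4096)!
var error: NSError?
let status = converter.convert(to: outBuffer, error: &error) { _, inputStatus in
    if let next = nextClientBuffer() {   // hypothetical source of client-format input
        inputStatus.pointee = .haveData
        return next
    } else {
        inputStatus.pointee = .endOfStream
        return nil
    }
}
if status == .haveData {
    try file.write(from: outBuffer)
}
```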
So this works … sort of
The files (be they WAV, AIFF, FLAC, MP3, AAC, MP4, etc.) are written out and will play back just fine.
… but …
If the processing format is Float, for a PCM file like WAV, AVAudioFile will always report its fileFormat as Int32.
And if the file is a compressed format such as MP4/AAC, any bit rate I attempt to specify is simply ignored, though the sample rate appears to be respected, as if the converters/encoders just choose a bit rate based on the sample rate.
So after all that waffle, I've missed something that's probably meant to be obvious, so my questions are …
• For LPCM float formats, why is Int32 data written even though the AVAudioFile settings dictionary has AVLinearPCMIsFloatKey set to true?
• How do I arrange the setup so that I can specify the bit rate for compressed audio?
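For reference, this is what I’d expect to work for the bit rate, in case I’m simply holding the key wrong (`aacURL` is a placeholder):

```swift
import AVFoundation

// AAC settings including an explicit encoder bit rate.
// In practice the written file ignores AVEncoderBitRateKey,
// while AVSampleRateKey is respected.
let aacSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44_100.0,
    AVNumberOfChannelsKey: 2,
    AVEncoderBitRateKey: 192_000   // ignored, as far as I can tell
]
let file = try AVAudioFile(forWriting: aacURL, settings: aacSettings)
```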
The only buffers I actually create are both PCM: the client output buffer, and the AVAudioConverter/AVAudioFile processing buffer.
I’ve attempted using AVAudioCompressedBuffer but haven’t had any luck.
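For what it’s worth, this is roughly how I tried to set up the compressed buffer (the packet capacity and sizes here are illustrative, not tuned values):

```swift
import AVFoundation

// Build a compressed AAC format from an AudioStreamBasicDescription,
// then allocate an AVAudioCompressedBuffer against it.
var asbd = AudioStreamBasicDescription(
    mSampleRate: 44_100,
    mFormatID: kAudioFormatMPEG4AAC,
    mFormatFlags: 0,
    mBytesPerPacket: 0,       // 0 = variable packet size
    mFramesPerPacket: 1_024,  // AAC: 1024 frames per packet
    mBytesPerFrame: 0,
    mChannelsPerFrame: 2,
    mBitsPerChannel: 0,
    mReserved: 0)
let aacFormat = AVAudioFormat(streamDescription: &asbd)!

let compressed = AVAudioCompressedBuffer(format: aacFormat,
                                         packetCapacity: 8,
                                         maximumPacketSize: 1_024)
// Converting into this buffer hasn't worked for me so far.
```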
I hope someone has some clues because I’ve spent more hours on this than anyone should ever need to!
For my Christmas present I’d like Core Audio to be fully and comprehensively documented please!