I'm writing an app that reads a WAVE or AIFF file from disk so I can do some math on the amplitude of each frame (RMS, for example). I don't want to write my own parser.
I've got some code that uses AudioStreamBasicDescription structures and converts from one format to another using both Audio Converter Services and Extended Audio File Services. Both give me access to buffers, but those buffers are just raw blocks of data with no per-sample structure.
If there's a way to access the individual audio frame amplitudes in those buffers, that'd be ideal.
As an example:
    OSStatus ExtAudioFileRead ( ExtAudioFileRef   inExtAudioFile,
                                UInt32 *          ioNumberFrames,
                                AudioBufferList * ioData );
I get this to work, and it fills the AudioBufferList I pass in with a block of data. I just don't know where the individual amplitudes are within that block.