I multiply the samples by the AVPlayer's volume level and then do an RMS calculation on the resulting samples.
I send the leftVol and rightVol values off to two NSLevelIndicator instances (see the sketch after the code below), on which I then do a pair of clever NSView overlays to get a lagging indication of peak volume levels.
// Inside the MTAudioProcessingTap process callback:
// scale the samples in place by the player's current volume.
float scalar = player.volume;
for (NSInteger i = 0; i < bufferListInOut->mNumberBuffers; i++) {
    vDSP_vsmul(bufferListInOut->mBuffers[i].mData, 1, &scalar,
               bufferListInOut->mBuffers[i].mData, 1,
               bufferListInOut->mBuffers[i].mDataByteSize / sizeof(float));
}
// Per-channel RMS; buffer 0 is the left channel and buffer 1 is the right
// (assumes a non-interleaved stereo tap).
float leftVol = 0.0f, rightVol = 0.0f;
for (NSInteger i = 0; i < bufferListInOut->mNumberBuffers; i++) {
    AudioBuffer *pBuffer = &bufferListInOut->mBuffers[i];
    float rms = 0.0f;
    vDSP_rmsqv(pBuffer->mData, 1, &rms, numberFrames * pBuffer->mNumberChannels);
    if (i == 0)
        leftVol = rms;
    if (i == 1)
        rightVol = rms;
}
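To get those values onto the NSLevelIndicators, something along these lines works from the same callback. This is only a sketch: MeterController, leftIndicator, and rightIndicator are placeholder names, and it assumes the tap's storage pointer was set to the controller in the tap's init callback.

// Sketch only: the process callback runs on a real-time audio thread, so don't
// touch AppKit from it directly; bounce the values to the main queue instead.
MeterController *meters = (__bridge MeterController *)MTAudioProcessingTapGetStorage(tap);
dispatch_async(dispatch_get_main_queue(), ^{
    meters.leftIndicator.doubleValue  = leftVol;   // indicators configured for a 0.0-1.0 range
    meters.rightIndicator.doubleValue = rightVol;
});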
One comment: having the data source and/or an NSCollectionViewDelegate attached directly to your modern (non-legacy) NSCollectionView is a bad idea; it needs to live in a different class/object. In my case, that setup exhausted available memory with leaked objects. A separate data-source object might look like the sketch below.
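A minimal sketch of that separation, with made-up names (PhotoDataSource, photoNames, "PhotoItem"). The key point is that the data source is its own object that you keep a strong reference to yourself, since the collection view's dataSource property is weak.

// Hypothetical standalone data source; names are for illustration only.
@interface PhotoDataSource : NSObject <NSCollectionViewDataSource>
@property (nonatomic, copy) NSArray<NSString *> *photoNames;
@end

@implementation PhotoDataSource
- (NSInteger)collectionView:(NSCollectionView *)collectionView
     numberOfItemsInSection:(NSInteger)section {
    return self.photoNames.count;
}

- (NSCollectionViewItem *)collectionView:(NSCollectionView *)collectionView
     itemForRepresentedObjectAtIndexPath:(NSIndexPath *)indexPath {
    NSCollectionViewItem *item =
        [collectionView makeItemWithIdentifier:@"PhotoItem" forIndexPath:indexPath];
    item.textField.stringValue = self.photoNames[indexPath.item];
    return item;
}
@end

// Elsewhere: hold the data source strongly; collectionView.dataSource will not retain it.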
Try this: https://developer.apple.com/library/archive/samplecode/avloopplayer/Introduction/Intro.html