How to ignore silence in an audio buffer ("background noise reduction") - iOS chromatic tuner?

This is a simple chromatic tuner for iOS (Xcode). I want to add "ambient noise reduction" so that silent or low-volume packets in the audio buffer are ignored. Currently they are not ignored, and the tuner goes crazy on random audio packets; I want the tuner to react only to a valid sound, not to noise.


- (instancetype)initWithPitchTrackingHandler:(void (^)(NSNumber *pitch))pitchHandler {
    if (self = [super init]) {
        self.pitchUpdateHandler = pitchHandler;
        [self setupRemoteIO];
        dywapitch_inittracking(&_tracker);
    }
    return self;
}
- (void)dealloc {
    // Nothing to release explicitly under ARC.
}
- (void)processAudio:(AudioBufferList *)bufferList {

    // Accumulate incoming samples until enough have been collected.
    if (!self.buffer) {
        self.buffer = [NSMutableData dataWithBytes:bufferList->mBuffers[0].mData
                                            length:bufferList->mBuffers[0].mDataByteSize];
    } else {
        [self.buffer appendBytes:bufferList->mBuffers[0].mData
                          length:bufferList->mBuffers[0].mDataByteSize];
    }
   
    // Check whether enough samples have accumulated for one pitch estimate.
    int numberOfSamples = self.buffer.length/kBytesPerSample;
    if (numberOfSamples >= kSamplesNeeded) {
       
        // Convert the float samples to doubles for the pitch tracker.
        float *samplesAsFloat = (float *)self.buffer.mutableBytes;
        double *samplesAsDouble = (double *)malloc(numberOfSamples * sizeof(double));
        for (int i = 0; i < numberOfSamples; i++) {
            samplesAsDouble[i] = (double)samplesAsFloat[i];
        }
   
        // Estimate the pitch over the accumulated samples, then release
        // the temporary conversion buffer (it was leaked before).
        double pitch = dywapitch_computepitch(&_tracker, samplesAsDouble, 0, numberOfSamples);
        free(samplesAsDouble);

        if (self.pitchUpdateHandler && pitch != 0.0) {
           
            dispatch_async(dispatch_get_main_queue(), ^{
                self.pitchUpdateHandler(@(pitch));
            });
        }
        // Discard the processed samples and start a new buffer.
        self.buffer = nil;
    }
}
@end

Replies

One simple possible solution is to compute the audio volume (the sum of the squared sample magnitudes, averaged over the buffer), and not call the pitch estimator if it falls below a threshold.
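A minimal sketch of such a gate, written as plain C so it is safe to call from the audio path. The function name and the threshold value are illustrative, not part of any Apple or dywapitch API; the threshold must be tuned empirically for your input hardware.

```c
#include <stddef.h>

/* Returns nonzero when the buffer's mean power (average of squared
 * sample magnitudes) exceeds `threshold`. Hypothetical helper;
 * a threshold around 1e-4 is a plausible starting point for
 * full-scale [-1, 1] float samples. */
static int buffer_is_loud_enough(const float *samples, size_t count, float threshold) {
    if (count == 0) return 0;
    float power = 0.0f;
    for (size_t i = 0; i < count; i++) {
        power += samples[i] * samples[i];
    }
    return (power / (float)count) > threshold;
}
```

In `processAudio:` you would then wrap the `dywapitch_computepitch` call in a check like `if (buffer_is_loud_enough(samplesAsFloat, numberOfSamples, kNoiseThreshold)) { ... }` and simply discard the buffer otherwise.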


Another problem is that you appear to be allocating memory and calling Objective-C methods inside the audio callback. If you review the latest WWDC sessions on Core Audio, Apple seems to recommend not using either Objective-C or Swift inside the real-time audio context, and not doing anything that requires memory management, such as a malloc.
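One way to follow that advice is to allocate the conversion buffer once at setup time and reuse it in every callback, so the render path itself performs no allocation. A sketch in plain C under assumed names (`TunerState`, `kMaxSamples` are illustrative, not from the original code):

```c
#include <stddef.h>

enum { kMaxSamples = 4096 };  /* fixed capacity chosen at setup time */

typedef struct {
    /* Reused on every callback: no malloc in the render path. */
    double conversion[kMaxSamples];
} TunerState;

/* Real-time-safe conversion: no allocation, no locks, no Objective-C
 * messaging. Clamps to the preallocated capacity rather than growing. */
static size_t convert_to_double(TunerState *state, const float *in, size_t count) {
    if (count > kMaxSamples) count = kMaxSamples;
    for (size_t i = 0; i < count; i++) {
        state->conversion[i] = (double)in[i];
    }
    return count;
}
```

The same idea applies to the `NSMutableData` accumulation: replace it with a preallocated C ring buffer filled inside the callback, and do the Objective-C work (the handler dispatch) only after hopping off the audio thread.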