
Does Apple's Speech framework support speech recognition running in the background of an app?
I wrote a class for speech recognition with the Speech framework.
<pre>
// Cancel the previous task if it's running.
if (self.recognitionTask) {
  // [self.recognitionTask cancel]; // Causes a system error and memory problems.
  [self.recognitionTask finish];
}
self.recognitionTask = nil;

// Configure the audio session for the app.
NSError *error = nil;
[AVAudioSession.sharedInstance setCategory:AVAudioSessionCategoryRecord
                               withOptions:AVAudioSessionCategoryOptionDuckOthers
                                     error:&error];
if (error) {
  [self stopWithError:error];
  return;
}
[AVAudioSession.sharedInstance setActive:YES
                             withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation
                                   error:&error];
if (error) {
  [self stopWithError:error];
  return;
}

// Create and configure the speech recognition request.
self.recognitionRequest = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
self.recognitionRequest.taskHint = SFSpeechRecognitionTaskHintConfirmation;

// Allow server-based recognition; set this to YES to keep speech data on device.
if (@available(iOS 13, *)) {
  self.recognitionRequest.requiresOnDeviceRecognition = NO;
}

// Create a recognition task for the speech recognition session.
// Keep a reference to the task so that it can be canceled.
__weak typeof(self) weakSelf = self;
self.recognitionTask = [self.speechRecognizer recognitionTaskWithRequest:self.recognitionRequest
                                                           resultHandler:^(SFSpeechRecognitionResult * _Nullable result, NSError * _Nullable error) {
  // ...
  if (error != nil || result.isFinal) {
    // ...
  }
}];
</pre>
I want to know whether the Speech framework supports background tasks. If it does, how should I modify the iOS code?
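One thing worth checking (an addition, not stated in the original post): buffer-based recognition can only keep running in the background if audio recording itself keeps running, and that requires the app to declare the `audio` background mode in its Info.plist. A minimal plist fragment:
<pre>
<key>UIBackgroundModes</key>
<array>
  <string>audio</string>
</array>
</pre>
Whether server-based recognition also continues reliably in the background is not confirmed here; this only covers the recording prerequisite.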
0 replies · 0 boosts · 615 views · Jul ’21
AVAudioEngine crash on iOS 14.
Version 12.0 beta (12A6159), iOS 14 Simulator.
<pre>
- (void)startAudioEngine {
  NSError *error = nil;
  if (!self.audioEngine.isRunning) {
    self.audioEngine = [[AVAudioEngine alloc] init];
    AVAudioInputNode *inputNode = self.audioEngine.inputNode;
    AVAudioFormat *nativeAudioFormat = [inputNode inputFormatForBus:0];
    __weak typeof(self) weakSelf = self;
    [inputNode installTapOnBus:0
                    bufferSize:1024
                        format:nativeAudioFormat
                         block:^(AVAudioPCMBuffer * _Nonnull buffer, AVAudioTime * _Nonnull when) {
      [weakSelf.recognitionRequest appendAudioPCMBuffer:buffer];
    }];
    [self.audioEngine prepare];
    [self.audioEngine startAndReturnError:&error];
    if (error) {
      [self stop];
      [self onError:[NSError errorWithDomain:@"startAudioEngine error" code:0 userInfo:nil]];
    } else {
      [self activeStatusChanged:MMSpeechRecognizerActiveStatusStared];
    }
  } else {
    [self stop];
    [self onError:[NSError errorWithDomain:@"The audio engine is already running" code:0 userInfo:nil]];
  }
}
</pre>
The app crashes at the `self.audioEngine.inputNode` line.
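An addition, not from the original post: one common mitigation (an assumption, not a confirmed fix for this beta-Simulator crash) is to activate the audio session and verify that an input route exists before touching `inputNode`, since `AVAudioEngine` can raise an exception when no audio input hardware is available — which can happen on the Simulator. A sketch, with an illustrative method name:
<pre>
// Hypothetical guard to call before creating the engine.
- (BOOL)prepareAudioInput:(NSError **)outError {
  AVAudioSession *session = [AVAudioSession sharedInstance];
  if (![session setCategory:AVAudioSessionCategoryRecord error:outError]) {
    return NO;
  }
  if (![session setActive:YES error:outError]) {
    return NO;
  }
  // On the Simulator, or on devices with no microphone route, this list can
  // be empty, and accessing audioEngine.inputNode may then crash.
  if (session.availableInputs.count == 0) {
    if (outError) {
      *outError = [NSError errorWithDomain:@"AudioInput"
                                      code:0
                                  userInfo:@{NSLocalizedDescriptionKey : @"No audio input available"}];
    }
    return NO;
  }
  return YES;
}
</pre>
If this guard returns NO, skip engine creation and report the error instead of letting `inputNode` trap.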
8 replies · 0 boosts · 3.2k views · Jun ’20