Hi, I'm trying to integrate ShazamKit into my project, but I've run into some problems. My app uses AVCaptureSession to take photos, and I haven't been able to get ShazamKit working alongside it.
I've tried three different approaches:
- Running an AVAudioEngine while my AVCaptureSession is active, but I never get any result from Shazam.
- Starting ShazamKit after stopping my AVCaptureSession, but this causes problems and occasional crashes.
- Feeding the audio buffers from my AVCaptureSession's output to ShazamKit directly, without using AVAudioEngine.
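For the third approach, this is roughly how I attach an audio data output to my existing capture session (`captureSession` and `audioQueue` are names from my project):

```swift
import AVFoundation

// Sketch of wiring an audio output into my existing capture session.
let audioOutput = AVCaptureAudioDataOutput()
// self conforms to AVCaptureAudioDataOutputSampleBufferDelegate
audioOutput.setSampleBufferDelegate(self, queue: audioQueue)
if captureSession.canAddOutput(audioOutput) {
    captureSession.addOutput(audioOutput)
}

// Delegate callback where the audio arrives:
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    // Here I receive the audio as a CMSampleBuffer, but
    // SHSession.matchStreamingBuffer expects an AVAudioPCMBuffer,
    // so I'm not sure how to hand it to ShazamKit from here.
}
```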
This is the code I use with AVAudioEngine:

```swift
try! audioSession.setActive(true, options: .notifyOthersOnDeactivation)

let inputNode = self.audioEngine.inputNode
let recordingFormat = inputNode.outputFormat(forBus: 0)

inputNode.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
    try! self.signatureGenerator.append(buffer, at: nil)
    self.session.matchStreamingBuffer(buffer, at: nil)
}

self.audioEngine.prepare()
try! self.audioEngine.start()
```

(I also tried passing a mono `AVAudioFormat(standardFormatWithSampleRate:channels:)` at the input node's sample rate instead of `recordingFormat`, with the same result.)
As I see it, there are two ways forward: pass the AVCaptureSession audio output to ShazamKit directly, or start an AVAudioSession after stopping the AVCaptureSession.
So I have two questions:
- Can I feed a CMSampleBufferRef from the AVCaptureSession output into an SHSession? If so, how?
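If it helps, this is the CMSampleBuffer → AVAudioPCMBuffer conversion I've been experimenting with so I can call `matchStreamingBuffer` — I'm not sure it's correct:

```swift
import AVFoundation

// Attempted conversion so the result can go into
// SHSession.matchStreamingBuffer. Not sure this is the right way.
func pcmBuffer(from sampleBuffer: CMSampleBuffer) -> AVAudioPCMBuffer? {
    guard
        let description = CMSampleBufferGetFormatDescription(sampleBuffer),
        let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(description),
        let format = AVAudioFormat(streamDescription: asbd)
    else { return nil }

    let frameCount = AVAudioFrameCount(CMSampleBufferGetNumSamples(sampleBuffer))
    guard let pcm = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else {
        return nil
    }
    pcm.frameLength = frameCount

    // Copy the raw PCM samples out of the CMSampleBuffer.
    CMSampleBufferCopyPCMDataIntoAudioBufferList(sampleBuffer,
                                                 at: 0,
                                                 frameCount: CMItemCount(frameCount),
                                                 into: pcm.mutableAudioBufferList)
    return pcm
}

// Then, in the capture delegate:
// if let pcm = pcmBuffer(from: sampleBuffer) {
//     session.matchStreamingBuffer(pcm, at: nil)
// }
```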
- If I instead start an AVAudioSession after stopping my AVCaptureSession, how can I prevent the following error?
```
[aurioc] AURemoteIO.cpp:1117 failed: -10851 (enable 1, outf< 2 ch, 0 Hz, Float32, deinterleaved> inf< 2 ch, 0 Hz, Float32, deinterleaved>)
[avae] AVAEInternal.h:76 required condition is false: [AVAEGraphNode.mm:834:CreateRecordingTap: (IsFormatSampleRateAndChannelCountValid(format))]
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'
```
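For context, this is roughly the sequence that produces the crash. My guess is that the input node reports an invalid (0 Hz) format right after the capture session stops, so maybe I need a guard like this (just a guess, inside a throwing method in my view controller):

```swift
import AVFoundation

captureSession.stopRunning()

let audioSession = AVAudioSession.sharedInstance()
try audioSession.setCategory(.record, mode: .default)
try audioSession.setActive(true, options: .notifyOthersOnDeactivation)

let inputNode = audioEngine.inputNode
let format = inputNode.outputFormat(forBus: 0)

// My guess: right after stopRunning() the input format can be
// invalid (0 Hz / 0 channels), which matches the crash log above.
guard format.sampleRate > 0, format.channelCount > 0 else {
    // Retry later? Wait for a notification? This is the part I'm unsure about.
    return
}

inputNode.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    self.session.matchStreamingBuffer(buffer, at: nil)
}
audioEngine.prepare()
try audioEngine.start()
```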
Thanks