
ShazamKit during AVCaptureSession - Recognize audio while using the camera
Hi, I want to implement ShazamKit in my project, but I have some problems. I use AVCaptureSession to take photos in my app, and I'm unable to use ShazamKit alongside it. I tried three different approaches:

1. Use an AVAudioEngine while my AVCaptureSession is running, but I didn't obtain any result from Shazam.
2. Use ShazamKit after stopping my AVCaptureSession, but this causes some problems and some crashes.
3. Use the buffer of my AVCaptureSession to capture the audio directly, without using AVAudioEngine.

This is the code that I use with AVAudioEngine:

    try! audioSession.setActive(true, options: .notifyOthersOnDeactivation)

    let inputNode = self.audioEngine.inputNode
    let recordingFormat = inputNode.outputFormat(forBus: 0)
    // Alternative format I tried:
    // AVAudioFormat(standardFormatWithSampleRate: recordingFormat.sampleRate, channels: 1)
    let audioFormat = recordingFormat

    inputNode.installTap(onBus: 0, bufferSize: 1024, format: audioFormat) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
        // Append the tapped audio to the signature generator and stream it to the session.
        try! self.signatureGenerator.append(buffer, at: nil)
        self.session.matchStreamingBuffer(buffer, at: nil)
    }

    self.audioEngine.prepare()
    try! self.audioEngine.start()

I can go two ways: pass the AVCaptureSession output to ShazamKit, or start an AVAudioSession after stopping the AVCaptureSession. So I have two questions:

1. Can I use a CMSampleBufferRef from the AVCaptureSession buffer in an SHSession? And if the answer is yes, how?
2. How can I prevent this error if I want to use an AVAudioSession after I stopped my AVCaptureSession?

    [aurioc] AURemoteIO.cpp:1117 failed: -10851 (enable 1, outf< 2 ch, 0 Hz, Float32, deinterleaved> inf< 2 ch, 0 Hz, Float32, deinterleaved>)
    [avae] AVAEInternal.h:76 required condition is false: [AVAEGraphNode.mm:834:CreateRecordingTap: (IsFormatSampleRateAndChannelCountValid(format))]
    *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'

Thanks
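For question 1, here is a minimal sketch of one possible approach (untested, and assuming an AVCaptureAudioDataOutput delegate plus a long-lived SHSession stored in a `session` property): convert the audio CMSampleBuffer into an AVAudioPCMBuffer and hand that to matchStreamingBuffer.

    import AVFoundation
    import ShazamKit

    // AVCaptureAudioDataOutputSampleBufferDelegate callback (sketch).
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Read the stream description of the incoming audio and build a matching AVAudioFormat.
        guard let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer),
              let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription),
              let format = AVAudioFormat(streamDescription: asbd) else { return }

        let frameCount = AVAudioFrameCount(CMSampleBufferGetNumSamples(sampleBuffer))
        guard let pcmBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return }
        pcmBuffer.frameLength = frameCount

        // Copy the PCM samples out of the CMSampleBuffer into the AVAudioPCMBuffer.
        let status = CMSampleBufferCopyPCMDataIntoAudioBufferList(sampleBuffer,
                                                                  at: 0,
                                                                  frameCount: Int32(frameCount),
                                                                  into: pcmBuffer.mutableAudioBufferList)
        guard status == noErr else { return }

        // Stream the converted buffer to ShazamKit.
        session.matchStreamingBuffer(pcmBuffer, at: nil)
    }

For question 2, the "0 Hz" formats in the log suggest the input node's format was read while no audio session was active, so a hedged guess is to reactivate a record-capable AVAudioSession after the capture session stops, and to validate the format before installing the tap:

    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.playAndRecord, mode: .default)
    try audioSession.setActive(true, options: .notifyOthersOnDeactivation)

    // installTap(onBus:) traps with IsFormatSampleRateAndChannelCountValid
    // when the format reports 0 Hz or 0 channels, so check it first.
    let format = audioEngine.inputNode.outputFormat(forBus: 0)
    guard format.sampleRate > 0, format.channelCount > 0 else { return }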
Replies: 1 · Boosts: 0 · Views: 1.4k · Jul ’21
Unable to run my app on iOS 13 with Xcode 12 beta 6
I'm trying to test my project on my iPhone running iOS 13, using Xcode 12 beta 6. The app crashes on launch, and this is the message in the log:

    dyld: launch, loading dependent libraries
    DYLD_LIBRARY_PATH=/usr/lib/system/introspection
    DYLD_INSERT_LIBRARIES=/Developer/usr/lib/libBacktraceRecording.dylib:/Developer/usr/lib/libMainThreadChecker.dylib:/Developer/Library/PrivateFrameworks/GPUTools.framework/libglInterpose.dylib:/usr/lib/libMTLCapture.dylib:/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib
    (lldb)

Can someone help me? Thank you
Replies: 2 · Boosts: 0 · Views: 1.5k · Sep ’20
Xcode 12 beta 4 - Unable to upload to TestFlight
I'm attempting to upload a beta of my app for iOS 14 to TestFlight. While Xcode is trying to upload the .ipa, it shows an error: "App Store Connect Operation Error - An error occurred uploading to the App Store." How can I solve it? I also tried to use Transporter, but I can't log in to the app because of this error: "This action cannot be completed at this time (-22411)". Thanks
Replies: 12 · Boosts: 0 · Views: 3.3k · Aug ’20
Create ML Style Transfer - Reduce model size
Hello, I'm creating models using Create ML with the new Style Transfer template. In the WWDC video it can be seen that the size of the model can be less than 1 MB. Every model I try to create comes out at 6.7 MB, and even when I change the parameters and the sources used to train the model, the size remains high and always identical. I would like to know whether this is a beta bug or whether there are parameters that can reduce the size of the models to be imported into Xcode. Thanks
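For reference, a minimal sketch of setting the training parameters in code with the CreateML framework on macOS (hedged: the parameter names and defaults here are from the beta-era API and may differ in your SDK, and the file paths are placeholders). The low-latency .cnnLite algorithm is the variant aimed at the much smaller model size shown in the session, while .cnn is the larger, higher-quality one:

    import CreateML
    import Foundation

    // Placeholder paths: a single style image and a folder of content images.
    let data = MLStyleTransfer.DataSource.images(
        styleImage: URL(fileURLWithPath: "/path/to/style.jpg"),
        contentDirectory: URL(fileURLWithPath: "/path/to/content"))

    // .cnnLite trades quality for a smaller, faster model; the remaining
    // parameters (validation, textelDensity, styleStrength) keep their defaults.
    let parameters = MLStyleTransfer.ModelParameters(
        algorithm: .cnnLite,
        maxIterations: 500)

    let model = try MLStyleTransfer(trainingData: data, parameters: parameters)
    try model.write(to: URL(fileURLWithPath: "/path/to/StyleTransfer.mlmodel"))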
Replies: 0 · Boosts: 0 · Views: 816 · Aug ’20
Receive sub-links from a widget in the App Delegate
Hi, I'm trying to implement the new widgets in my project. My app is based on Objective-C and Swift. I want to use the new "Link" of SwiftUI 2 to pass different URLs from the widget to the AppDelegate of my app. According to minute 12:40 of this video, I want multiple links in the medium and large sizes of my widget: developer.apple.com/videos/play/wwdc2020/10028 - like the "Emoji Rangers" project from the "Widgets Code-Along" session. This is an example of my code:

    HStack {
        Link(destination: URL(string: "urlschemes://link1")!, label: {
            Image("landscape")
        })
        Link(destination: URL(string: "urlschemes://link2")!, label: {
            Image("portrait")
        })
    }

where "urlschemes" is the same scheme listed under URL Schemes in the URL Types of my app target's Info. I want to receive the selected link in the App Delegate, where I use this method:

    - (BOOL)application:(UIApplication *)app openURL:(NSURL *)url options:(NSDictionary<UIApplicationOpenURLOptionsKey,id> *)options {
        NSLog(@"RECEIVED URL: %@", url);
        return YES;
    }

Thanks in advance Fulvio
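One caveat worth noting (an assumption, since it depends on the app's lifecycle configuration): if the app has adopted the UIScene lifecycle on iOS 13+, URLs opened while the app is running are delivered to the scene delegate rather than to application:openURL:options:. A Swift sketch of the scene-delegate counterpart:

    import UIKit

    // UISceneDelegate callback that receives URLs when a scene is already connected.
    func scene(_ scene: UIScene, openURLContexts URLContexts: Set<UIOpenURLContext>) {
        guard let url = URLContexts.first?.url else { return }
        print("RECEIVED URL: \(url)")
        // Route to the content matching "link1" / "link2" here.
    }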
Replies: 3 · Boosts: 0 · Views: 3.5k · Jul ’20
RealityKit and Reality Composer incompatible with iOS 12 support
Hi, in a new update of my app I want to integrate a new AR feature using Reality Composer. This feature will be available only on iOS 13, but I have a problem: I want to keep my project compatible with iOS 12, and I'm unable to do so. When I set the compatibility to iOS 12, the Swift file auto-generated from the .rcproject gives me lots of errors: "... is only available in iOS 13.0 or newer". I tried to add "@available(iOS 13.0, *)", but when I rebuild the project Xcode restores the original file and returns the same errors. Is there a solution for this? Thanks in advance FS
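A sketch of one possible workaround (hedged: this is a common pattern, not an official fix): stop letting Xcode regenerate the file by copying the generated Swift file into the project yourself, annotating its types with @available(iOS 13.0, *), and removing the .rcproject from the target's compile sources; then gate every call site so the iOS 12 build never touches RealityKit. The generated accessor named in the comment below is hypothetical and depends on the .rcproject:

    import UIKit

    final class ARFeatureViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            if #available(iOS 13.0, *) {
                // Hypothetical generated accessor; the real name comes from the .rcproject,
                // e.g. let anchor = try? Experience.loadScene()
                // arView.scene.anchors.append(anchor)
            } else {
                // iOS 12 fallback: hide or disable the AR feature.
                showARUnavailableMessage()
            }
        }

        private func showARUnavailableMessage() {
            // Placeholder for whatever the iOS 12 UI should do instead.
        }
    }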
Replies: 3 · Boosts: 0 · Views: 1.6k · Jul ’19