Posts

Post not yet marked as solved
6 Replies
15k Views
Is it possible to use an iPhone running iOS 17 with Xcode 14.3.1? I tried the old method of copying the device support files, but it didn't work: in Xcode 15 there is no iOS 17 folder inside the DeviceSupport folder.
Posted by RedSun.
Post not yet marked as solved
1 Reply
1.2k Views
Hi, I want to implement ShazamKit in my project, but I have some problems. I use AVCaptureSession to take photos in my app, and I'm unable to use ShazamKit alongside it. I tried three different approaches:

1. Use an AVAudioEngine during my AVCaptureSession, but I didn't obtain any result from Shazam.
2. Use ShazamKit after stopping my AVCaptureSession, but this causes some problems and some crashes.
3. Use the buffer of my AVCaptureSession to capture the audio directly, without using AVAudioEngine.

This is the code I use with AVAudioEngine:

try! audioSession.setActive(true, options: .notifyOthersOnDeactivation)

let inputNode = self.audioEngine.inputNode
let recordingFormat = inputNode.outputFormat(forBus: 0)
let audioFormat = recordingFormat
// AVAudioFormat(standardFormatWithSampleRate: inputNode.outputFormat(forBus: 0).sampleRate, channels: 1)

inputNode.installTap(onBus: 0, bufferSize: 1024, format: audioFormat) { (buffer: AVAudioPCMBuffer, when: AVAudioTime) in
    try! self.signatureGenerator.append(buffer, at: nil)
    self.session.matchStreamingBuffer(buffer, at: nil)
}

self.audioEngine.prepare()
try! self.audioEngine.start()

So I can choose between two routes: pass the AVCaptureSession output to ShazamKit, or use an AVAudioSession after stopping the AVCaptureSession. I have two questions:

1. Can I pass a CMSampleBuffer from the AVCaptureSession output to an SHSession? And if the answer is yes, how?
2. How can I prevent this error when I use an AVAudioSession after stopping my AVCaptureSession?

[aurioc] AURemoteIO.cpp:1117 failed: -10851 (enable 1, outf< 2 ch, 0 Hz, Float32, deinterleaved> inf< 2 ch, 0 Hz, Float32, deinterleaved>)
[avae] AVAEInternal.h:76 required condition is false: [AVAEGraphNode.mm:834:CreateRecordingTap: (IsFormatSampleRateAndChannelCountValid(format))]
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: IsFormatSampleRateAndChannelCountValid(format)'

Thanks
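For the first question, here is a sketch of one possible route, assuming the audio comes from an AVCaptureAudioDataOutput delivering Linear PCM: convert each CMSampleBuffer into an AVAudioPCMBuffer and hand that to matchStreamingBuffer(_:at:). The helper name pcmBuffer(from:) is hypothetical.

import AVFoundation
import ShazamKit

// Hypothetical helper: convert a CMSampleBuffer coming from an
// AVCaptureAudioDataOutput (Linear PCM) into an AVAudioPCMBuffer
// that SHSession.matchStreamingBuffer(_:at:) can consume.
func pcmBuffer(from sampleBuffer: CMSampleBuffer) -> AVAudioPCMBuffer? {
    guard let description = CMSampleBufferGetFormatDescription(sampleBuffer),
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(description),
          let format = AVAudioFormat(streamDescription: asbd) else { return nil }

    let frameCount = AVAudioFrameCount(CMSampleBufferGetNumSamples(sampleBuffer))
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
    buffer.frameLength = frameCount

    // Copy the PCM samples out of the CMSampleBuffer into the PCM buffer.
    let status = CMSampleBufferCopyPCMDataIntoAudioBufferList(sampleBuffer,
                                                              at: 0,
                                                              frameCount: Int32(frameCount),
                                                              into: buffer.mutableAudioBufferList)
    return status == noErr ? buffer : nil
}

// In captureOutput(_:didOutput:from:), for the audio connection:
// if let pcm = pcmBuffer(from: sampleBuffer) {
//     session.matchStreamingBuffer(pcm, at: nil)
// }

For the second question, the "0 Hz" in the -10851 log usually means the input node's format was read while no input was active; a common guard is to activate the AVAudioSession first and only install the tap once recordingFormat.sampleRate is non-zero.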
Posted by RedSun.
Post not yet marked as solved
12 Replies
2.8k Views
I'm attempting to upload a beta of my app for iOS 14 to TestFlight. While Xcode is uploading the .ipa, it shows the error "App Store Connect Operation Error - An error occurred uploading to the App Store." How can I solve it? I also tried to use Transporter, but I can't log in to the app because of the error "This action cannot be completed at this time (-22411)". Thanks
Posted by RedSun.
Post not yet marked as solved
2 Replies
1.3k Views
I'm trying to test my project on my iPhone with iOS 13, using Xcode 12 beta 6. The app crashes on launch, and this is the message in the log:

dyld: launch, loading dependent libraries
DYLD_LIBRARY_PATH=/usr/lib/system/introspection
DYLD_INSERT_LIBRARIES=/Developer/usr/lib/libBacktraceRecording.dylib:/Developer/usr/lib/libMainThreadChecker.dylib:/Developer/Library/PrivateFrameworks/GPUTools.framework/libglInterpose.dylib:/usr/lib/libMTLCapture.dylib:/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib
(lldb)

Can someone help me? Thank you
Posted by RedSun.
Post marked as solved
3 Replies
3.2k Views
Hi, I'm trying to implement the new widgets in my project. My app is based on Objective-C and Swift. I want to use the new SwiftUI 2 "Link" to pass different URLs from the widget to the AppDelegate of my app. According to minute 12:40 of this video, I want multiple links in the medium and large sizes of my widget: developer.apple.com/videos/play/wwdc2020/10028, like the "Emoji ranger" project from the "Widgets Code-Along" session. This is an example of my code:

HStack {
    Link(destination: URL(string: "urlschemes://link1")!, label: {
        Image("landscape")
    })
    Link(destination: URL(string: "urlschemes://link2")!, label: {
        Image("portrait")
    })
}

where "urlschemes" matches the URL Schemes entry under URL Types in my app target's Info. I want to receive the selected link in the AppDelegate, where I implemented this method:

- (BOOL)application:(UIApplication *)app openURL:(NSURL *)url options:(NSDictionary<UIApplicationOpenURLOptionsKey, id> *)options {
    NSLog(@"RECEIVED URL: %@", url);
    return YES;
}

Thanks in advance
Fulvio
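One thing worth checking, as a hedged note rather than a confirmed answer: if the app has adopted the UIScene lifecycle, the system delivers these URLs to the scene delegate instead of the AppDelegate's application:openURL:options:. A minimal Swift sketch of that callback:

import UIKit

// Sketch, assuming a scene-based app: widget Links arrive here
// rather than in the AppDelegate method when a scene delegate exists.
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, openURLContexts URLContexts: Set<UIOpenURLContext>) {
        guard let url = URLContexts.first?.url else { return }
        print("RECEIVED URL: \(url)")  // mirrors the AppDelegate log
    }
}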
Posted by RedSun.
Post not yet marked as solved
3 Replies
1.3k Views
Hi, in a new update of my app I want to integrate a new AR function using Reality Composer. This function will be available only on iOS 13, but I have a problem: I want to keep my project compatible with iOS 12, and I'm unable to do so. When I set compatibility with iOS 12, the Swift file auto-generated from the .rcproject gives me lots of errors: "... is only available in iOS 13.0 or newer". I tried to add "@available(iOS 13.0, *)", but when I rebuild the project Xcode regenerates the original file and returns the same errors. Is there a solution for this? Thanks in advance FS
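A workaround sketch rather than an official fix: export the scene from Reality Composer as a .reality file, add that file to the bundle instead of the .rcproject (so no Swift loader is auto-generated), and load it manually behind an availability check. "Experience" is a placeholder scene name.

import RealityKit

// Hypothetical workaround: load an exported .reality file by hand,
// keeping every RealityKit call inside an #available check so the
// project can still target iOS 12.
func loadARSceneIfAvailable() {
    if #available(iOS 13.0, *) {
        guard let sceneURL = Bundle.main.url(forResource: "Experience",
                                             withExtension: "reality"),
              let anchor = try? Entity.loadAnchor(contentsOf: sceneURL) else { return }
        // Add `anchor` to an ARView's scene here.
        print("Loaded anchor: \(anchor)")
    }
}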
Posted by RedSun.
Post not yet marked as solved
0 Replies
659 Views
Hello, I'm trying to upload my app beta to App Store Connect to have the new iOS 14 widgets tested on TestFlight. With Xcode beta 4 and Transporter I can't upload the .ipa. The only solution was to upload the archive using Xcode 11, but then the widgets don't appear on the iOS home screen. Can anyone help me? Thank you so much
Posted by RedSun.
Post not yet marked as solved
0 Replies
693 Views
Hello, I'm creating models with Create ML using the new Style Transfer template. In the WWDC video, the model size can be under 1 MB, but every model I create comes out at 6.7 MB. Even when I change the parameters and the sources used to train the models, the size remains high and always identical. I would like to know if this is a beta bug, or if there are parameters that can reduce the size of the models to be imported into Xcode. Thanks
Posted by RedSun.
Post not yet marked as solved
0 Replies
1.3k Views
Hi, many users of my app are having crash problems. It happens either while opening the app or during use. I'm unable to replicate the error because the problem does not occur on my devices, and I haven't managed to find a solution. Most users solve the problem by reinstalling the app.

In the crash log, the method that causes the bug is [AVCaptureFigVideoDevice temperatureAndTintValuesForDeviceWhiteBalanceGains:], called in [ViewController observeValueForKeyPath:ofObject:change:context:], but sometimes the crash occurs on another line of observeValueForKeyPath:. So I think the problem is caused by the observers. I add the observers in viewWillAppear and remove them in viewDidDisappear:

- (void)addObservers {
    [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
    [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
    [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
    [self addObserver:self forKeyPath:@"videoDeviceInput.device.focusMode" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:FocusModeContext];
    [self addObserver:self forKeyPath:@"videoDeviceInput.device.lensPosition" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:LensPositionContext];
    [self addObserver:self forKeyPath:@"videoDeviceInput.device.exposureMode" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:ExposureModeContext];
    [self addObserver:self forKeyPath:@"videoDeviceInput.device.exposureDuration" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:ExposureDurationContext];
    [self addObserver:self forKeyPath:@"videoDeviceInput.device.ISO" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:ISOContext];
    [self addObserver:self forKeyPath:@"videoDeviceInput.device.exposureTargetOffset" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:ExposureTargetOffsetContext];
    [self addObserver:self forKeyPath:@"videoDeviceInput.device.whiteBalanceMode" options:(NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:WhiteBalanceModeContext];
    [self addObserver:self forKeyPath:@"videoDeviceInput.device.deviceWhiteBalanceGains" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:DeviceWhiteBalanceGainsContext];

    [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[self videoDevice]];
}

- (void)removeObservers {
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[self videoDevice]];
    [[NSNotificationCenter defaultCenter] removeObserver:[self runtimeErrorHandlingObserver]];

    [self removeObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" context:SessionRunningAndDeviceAuthorizedContext];
    [self removeObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" context:CapturingStillImageContext];
    [self removeObserver:self forKeyPath:@"movieFileOutput.recording" context:RecordingContext];
    [self removeObserver:self forKeyPath:@"videoDeviceInput.device.focusMode" context:FocusModeContext];
    [self removeObserver:self forKeyPath:@"videoDeviceInput.device.lensPosition" context:LensPositionContext];
    [self removeObserver:self forKeyPath:@"videoDeviceInput.device.exposureMode" context:ExposureModeContext];
    [self removeObserver:self forKeyPath:@"videoDeviceInput.device.exposureDuration" context:ExposureDurationContext];
    [self removeObserver:self forKeyPath:@"videoDeviceInput.device.ISO" context:ISOContext];
    [self removeObserver:self forKeyPath:@"videoDeviceInput.device.exposureTargetOffset" context:ExposureTargetOffsetContext];
    [self removeObserver:self forKeyPath:@"videoDeviceInput.device.whiteBalanceMode" context:WhiteBalanceModeContext];
    [self removeObserver:self forKeyPath:@"videoDeviceInput.device.deviceWhiteBalanceGains" context:DeviceWhiteBalanceGainsContext];
}

Here's a crash log:

Last Exception Backtrace:
0   CoreFoundation           0x19df7bab0 __exceptionPreprocess + 224
1   libobjc.A.dylib          0x19dc95028 objc_exception_throw + 59
2   AVFoundation             0x1a82ed658 -[AVCaptureFigVideoDevice temperatureAndTintValuesForDeviceWhiteBalanceGains:] + 263
3   Reflex                   0x1002b6b90 __65-[ViewController observeValueForKeyPath:ofObject:change:context:]_block_invoke + 633744 (ViewController.m:5483)
4   libdispatch.dylib        0x19dc20b7c _dispatch_call_block_and_release + 31
5   libdispatch.dylib        0x19dc21fd8 _dispatch_client_callout + 19
6   libdispatch.dylib        0x19dc2dcc8 _dispatch_main_queue_callback_4CF + 967
7   CoreFoundation           0x19def6e0c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 15
8   CoreFoundation           0x19def1b68 __CFRunLoopRun + 1979
9   CoreFoundation           0x19def1084 CFRunLoopRunSpecific + 479
10  GraphicsServices         0x1a813f534 GSEventRunModal + 107
11  UIKitCore                0x1a2061670 UIApplicationMain + 1939
12  App                      0x100350e6c main + 1265260 (main.m:14)
13  libdyld.dylib            0x19dd70e18 start + 3

Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0 Crashed:
0   libsystem_kernel.dylib   0x000000019dd66efc __pthread_kill + 8
1   libsystem_pthread.dylib  0x000000019dc86d10 pthread_kill + 196
2   libsystem_c.dylib        0x000000019dc16a74 abort + 104
3   libc++abi.dylib          0x000000019dd2e3c8 __cxa_bad_cast + 0
4   libc++abi.dylib          0x000000019dd2e5c0 demangling_unexpected_handler+ 5568 () + 0
5   libobjc.A.dylib          0x000000019dc95308 _objc_terminate+ 25352 () + 124
6   libc++abi.dylib          0x000000019dd3b634 std::__terminate(void (*)+ 58932 ()) + 20
7   libc++abi.dylib          0x000000019dd3b5c0 std::terminate+ 58816 () + 44
8   libdispatch.dylib        0x000000019dc21fec _dispatch_client_callout + 40
9   libdispatch.dylib        0x000000019dc2dcc8 _dispatch_main_queue_callback_4CF + 968
10  CoreFoundation           0x000000019def6e0c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
11  CoreFoundation           0x000000019def1b68 __CFRunLoopRun + 1980
12  CoreFoundation           0x000000019def1084 CFRunLoopRunSpecific + 480
13  GraphicsServices         0x00000001a813f534 GSEventRunModal + 108
14  UIKitCore                0x00000001a2061670 UIApplicationMain + 1940
15  App                      0x0000000100350e6c main + 1265260 (main.m:14)
16  libdyld.dylib            0x000000019dd70e18 start + 4
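Not the original poster's fix, but a hedged guess at the crash itself: temperatureAndTintValuesForDeviceWhiteBalanceGains: raises an exception when any gain falls outside the range 1.0...maxWhiteBalanceGain, which can happen transiently while the device reconfigures. Apple's AVCam-style samples clamp the gains first; a Swift sketch of that guard:

import AVFoundation

// Hedged sketch (not the original app's code): clamp each white-balance
// gain into the device's supported range before asking for temperature
// and tint, since out-of-range gains make temperatureAndTintValues(for:)
// throw an Objective-C exception.
func normalizedGains(_ gains: AVCaptureDevice.WhiteBalanceGains,
                     for device: AVCaptureDevice) -> AVCaptureDevice.WhiteBalanceGains {
    var g = gains
    g.redGain   = max(1.0, min(g.redGain,   device.maxWhiteBalanceGain))
    g.greenGain = max(1.0, min(g.greenGain, device.maxWhiteBalanceGain))
    g.blueGain  = max(1.0, min(g.blueGain,  device.maxWhiteBalanceGain))
    return g
}

// Usage inside the KVO handler:
// let gains = device.deviceWhiteBalanceGains
// let tempAndTint = device.temperatureAndTintValues(for: normalizedGains(gains, for: device))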
Posted by RedSun.
Post not yet marked as solved
0 Replies
664 Views
Hi, in my project I use Siri shortcuts with INUIAddVoiceShortcutButton. I use this code to create the button and associate the NSUserActivity:

let button = INUIAddVoiceShortcutButton(style: .blackOutline)
view.addSubview(button)
button.shortcut = INShortcut.userActivity(activityPreCreated)

In iOS 12 everything works correctly: the button adds or edits the shortcut. But in iOS 13 the button only adds the shortcut, never edits it, and keeps the text "Add to Siri". Is this a bug in the beta, or am I doing something wrong? Thanks in advance
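A hedged guess rather than a confirmed answer: iOS 13 added INUIAddVoiceShortcutButtonDelegate, and the button appears to need a delegate in order to present the edit flow and refresh its own title. A minimal Swift sketch:

import UIKit
import IntentsUI

// Sketch under the assumption that the iOS 13 button requires a delegate:
// it asks the delegate to present the add/edit view controllers, then
// updates its "Add to Siri" / "Added to Siri" state itself.
@available(iOS 13.0, *)
class ShortcutsViewController: UIViewController, INUIAddVoiceShortcutButtonDelegate {

    func makeSiriButton(for activity: NSUserActivity) -> INUIAddVoiceShortcutButton {
        let button = INUIAddVoiceShortcutButton(style: .blackOutline)
        button.shortcut = INShortcut.userActivity(activity)
        button.delegate = self  // without this, the button never offers the edit flow
        view.addSubview(button)
        return button
    }

    func present(_ addVoiceShortcutViewController: INUIAddVoiceShortcutViewController,
                 for addVoiceShortcutButton: INUIAddVoiceShortcutButton) {
        // Set the view controller's own delegate too, so it can be dismissed.
        present(addVoiceShortcutViewController, animated: true)
    }

    func present(_ editVoiceShortcutViewController: INUIEditVoiceShortcutViewController,
                 for addVoiceShortcutButton: INUIAddVoiceShortcutButton) {
        present(editVoiceShortcutViewController, animated: true)
    }
}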
Posted by RedSun.