Posts

Post marked as solved
2 Replies
560 Views
I'm building an older app that's on the App Store, using Xcode 15.2. It runs fine on iOS 15, 16, and 17. It also supports iOS 12, but when I try to run on iOS 12 it crashes, and I'm not sure why. Anyone know? Crash report: Photobooth-2024-01-11-181720.ips

Snippet:
Exception Type: EXC_CRASH (SIGKILL)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Exception Note: EXC_CORPSE_NOTIFY
Termination Description: DYLD, Symbol not found: _$s7Network12NWConnectionC5StateOMa
| Referenced from: /private/var/containers/Bundle/Application/C5B7AB67-6F8C-4EE3-977C-2076C4F06729/Photobooth.app/Frameworks/VideoNetworkFramework.framework/VideoNetworkFramework
| Expected in: /System/Library/Frameworks/Network.framework/Network
Triggered by Thread: 0
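The missing symbol demangles to the type metadata accessor for Network.NWConnection.State. One possible explanation (an assumption, not a confirmed diagnosis): the embedded framework links Network strongly, and that Swift symbol is not present in iOS 12's copy of Network.framework, so dyld kills the process at launch. The usual recipe is to weak-link the framework (e.g. OTHER_LDFLAGS = -weak_framework Network) and guard every use behind an availability check, sketched here with a hypothetical helper:

```swift
import Network

// Hypothetical sketch: with the framework weak-linked, dyld no longer
// requires the symbol at launch on iOS 12; the availability guard then
// ensures NWConnection is only touched where the OS provides it.
func makeConnection(to host: String) -> NWConnection? {
    guard #available(iOS 13.0, *) else {
        return nil  // assumption: fall back to an older networking path on iOS 12
    }
    let connection = NWConnection(host: NWEndpoint.Host(host), port: .https, using: .tls)
    connection.stateUpdateHandler = { state in
        print("connection state:", state)
    }
    return connection
}
```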
Posted Last updated
Post not yet marked as solved
109 Replies
44k Views
I'm unable to use Xcode 14 to develop my app because as soon as a compiler error is shown, it is immediately withdrawn from the Issue navigator pane. This makes it impossible to see what is wrong. The errors are also immediately withdrawn from the editor. This video shows three attempts to compile the project: after the first attempt, no error shows; after the second attempt, the error shows for a moment and then is automatically removed (no mouse or key press from me); after the third attempt, the same. https://youtu.be/bmK_k6oLYpQ I have tried rebooting and deleting ~/Library/Developer/Xcode/DerivedData, to no effect.
Post not yet marked as solved
4 Replies
2.2k Views
I'm using AVAssetReaders with AVSampleBufferDisplayLayers to display multiple videos at once. I'm seeing this issue on iOS 13.1.3 and 13.2b2, on various hardware like the 10.5" and 12.9" iPads.

It works well for a while, then a random call to copyNextSampleBuffer never returns, blocking that thread indefinitely and eating up resources. I have tried different threading approaches, to no avail:

If copyNextSampleBuffer() and reader.cancelReading() are done on the same queue, then copyNextSampleBuffer() gets stuck and the cancelReading() never gets processed because the queue is blocked. If I manually (with the debugger) jump in on that blocked queue and execute cancelReading(), an EXC_BREAKPOINT immediately crashes the app.
If copyNextSampleBuffer() and reader.cancelReading() are done on different queues, then copyNextSampleBuffer() crashes with EXC_BAD_ACCESS.

Here's the stack trace (same-queue approach). I don't understand why it's stuck; my expectation is that copyNextSampleBuffer should always return (i.e. with nil in the error case).

VideoPlayerView: UIView with AVSampleBufferDisplayLayer
AVAssetFactory: Singleton with the queue that creates & manages all AVAssetReader / AVAsset* objects

* thread #22, queue = 'AVAssetFactory'
  frame #0: 0x00000001852355f4 libsystem_kernel.dylib`mach_msg_trap + 8
  frame #1: 0x0000000185234a60 libsystem_kernel.dylib`mach_msg + 72
  frame #2: 0x00000001853dc068 CoreFoundation`__CFRunLoopServiceMachPort + 216
  frame #3: 0x00000001853d7188 CoreFoundation`__CFRunLoopRun + 1444
  frame #4: 0x00000001853d68bc CoreFoundation`CFRunLoopRunSpecific + 464
  frame #5: 0x000000018f42b6ac AVFoundation`-[AVRunLoopCondition _waitInMode:untilDate:] + 400
  frame #6: 0x000000018f38f1dc AVFoundation`-[AVAssetReaderOutput copyNextSampleBuffer] + 148
  frame #7: 0x000000018f3900f0 AVFoundation`-[AVAssetReaderTrackOutput copyNextSampleBuffer] + 72
* frame #8: 0x0000000103309d98 Photobooth`closure #1 in AVAssetFactory.nextSampleBuffer(reader=0x00000002814016f0, retval=(Swift.Optional<CoreMedia.CMSampleBuffer>, Swift.Optional<AVFoundation.AVAssetReader.Status>) @ 0x000000016dbd1cb8) at AVAssetFactory.swift:108:34
  frame #9: 0x0000000102f4f480 Photobooth`thunk for @callee_guaranteed () -> () at <compiler-generated>:0
  frame #10: 0x0000000102f4f4a4 Photobooth`thunk for @escaping @callee_guaranteed () -> () at <compiler-generated>:0
  frame #11: 0x000000010bfe6c04 libdispatch.dylib`_dispatch_client_callout + 16
  frame #12: 0x000000010bff5888 libdispatch.dylib`_dispatch_lane_barrier_sync_invoke_and_complete + 124
  frame #13: 0x0000000103309a5c Photobooth`AVAssetFactory.nextSampleBuffer(reader=0x00000002814016f0, self=0x0000000281984f60) at AVAssetFactory.swift:101:20
  frame #14: 0x00000001032ab690 Photobooth`closure #1 in VideoPlayerView.setRequestMediaLoop(self=0x000000014b8da1d0, handledCompletion=false) at VideoPlayerView.swift:254:70
  frame #15: 0x0000000102dce978 Photobooth`thunk for @escaping @callee_guaranteed () -> () at <compiler-generated>:0
  frame #16: 0x000000018f416848 AVFoundation`-[AVMediaDataRequester _requestMediaDataIfReady] + 80
  frame #17: 0x000000010bfe5828 libdispatch.dylib`_dispatch_call_block_and_release + 24
  frame #18: 0x000000010bfe6c04 libdispatch.dylib`_dispatch_client_callout + 16
  frame #19: 0x000000010bfedb74 libdispatch.dylib`_dispatch_lane_serial_drain + 744
  frame #20: 0x000000010bfee744 libdispatch.dylib`_dispatch_lane_invoke + 500
  frame #21: 0x000000010bff9ae4 libdispatch.dylib`_dispatch_workloop_worker_thread + 1324
  frame #22: 0x000000018517bfa4 libsystem_pthread.dylib`_pthread_wqthread + 276

I've tried all kinds of other things, like making sure the AVAssets and all objects are made on one queue, and preventing the AVAssetReaders from ever deallocating to see if that helps. Nothing works. Any ideas?
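For clarity, the same-queue pattern described above can be sketched roughly like this (class and method names are hypothetical reductions of the AVAssetFactory / VideoPlayerView structure visible in the trace, not the poster's actual code):

```swift
import AVFoundation

// One serial queue owns the reader; both reading and cancellation go
// through it, matching the "same queue" approach in the post.
final class AssetReaderDriver {
    private let queue = DispatchQueue(label: "AVAssetFactory")
    private let reader: AVAssetReader
    private let output: AVAssetReaderTrackOutput

    init(reader: AVAssetReader, output: AVAssetReaderTrackOutput) {
        self.reader = reader
        self.output = output
    }

    func attach(to layer: AVSampleBufferDisplayLayer) {
        layer.requestMediaDataWhenReady(on: queue) { [weak self] in
            guard let self = self else { return }
            while layer.isReadyForMoreMediaData {
                // Per the report, this call can block forever instead of
                // returning nil on failure.
                guard let buffer = self.output.copyNextSampleBuffer() else {
                    layer.stopRequestingMediaData()
                    return
                }
                layer.enqueue(buffer)
            }
        }
    }

    func cancel() {
        // Same-queue cancellation: this block never runs while
        // copyNextSampleBuffer() is blocked, which is exactly the
        // deadlock the post describes.
        queue.async { self.reader.cancelReading() }
    }
}
```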
Post not yet marked as solved
2 Replies
875 Views
Hi Apple, can you please comment on whether this is intentional? Here is the official Web Share W3C proposed recommendation: https://www.w3.org/TR/web-share/ Here is the test suite linked from the recommendation that reproduces this issue; run it on an iPhone with iOS 16 (16.4.1 in my case). Note: I care about the image/jpeg MIME type, which also does not work; this test case covers svg+xml. https://wpt.live/web-share/share-image-manual.tentative.https.html

Steps from this page: hit the share button, and the share sheet pops up.
Expected: "Save to Photos" is present.
Actual: it isn't present. Only "Save to Files" is, which does not get the image to the camera roll, the user's intended destination for a photo.

I tried this on iOS 15.7 and it works well. Note that I'm using image/jpeg in my tests, not svg+xml as seen in the test harness. I tried this with a video on iOS 16 and it works. It appears that either images in general or image/jpeg specifically is having the issue here. I'd appreciate knowing if this is a bug or intentional behavior, and if there is a workaround.
Post not yet marked as solved
2 Replies
903 Views
Hi Apple, it is frustrating that apps have no control over portraitEffect, because it creates the following problem:

- in some modes of our app, the user wants portrait mode, so they turn it on for the app
- in other modes of our app, 60-120 FPS is required for what the user is doing, and the user does NOT want portrait mode in this situation
- switching between these modes needs to be fast and done in-app only, with no popping Control Center open (think kiosk-style applications at airports)

...but as soon as portrait mode is on, all formats deliver only 30 FPS. I have found no way to get 60 FPS: manually setting the frame duration doesn't work, and on devices like the 11" iPad all 60 FPS formats support the portrait effect, so I cannot just select an activeFormat that doesn't support it. How can our app offer both a portrait mode and a 60 FPS mode in different settings, without the user having to make confusing extra settings changes outside of our app?
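The format-selection attempt mentioned above might look like the following sketch: search for an active format that supports 60 FPS but not the portrait effect, then pin the frame duration. As the post notes, on some hardware no such format exists, so this is illustrative rather than a fix:

```swift
import AVFoundation

// Sketch: prefer a 60 FPS format that does NOT support the portrait
// effect, so the effect cannot throttle capture to 30 FPS.
func selectFastFormat(for device: AVCaptureDevice) throws {
    guard let format = device.formats.first(where: { format in
        let supports60 = format.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 60 }
        if #available(iOS 15.0, *) {
            return supports60 && !format.isPortraitEffectSupported
        }
        return supports60
    }) else { return }  // no suitable format on this device

    try device.lockForConfiguration()
    device.activeFormat = format
    // Pin capture to 60 FPS.
    device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 60)
    device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 60)
    device.unlockForConfiguration()
}
```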
Post marked as solved
1 Reply
613 Views
I have a UIViewController with the following code. I have tried isPortraitEffectEnabled and portraitEffectEnabled; both have the same result: observeValue() is never called. What am I missing? To test this I am toggling the value of portraitEffectEnabled by calling AVCaptureDevice.showSystemUserInterface(.videoEffects) and turning it on/off, expecting the KVO to fire.

@objc class EventSettingsCaptureViewController: UIViewController, ... {
    required init(...) {
        super.init(nibName: nil, bundle: nil)
        if #available(iOS 15.0, *) {
            AVCaptureDevice.self.addObserver(self, forKeyPath: "portraitEffectEnabled", options: [.new], context: nil)
        }
    }

    deinit {
        if #available(iOS 15.0, *) {
            AVCaptureDevice.self.removeObserver(self, forKeyPath: "portraitEffectEnabled", context: nil)
        }
    }

    override func observeValue(forKeyPath keyPath: String?, of object: Any?, change: [NSKeyValueChangeKey: Any]?, context: UnsafeMutableRawPointer?) {
        // Breakpoint set here: never hits
        if keyPath == "portraitEffectEnabled" {
            guard let object = object as? AVCaptureDevice.Type else { return }
            if #available(iOS 15.0, *) {
                WLog("isPortraitEffectEnabled changed: \(object.isPortraitEffectEnabled)")
            }
        } else {
            super.observeValue(forKeyPath: keyPath, of: object, change: change, context: context)
        }
    }
}
Post not yet marked as solved
1 Reply
1.2k Views
I'm noticing that the number shown on the Apple App Store Connect -> Sales and Trends page in the Proceeds section for the month of January is significantly smaller (10-20%) than the number shown in Payments and Financial Reports for January. The only thing I can think of is that revenue from subscription renewals is not included in the Proceeds section. Does anyone know if this is true? Or is there another explanation?
Post not yet marked as solved
0 Replies
545 Views
DNSServiceFlags flags = includeP2P ? kDNSServiceFlagsIncludeP2P : 0;
DNSServiceErrorType err = DNSServiceRegister(&registerRef, flags, interfaceIndex, name, type, domain, NULL,
                                             bigEndianPort, txtLen, txtData, registerServiceCallBack,
                                             (__bridge void *)([self setCurrentCallbackContextWithSelf]));

On iOS 14, when building with Xcode 12, the call to DNSServiceRegister fails with error -65555. When running the App Store version built with Xcode 11, this error doesn't occur. We use this for device-to-device communication over Bluetooth when the Wi-Fi is directly connected to a device without internet (such as a digital camera). The app has "Always Allow" Bluetooth permission per CBManager.authorization. Any ideas?
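For reference, -65555 is kDNSServiceErr_NoAuth in dns_sd.h. One possibility worth ruling out (an assumption, not a confirmed diagnosis for this thread): iOS 14 introduced local network privacy, and apps built with the iOS 14 SDK must declare their Bonjour service types and a usage description in Info.plist before Bonjour registration is allowed. A sketch of the entries, where the service type string is hypothetical and must match the "type" passed to DNSServiceRegister:

```xml
<!-- Hypothetical Info.plist additions for iOS 14 local network privacy. -->
<key>NSLocalNetworkUsageDescription</key>
<string>Used to find and talk to your camera on the local network.</string>
<key>NSBonjourServices</key>
<array>
    <string>_example-camera._tcp</string>
</array>
```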
Post not yet marked as solved
1 Reply
648 Views
When I double-click an editor placeholder for a block in Xcode, it helpfully fills it out for me. The problem is, sometimes I forget to go back and add [weak self] to the start of the block. Now that Xcode 12 is adding developer-friendly features, I'd like to see this get added, perhaps as a preference: when I double-click to expand a placeholder that represents a block, it could add [weak self] for me. Reasoning: it's instantly apparent if I use [weak self] when I should not, and that is an easy problem to find and fix. But it is really hard to figure out retain cycles and fix the opposite, where I've accidentally created a strong reference that should have been weak.
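The asymmetry described above can be illustrated with a small sketch (class and method names are hypothetical):

```swift
// A stored closure that captures self strongly creates a retain cycle
// that is hard to spot; the [weak self] form fails loudly instead.
final class Downloader {
    var onFinish: (() -> Void)?

    func start() {
        // Strong capture: Downloader retains the closure, and the closure
        // retains self, creating a retain cycle the compiler won't flag.
        onFinish = { self.report() }

        // With [weak self]: no cycle, and any misuse surfaces immediately
        // through the optional handling the compiler forces you to write.
        onFinish = { [weak self] in self?.report() }
    }

    func report() { print("download finished") }
}
```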
Post not yet marked as solved
0 Replies
682 Views
I'm trying to get Deep Fusion working in my camera app on iOS 13.2b1. I can't figure out why isVirtualDeviceFusionSupported is always false on an iPhone 11 Pro, no matter which AVCaptureDevice I use, or other settings like photoQualityPrioritization (i.e. using .quality). Has anyone figured out what configuration in AVFoundation yields isVirtualDeviceFusionSupported = true? Which AVCaptureDevice did you use?
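For comparison, here is a sketch of the quality-prioritized configuration the post alludes to, built only from the public photoQualityPrioritization API; it is an assumption about what Deep Fusion-class processing requires, not a confirmed answer to the isVirtualDeviceFusionSupported question:

```swift
import AVFoundation

// Sketch: configure a session so the photo output is allowed to use
// .quality prioritization, the opt-in for Deep Fusion-class processing.
func configure(session: AVCaptureSession, photoOutput: AVCapturePhotoOutput) throws {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else { return }
    session.beginConfiguration()
    session.sessionPreset = .photo  // assumption: the .photo preset is required
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    // Must be set before capturing; bounds per-photo prioritization below.
    photoOutput.maxPhotoQualityPrioritization = .quality
    session.commitConfiguration()
}

func capture(with photoOutput: AVCapturePhotoOutput, delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    settings.photoQualityPrioritization = .quality
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```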
Post not yet marked as solved
0 Replies
686 Views
If I profile my app with Instruments on my iPad Air (iOS 12.4), choose the Leaks instrument or the Allocations instrument, and run it for a while, my app is killed due to memory pressure. I'm trying to find my bug (it's possibly something to do with looped playback of videos), but I'm stymied: "All Heap & Anonymous VM" does not grow. It stays nice and low (about 110 MB). Since the app keeps getting killed for memory pressure, I suppose I'm causing a leak outside of my app somehow (and I'm assuming this is my bug, as I'm not doing anything unusual). Where do I go from here? Can someone spot my faulty assumption or misunderstanding? Thanks.
Post not yet marked as solved
1 Reply
783 Views
Hi Apple / Quinn,

Our app uses CNCopyCurrentNetworkInfo to show the Wi-Fi SSID to the user, so they can determine "Yup, that's the Wi-Fi SSID of my camera, that's correct" or "Oh hey, I forgot to connect the Wi-Fi to the camera, no wonder it isn't working."

I understand CNCopyCurrentNetworkInfo is going to be changed in iOS 13; there was an email received today about this. I also understand that if my app requests location information from the user while using the app, and they accept, then CNCopyCurrentNetworkInfo will continue to function as expected in iOS 13.

Question: is it sufficient to call CLLocationManager's requestWhenInUseAuthorization to make CNCopyCurrentNetworkInfo work? Or is requestAlwaysAuthorization required?

Thanks,
-tim
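The "when in use" approach asked about above might be sketched as follows (an assumption about the iOS 13 behavior, not Apple's confirmed answer; class names are hypothetical):

```swift
import CoreLocation
import SystemConfiguration.CaptiveNetwork

// Sketch: request when-in-use location authorization, then read the
// SSID via CNCopyCurrentNetworkInfo across the supported interfaces.
final class SSIDReader: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func requestAuthorization() {
        manager.delegate = self
        // Assumption under test: when-in-use is sufficient on iOS 13,
        // with no need for requestAlwaysAuthorization.
        manager.requestWhenInUseAuthorization()
    }

    func currentSSID() -> String? {
        guard let interfaces = CNCopySupportedInterfaces() as? [String] else { return nil }
        for name in interfaces {
            if let info = CNCopyCurrentNetworkInfo(name as CFString) as? [String: AnyObject],
               let ssid = info[kCNNetworkInfoKeySSID as String] as? String {
                return ssid
            }
        }
        return nil  // no authorization, or no Wi-Fi connection
    }
}
```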