Posts

Post not yet marked as solved
0 Replies
477 Views
I have an older app that is a mix of Swift and Objective-C. I have two groups of storyboards, one for iPhone and one for iPad, connected using storyboard references. There seems to be a bug: when running in the visionOS simulator, the app loads the storyboard specified by the key "Main storyboard file base name" instead of the one specified by "Main storyboard file base name (iPad)". If I change the first key to point at the iPad storyboard, the app then works as expected in the visionOS simulator. The raw keys are UIMainStoryboardFile and UIMainStoryboardFile~ipad. What should I do?
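For reference, the relevant Info.plist entries look roughly like this (a simplified sketch; the storyboard names Main_iPhone and Main_iPad are placeholders, not the app's real ones):

<!-- Base storyboard, and the one the visionOS simulator actually loads -->
<key>UIMainStoryboardFile</key>
<string>Main_iPhone</string>

<!-- iPad-idiom override, which I would expect the visionOS simulator to honor -->
<key>UIMainStoryboardFile~ipad</key>
<string>Main_iPad</string>

My understanding is that the ~ipad suffix should be honored on any device running with the iPad idiom, which is why this looks like a bug to me rather than expected behavior.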
Posted by Miro.
Post not yet marked as solved
2 Replies
1.3k Views
I am working on a proof-of-concept app that loads some 3D shapes and displays them on an Apple Watch using SceneKit. I am using an Apple Watch Series 1 running watchOS 4 beta 3 and get the following error at run time:

failed to unarchive scene at file: (omitted) food.scnassets/burger.scn (*** -[SCNKeyedUnarchiver decodeInt32ForKey:]: value (9223372036854775807) for key (primitiveRangeLocation) too large to fit in 32-bit integer)

This also happens in the simulator. It does not happen when I use Xcode 8.3.3 and a watch simulator for watchOS 3.2 with the same project, and I can also load the SCN file on an iPhone running the iOS 11 beta. Is anyone else having this problem? I should add that I am creating the SCN file by converting a DAE file within Xcode.
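The loading code itself is minimal; it is essentially the following (a simplified sketch of my setup, with the controller and outlet names as placeholders):

import SceneKit
import WatchKit

class BurgerInterfaceController: WKInterfaceController {
    @IBOutlet weak var sceneInterface: WKInterfaceSCNScene!

    override func awake(withContext context: Any?) {
        super.awake(withContext: context)
        // SCNScene(named:) unarchives the .scn file from the bundle; this
        // call is where the SCNKeyedUnarchiver error above is raised on
        // watchOS 4 beta 3.
        if let scene = SCNScene(named: "food.scnassets/burger.scn") {
            sceneInterface.scene = scene
        }
    }
}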
Posted by Miro.
Post marked as solved
1 Reply
821 Views
I have been reading up on creating Metal filters and how they can be used with SCNTechnique to apply a post-processing effect in ARKit, and I can get the basic effects working by following other examples. I would like to use filters that FlexMonkey made some years ago; as far as I can tell they are written as C-like Core Image kernel programs. For example: https://github.com/FlexMonkey/Filterpedia/blob/master/Filterpedia/customFilters/TransverseChromaticAberration.swift https://github.com/FlexMonkey/Filterpedia Is there a way to adapt such a filter to be used with a Metal shader? I am quite willing to watch any relevant WWDC videos to learn how to do this. Ideally, I want to apply a chromatic aberration effect that shifts the RGB color positions using a time uniform. For reference: https://en.wikipedia.org/wiki/Chromatic_aberration
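To make the goal concrete, here is roughly the SCNTechnique setup I have in mind (a sketch; the pass name, the u_time symbol, and the Metal function names are placeholders, and the two shader functions are assumed to exist in the app's default Metal library):

import SceneKit

// One full-screen pass that redraws the rendered frame through a custom
// Metal fragment shader.
func makeAberrationTechnique() -> SCNTechnique? {
    let passDefinition: [String: Any] = [
        "draw": "DRAW_QUAD",
        // COLOR feeds the previously rendered frame into the pass.
        "inputs": ["colorSampler": "COLOR", "u_time": "u_time"],
        "outputs": ["color": "COLOR"],
        "metalVertexShader": "aberration_vertex",
        "metalFragmentShader": "aberration_fragment"
    ]
    return SCNTechnique(dictionary: [
        "passes": ["aberration": passDefinition],
        "sequence": ["aberration"],
        "symbols": ["u_time": ["type": "float"]]
    ])
}

// Install on the view; ARSCNView is an SCNView subclass, so this covers ARKit:
// sceneView.technique = makeAberrationTechnique()

// From SCNSceneRendererDelegate's renderer(_:updateAtTime:), push the time
// into the technique so the shader can animate the per-channel offsets.
func updateAberration(on sceneView: SCNView, time: TimeInterval) {
    sceneView.technique?.setObject(NSNumber(value: Float(time)),
                                   forKeyedSubscript: "u_time" as NSCopying)
}

The actual color shifting would then live in the aberration_fragment function, which would sample colorSampler at slightly different texture coordinates per channel, with the offsets derived from u_time.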
Posted by Miro.
Post marked as solved
1 Reply
666 Views
I started making a project in Reality Composer on macOS Catalina beta 8 that uses an asset which does not seem to be available in the iOS 13 TestFlight build, v1.0 (104.2): the record player in the Art/Music category. Might there be some way to include the model resource in the project so that it will appear on iOS?
Posted by Miro.
Post not yet marked as solved
2 Replies
794 Views
I am having no end of problems with Catalina beta 8; they seem to be mostly Finder-related, and I cannot even use the feedback reporter to report them because it sits in an "Application Not Responding" state. I was trying to move a file from my desktop folder to another folder on the desktop, and it got stuck displaying a "Preparing to move..." message in the file-move dialog box. The stalled move slowed Finder down so much that it was almost unusable; I had to relaunch Finder to be able to do anything else that requires it.
Posted by Miro.
Post not yet marked as solved
3 Replies
1.6k Views
We are working on a macOS app that is based on Apple's face detection iOS sample code. We are experiencing a frequent crash on macOS Mojave (10.14.6), but so far it is not happening on Catalina (beta 7):

Crashed Thread: 14 Dispatch queue: com.apple.root.default-qos
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000111013000
Exception Note: EXC_CORPSE_NOTIFY
Termination Signal: Segmentation fault: 11
Termination Reason: Namespace SIGNAL, Code 0xb
Terminating Process: exc handler [3453]

Thread 14 Crashed:: Dispatch queue: com.apple.root.default-qos
0 com.apple.vImage 0x00007fff28ab4f01 vConvert_ARGB8888To420Yp8_CbCr8_avx2 + 1489
1 com.apple.vImage 0x00007fff28927aee Do_vImageConvert_ARGB8888To420Yp8_CbCr8 + 174
2 libdispatch.dylib 0x00007fff58117672 _dispatch_client_callout2 + 8
3 libdispatch.dylib 0x00007fff58126f95 _dispatch_apply_invoke + 157
4 libdispatch.dylib 0x00007fff5811763d _dispatch_client_callout + 8
5 libdispatch.dylib 0x00007fff58125509 _dispatch_root_queue_drain + 657
6 libdispatch.dylib 0x00007fff58125b46 _dispatch_worker_thread2 + 90
7 libsystem_pthread.dylib 0x00007fff583576b3 _pthread_wqthread + 583
8 libsystem_pthread.dylib 0x00007fff583573fd start_wqthread + 13

Another crash happens in Core Animation after about two hours of running the app, on both operating systems:

Process: Face Detector [12828]
Path: /Users/USER/*/Face Detector.app/Contents/MacOS/Face Detector
Identifier: Face Detector
Version: 0.6 (65)
Code Type: X86-64 (Native)
Parent Process: ??? [1]
Responsible: Face Detector [12828]
User ID: 501
Date/Time: 2019-09-03 11:34:05.305 -0700
OS Version: Mac OS X 10.15 (19A546d)
Report Version: 12
Bridge OS Version: 3.0 (14Y905)
Anonymous UUID: 0C8321C5-745D-8A3A-1BF1-710CB74E8A04
Time Awake Since Boot: 7200 seconds

Thread 6 Crashed:: com.apple.coremedia.imagequeue.coreanimation.common
0 libobjc.A.dylib 0x00007fff6f44914c realizeClassWithoutSwift(objc_class*, objc_class*) + 168
1 libobjc.A.dylib 0x00007fff6f448fdc realizeClassMaybeSwiftMaybeRelock(objc_class*, mutex_tt<false>&, bool) + 301
2 libobjc.A.dylib 0x00007fff6f43ab1e lookUpImpOrForward + 715
3 libobjc.A.dylib 0x00007fff6f43a3d9 _objc_msgSend_uncached + 73
4 com.apple.CoreFoundation 0x00007fff39804163 __NSArrayM_new + 46
5 com.apple.MediaToolbox 0x00007fff3e67d05c 0x7fff3e380000 + 3133532
6 com.apple.CoreMedia 0x00007fff3a91db48 figThreadMain + 276
7 libsystem_pthread.dylib 0x00007fff709aed76 _pthread_start + 125
8 libsystem_pthread.dylib 0x00007fff709ab5d3 thread_start + 15

The first crash is the more critical one to resolve. Does it indicate an underlying problem in the Vision framework on Mojave, or is it something to do with how the background queues are being used?

Some additional context: the application converts the video preview stream to images that get uploaded to Firebase when a face has been detected. The CALayers for the tracking overlays are composited into a new image with the video stream; I don't think that is related to the crash, though.
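In sketch form, the frame path looks like this (simplified; the class and property names are placeholders for what the app actually does):

import AVFoundation
import CoreImage

// Each preview frame is turned into a CGImage that can be composited with
// the overlay layers and uploaded when a face has been detected.
final class FrameConverter: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let ciContext = CIContext()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return }
        // ...composite the tracking-overlay CALayers over cgImage and
        // upload the result to Firebase...
        _ = cgImage
    }
}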
Posted by Miro.