Severe lag adding PKStroke to PKDrawing
We've been using PencilKit in our app for a couple of years now, but have run into a significant slowdown recently. It coincided with the move to iPadOS 17, though that may be a coincidence. It seems to occur when we add PKStroke elements to a PKDrawing programmatically. Previously this refreshed on-screen instantly, but now it's so slow we can sometimes see the new stroke progressively drawing on-screen. I profiled the running app and the lag seems to occur entirely within the following call sequence:

2563 -[PKDrawingConcrete addNewStroke:]
2494 -[PKDrawing setNeedsRecognitionUpdate]
2434 -[NSUserDefaults(NSUserDefaults) boolForKey:]
2419 _CFPreferencesGetAppBooleanValueWithContainer
2417 _CFPreferencesCopyAppValueWithContainerAndConfiguration
2399 -[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:]
2395 -[_CFXPreferences withSearchListForIdentifier:container:cloudConfigurationURL:perform:]
2394 normalizeQuintuplet
2367 __108-[_CFXPreferences(SearchListAdditions) withSearchListForIdentifier:container:cloudConfigurationURL:perform:]_block_invoke
2338 __76-[_CFXPreferences copyAppValueForKey:identifier:container:configurationURL:]_block_invoke
2336 -[CFPrefsSource copyValueForKey:]
2332 -[CFPrefsSearchListSource alreadylocked_copyValueForKey:]
1924 -[CFPrefsSearchListSource alreadylocked_copyValueForKey:].cold.1
1921 _os_log_debug_impl
1918 _os_log
1914 _os_log_impl_flatten_and_send
1829 _os_log_impl_stream
1811 _os_activity_stream_reflect
1205 dispatch_block_perform
1199 _dispatch_block_invoke_direct
1195 _dispatch_client_callout
1194 ___os_activity_stream_reflect_block_invoke
1188 _xpc_pipe_simpleroutine
882 _xpc_send_serializer
867 _xpc_pipe_mach_msg
859 mach_msg
859 mach_msg_overwrite
855 mach_msg2_internal
853 0x1cf7701d8

Up to 'addNewStroke' this makes sense. But where is it going with 'PKDrawing setNeedsRecognitionUpdate'? Judging by the rest of the trace, it ends up reading NSUserDefaults and then streaming an os_log debug message over XPC, and ultimately seems to be hitting some kind of lock, which I assume is the cause of the lag. Any suggestions for why this is happening, or a means of avoiding it, would be much appreciated.
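For context, here's roughly how we add the strokes (a simplified sketch rather than our actual code; the single-point stroke geometry is a placeholder):

import PencilKit

func appendStroke(to canvasView: PKCanvasView) {
    // Build a placeholder one-point stroke.
    let point = PKStrokePoint(location: CGPoint(x: 100, y: 100),
                              timeOffset: 0,
                              size: CGSize(width: 3, height: 3),
                              opacity: 1, force: 1,
                              azimuth: 0, altitude: .pi / 2)
    let path = PKStrokePath(controlPoints: [point], creationDate: Date())
    let stroke = PKStroke(ink: PKInk(.pen, color: .black), path: path)

    // Mutate a copy of the drawing and reassign it; presumably this
    // reassignment is what reaches -[PKDrawingConcrete addNewStroke:] in the trace.
    var drawing = canvasView.drawing
    drawing.strokes.append(stroke)
    canvasView.drawing = drawing
}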
Replies: 0 · Boosts: 1 · Views: 546 · Activity: Oct ’23
Mixed-language package - how to import Swift header into Objective-C
I'm working on a package that requires mixed-language functionality: an Objective-C function calls a function in a Swift class. I've divided the language files into two targets, one for Swift and the other for Objective-C. The class called from Objective-C is declared with @objc, and I can see the correct header is emitted for this class. I attempt to import this header into the Objective-C code as noted in this article. However, when I do, Xcode keeps saying the file can't be found. I assume this is because it's in another target in the same package. If I manually copy the emitted header directly into the package directory, everything works correctly, but that isn't a sensible long-term solution. I assume there's something I need to do to make Xcode look in the correct directory for the emitted header - any suggestions/solutions?
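For reference, the package layout is roughly as follows (a simplified sketch; the target and product names here are placeholders, not our real ones):

// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "MixedPackage",
    products: [
        .library(name: "MixedPackage", targets: ["ObjCWrapper"]),
    ],
    targets: [
        // Swift target: contains the @objc class whose header is emitted.
        .target(name: "SwiftCore"),
        // Objective-C target: depends on the Swift target and tries to
        // #import "SwiftCore-Swift.h" from its .m files.
        .target(name: "ObjCWrapper", dependencies: ["SwiftCore"]),
    ]
)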
Replies: 1 · Boosts: 0 · Views: 1.7k · Activity: Nov ’22
Laggy PencilKit stroke response
We're noticing some odd behaviour using PencilKit in our app. Most of the time, the response is quite fluid. But at intervals (and depending on what kind of strokes we create), PencilKit freezes the whole app for up to several seconds. It comes back to life and continues as normal, but eventually repeats the behaviour. When it happens, it's usually accompanied by one or two lines of console output, e.g.:

2023-05-12 15:52:29.101996+0100 Spaces[51229:3635610] [] Did not have live interaction lock at end of stroke
2023-05-12 15:52:29.556467+0100 Spaces[51229:3630745] [] Drawing did change that is not in text.

I can't find any reference to these messages. They may have no bearing on the observed problem, but they might be a clue. Has anyone seen this behaviour, or have any information about what these messages mean?
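In case it helps anyone correlate the freezes, here is one way to timestamp the canvas delegate callbacks around a stroke (a minimal sketch, not code from our app):

import PencilKit

final class TimingDelegate: NSObject, PKCanvasViewDelegate {
    private var toolBegan: CFAbsoluteTime = 0

    // Called when the user starts drawing with the current tool.
    func canvasViewDidBeginUsingTool(_ canvasView: PKCanvasView) {
        toolBegan = CFAbsoluteTimeGetCurrent()
    }

    // Called when the drawing updates; an unusually long gap here
    // would line up with the multi-second freezes described above.
    func canvasViewDrawingDidChange(_ canvasView: PKCanvasView) {
        let ms = (CFAbsoluteTimeGetCurrent() - toolBegan) * 1000
        print("drawingDidChange \(ms) ms after tool began")
    }
}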
Replies: 1 · Boosts: 1 · Views: 916 · Activity: May ’23
Best substitute for ARGeoTracking
I've volunteered to help out on a community project to create an AR presentation of a local historic building (long gone, but the foundations remain). Ideally it would be available as a free app so visitors can see the site as it was in the past. The idea is that you can walk around and through the site, seeing the building as it once stood through a phone/tablet. As such, it needs to be able to place the model accurately at a specific geographic location and track the user's position, etc. ARKit looked like it already had the perfect geolocating solution (as described here: https://developer.apple.com/documentation/arkit/content_anchors/tracking_geographic_locations_in_ar), but unfortunately Apple hasn't photographed this area yet, so no site data can be downloaded from Apple's servers. I haven't seen any indication that this data can be supplied any other way, so I'm assuming that solution is a non-starter - if anyone knows otherwise, please let me know. Failing that, I'm looking for an alternative. Other parts of ARKit seem to rely on the user placing a model in context (rather than using a hard-wired location), but again I'd be happy to hear otherwise. I've also been looking at a solution from ArcGIS (https://developers.arcgis.com/swift/scenes-3d/display-scenes-in-augmented-reality/), specifically the 'World-scale' application. If anyone has experience with this, I'd appreciate any tips. Any other suggested solutions would be much appreciated.
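For anyone wanting to check coverage of their own site, geotracking availability can be queried in code before committing to that approach (a minimal sketch; the coordinate is a placeholder, not the real site):

import ARKit
import CoreLocation

let siteCoordinate = CLLocationCoordinate2D(latitude: 51.5, longitude: -0.1) // placeholder

// Asks Apple's servers whether geotracking imagery exists at this location.
ARGeoTrackingConfiguration.checkAvailability(at: siteCoordinate) { available, error in
    if available {
        // Coverage exists: ARGeoTrackingConfiguration could anchor the model here.
    } else {
        // No coverage, which is the situation described above.
        print("Geo tracking unavailable: \(error?.localizedDescription ?? "no data")")
    }
}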
Replies: 0 · Boosts: 1 · Views: 520 · Activity: Apr ’23