Posts

Post not yet marked as solved
1 Reply
419 Views
My app uses UICollectionView to display a list of documents. I previously implemented UICollectionViewDragDelegate to support drag-and-drop of documents to open new windows/window scenes. This worked fine in iPadOS 15, but when I upgraded to iPadOS 16 beta 4 it stopped working: I can still drag items, but they no longer open new windows (and the + icon never appears). How can I debug this?

I can open new windows in my app using the Dock, so in general multiple windows still work in my app. The sample code for multiple windows, Gallery, works great, so my iPad does support drag-and-drop multitasking. I can't figure out what is different between my app and Gallery.

This is the drag-and-drop code, which is very similar to the Gallery sample app. Are there other places in the app I should look to debug, like other delegate methods?

```swift
func dragInteraction(_ interaction: UIDragInteraction, itemsForBeginning session: UIDragSession) -> [UIDragItem] {
    let location = session.location(in: collectionView)
    guard let indexPath = collectionView.indexPathForItem(at: location) else {
        return []
    }
    let documentIndex = indexPath.row
    let documentInfo = documentsToDisplay[documentIndex]
    let itemProvider = NSItemProvider()
    // I've confirmed this NSUserActivity.activityType is registered in Info.plist NSUserActivityTypes
    itemProvider.registerObject(NSUserActivity.canvasUserActivity(documentId: documentInfo.documentId, accessLevel: documentInfo.accessLevel), visibility: .all)
    let dragItem = UIDragItem(itemProvider: itemProvider)
    dragItem.localObject = DocumentDragInfo(document: documentInfo, indexPath: indexPath)
    return [dragItem]
}
```
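One other place worth checking is the receiving side: when a drag spawns a new window scene, the NSUserActivity arrives in the scene delegate's `scene(_:willConnectTo:options:)`, and the system only offers a new window if the app declares support for the activity type. Below is a minimal sketch of that receiving side; `CanvasViewController` and `DocumentListViewController` are hypothetical placeholders for your own view controllers, not part of the original code.

```swift
import UIKit

// Sketch of the scene delegate that would receive the dragged activity.
// If this path never runs, the connection between the drag's NSUserActivity
// and the scene configuration is a likely place to debug.
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window = UIWindow(windowScene: windowScene)
        if let activity = connectionOptions.userActivities.first {
            // The drag created this scene; restore the dragged document.
            window.rootViewController = CanvasViewController(activity: activity)
        } else {
            window.rootViewController = DocumentListViewController()
        }
        window.makeKeyAndVisible()
        self.window = window
    }
}
```

Setting a breakpoint here (and in `application(_:configurationForConnecting:options:)`) can show whether the system is even attempting to create a scene for the drop.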
Posted by bridger.
Post not yet marked as solved
1 Reply
1.6k Views
Is there a Machine Learning API that can take handwriting (either as a bitmap or as a list of points) and convert it to text? I know Scribble can be used to allow handwriting input into text fields, but in this API it is Scribble which controls the rendering of the handwriting. Is there an API where my app can render the handwriting and get information about the text content? In the Keynote demo Craig was able to get text content from a photo of a whiteboard. Are there APIs which would allow an app developer to create something similar?
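The closest general-purpose API is the Vision framework's text recognition, which takes a bitmap rather than stroke points. Its accuracy on handwriting varies, but it is the same family of API behind recognizing text in photos. A minimal sketch, assuming the handwriting has already been rendered into a `UIImage`:

```swift
import UIKit
import Vision

// Sketch: run Vision text recognition on a bitmap of rendered handwriting.
// The app keeps control of rendering; Vision only reads the pixels.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the top candidate string for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

For point-based input, PencilKit provides the drawing and stroke data, but rasterizing the `PKDrawing` and feeding the bitmap through Vision as above is the available route for getting text out.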
Posted by bridger.
Post marked as solved
2 Replies
3.1k Views
I'm trying to support Scenes in my app. One library that I use has a UIWindow subclass for triggering debug gestures anywhere in the app. In my pre-Scenes app I had code like this to sometimes use that subclass:

```swift
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    ...
    let window: UIWindow
    if ScribbleTweaks.enabled {
        window = TweakWindow(frame: UIScreen.main.bounds, gestureType: .shake, tweakStore: ScribbleTweaks.defaultStore)
    } else {
        window = UIWindow(frame: UIScreen.main.bounds)
    }
    window.rootViewController = navController
    window.makeKeyAndVisible()
    ...
}
```

Is there any way to do something similar with a UIWindowSceneDelegate? I can't find any API for configuring the UIWindow class. As a follow-up, is it possible to use UIWindowScene without storyboards? Currently I don't use storyboards, and it would be nice to port over the same code for configuring the initial view controllers.
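For what it's worth, both things appear possible without storyboards: if the scene manifest omits a storyboard name, the scene delegate can construct any `UIWindow` subclass itself using `UIWindow(windowScene:)`. A sketch, assuming `TweakWindow` can be adapted to a scene-based initializer (the exact `TweakWindow` init shown is hypothetical):

```swift
import UIKit

// Sketch: building the window in code inside the scene delegate,
// mirroring the pre-Scenes didFinishLaunching logic from the question.
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window: UIWindow
        if ScribbleTweaks.enabled {
            // Hypothetical scene-based init; the library's actual API may differ.
            window = TweakWindow(windowScene: windowScene)
        } else {
            window = UIWindow(windowScene: windowScene)
        }
        window.rootViewController = navController // your existing root VC setup
        window.makeKeyAndVisible()
        self.window = window
    }
}
```

To drop the storyboard, leave `UISceneStoryboardFile` out of the scene configuration in Info.plist so nothing is instantiated before `willConnectTo` runs.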
Posted by bridger.
Post not yet marked as solved
0 Replies
497 Views
Our app uses the iAd framework to measure whether people are installing our app using Apple Search Ads. Is there documentation from Apple which explains how that attribution is done? What data does it collect? I'd like to use this information to complete the Privacy Nutrition Label.
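For context on what the app itself receives, the attribution payload comes from `ADClient`; inspecting it shows the campaign-level fields involved. A sketch of the request (the specific dictionary keys mentioned in the comment, such as "iad-attribution", are examples and should be verified against the payload you actually receive):

```swift
import iAd

// Sketch: request the Search Ads attribution payload from the iAd framework.
// The returned dictionary describes the campaign (e.g. "iad-attribution",
// "iad-campaign-id"), not the individual user.
func fetchSearchAdsAttribution() {
    ADClient.shared().requestAttributionDetails { details, error in
        if let error = error {
            print("Attribution unavailable: \(error)")
        } else if let details = details {
            print("Attribution details: \(details)")
        }
    }
}
```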
Posted by bridger.
Post not yet marked as solved
0 Replies
658 Views
Adding an App Clip has made build-and-run significantly slower, because it builds the App Clip even when I'm just working on the main app. Is there a way to configure Xcode so it doesn't build the App Clip except when using Archive? I tried this:

- Removed the App Clip from the list of dependencies in my app target
- In the "Embed App Clips" phase of my app target, selected "Copy only when installing"

Unfortunately, this didn't work. The App Clip is still built alongside the main app each time I run on the simulator or a device.
Posted by bridger.
Post not yet marked as solved
3 Replies
4.1k Views
The "Legal" label/button that shows in the bottom-left of the MKMapView is occluded by other content in my app. Previously I was able to move that label above my content by using the layoutMargins property, but that doesn't work in iOS 11. I've also tried directionalLayoutMargins and additionalSafeAreaInsets with no luck. Is there a way to do this in iOS 11? Here is the code which I've tried:

```swift
let occludedHeight = self.view.bounds.height - bottomCoveringView.frame.minY
let labelOffset = occludedHeight + 10
if #available(iOS 11, *) {
    mapView.directionalLayoutMargins = NSDirectionalEdgeInsets(top: 0, leading: 0, bottom: labelOffset, trailing: 0)
    self.view.directionalLayoutMargins = NSDirectionalEdgeInsets(top: 0, leading: 0, bottom: labelOffset, trailing: 0)
    self.additionalSafeAreaInsets = UIEdgeInsets(top: 0, left: 0, bottom: labelOffset, right: 0)
    mapView.preservesSuperviewLayoutMargins = true
} else {
    mapView.layoutMargins = UIEdgeInsets(top: 0, left: 0, bottom: labelOffset, right: 0)
}
```
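One variant worth testing: in iOS 11 a view's layoutMargins are derived from its safe area by default, which can silently override explicitly assigned margins. Opting the map view out via `insetsLayoutMarginsFromSafeArea` before setting the margin may let the manual offset take effect. A sketch, reusing `labelOffset` from the code above:

```swift
// Sketch: stop the map view from deriving its margins from the safe area,
// then set the explicit bottom margin so the "Legal" label moves up.
if #available(iOS 11, *) {
    mapView.insetsLayoutMarginsFromSafeArea = false
    mapView.layoutMargins = UIEdgeInsets(top: 0, left: 0, bottom: labelOffset, right: 0)
}
```

This is an untested suggestion based on how safe-area-derived margins interact with manual layoutMargins in iOS 11, not a confirmed fix.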
Posted by bridger.
Post not yet marked as solved
0 Replies
1.9k Views
I'm trying to use UIImage's new withTintColor method to update images for dark mode. So far, it only works when using UIButton, but not UIImageView nor CALayer. As far as I can tell, accessing the cgImage never returns a tinted image. I also can't find an API on UIImage to check whether it was created with a tint color. Is there an API I'm missing? This is what I've tried so far:

```swift
let dynamicColor = UIColor(dynamicProvider: { (traitCollection) -> UIColor in
    if traitCollection.userInterfaceStyle == .dark {
        return UIColor.white
    } else {
        return UIColor.black
    }
})
let pinkImage = #imageLiteral(resourceName: "PinkImage")
let hopefullyDynamicImage = pinkImage.withTintColor(dynamicColor, renderingMode: .alwaysOriginal)

// This UIButton properly shows a black or white image depending on dark mode. It automatically updates
someButton.setImage(hopefullyDynamicImage, for: .normal)

// This UIImageView shows either a white or black image, but it doesn't update when dark mode is switched on or off. It is stuck in time
someImageView.image = hopefullyDynamicImage

// This CALayer shows up as pink. The tint had no effect
someUIView.layer.contents = hopefullyDynamicImage.cgImage
```

I've also tried using UIImage.Configuration like this, but the cgImage still has no tint:

```swift
override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
    // This layer still shows no tint color. It is the original pink
    someUIView.layer.contents = hopefullyDynamicImage?.withConfiguration(self.traitCollection.imageConfiguration).cgImage
}
```
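A workaround sketch, reusing the names from the code above: instead of baking the tint into the image, use template rendering and put the dynamic color on the view's `tintColor`. UIImageView re-renders template images when its resolved tint color changes, so it follows dark mode automatically; for a bare CALayer there is no trait machinery, so the color has to be re-resolved by hand in `traitCollectionDidChange`.

```swift
// UIImageView: template image + dynamic tintColor updates with dark mode.
let templateImage = pinkImage.withRenderingMode(.alwaysTemplate)
someImageView.image = templateImage
someImageView.tintColor = dynamicColor

// CALayer: resolve the dynamic color for the current traits and re-tint
// whenever the trait collection changes.
override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
    super.traitCollectionDidChange(previousTraitCollection)
    let resolved = dynamicColor.resolvedColor(with: traitCollection)
    someUIView.layer.contents = pinkImage.withTintColor(resolved, renderingMode: .alwaysOriginal).cgImage
}
```

This sidesteps the question of whether cgImage can ever carry a dynamic tint; as far as I know it cannot, since CGImage has no concept of trait collections.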
Posted by bridger.
Post not yet marked as solved
1 Reply
1.6k Views
I'm trying to figure out the correct way to close the current UISceneSession. The requestSceneSessionDestruction API on UIApplication requires a UISceneSession object, which has been tricky to get. The close API also might return an NSError when closing the session, but I don't know what might cause that error or how an application should handle it.

For getting the current UISceneSession I could pass it along into every view controller and view which might need it, but that is cumbersome. This is what I've got instead. It seems odd and fragile. How can I make it more idiomatic?

```swift
extension UIApplication {
    @discardableResult
    func closeSceneFor(view: UIView) -> Bool {
        if #available(iOS 13.0, *) {
            if let window = view.window,
               let sceneSession = UIApplication.shared.sceneSessionFor(window: window) {
                let options = UIWindowSceneDestructionRequestOptions()
                options.windowDismissalAnimation = .standard
                // TODO: What might cause an error and how should it be handled?
                requestSceneSessionDestruction(sceneSession, options: options, errorHandler: nil)
                return true
            }
        }
        return false
    }

    @available(iOS 13.0, *)
    func sceneSessionFor(window: UIWindow) -> UISceneSession? {
        for sceneSession in self.openSessions {
            if let windowScene = sceneSession.scene as? UIWindowScene {
                for sceneWindow in windowScene.windows where sceneWindow == window {
                    return sceneSession
                }
            }
        }
        return nil
    }
}
```
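One simplification, assuming iOS 13: a `UIWindow` already exposes its scene via `windowScene`, and every `UIScene` exposes its `session`, so the search through `openSessions` can be dropped entirely. A sketch:

```swift
import UIKit

// Sketch: reach the session directly through the window's scene instead of
// scanning UIApplication.shared.openSessions.
@available(iOS 13.0, *)
func closeScene(for view: UIView) {
    guard let session = view.window?.windowScene?.session else { return }
    let options = UIWindowSceneDestructionRequestOptions()
    options.windowDismissalAnimation = .standard
    UIApplication.shared.requestSceneSessionDestruction(session, options: options, errorHandler: nil)
}
```

On the error question: the documentation doesn't enumerate causes, so passing `nil` (or logging in the errorHandler) is a reasonable default until a concrete failure shows up.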
Posted by bridger.