I'd like to receive files in my app, but only certain UIScenes can receive a file. How do I tell the system which of the UIScenes in my app can receive that file? How does the system pick a scene?
I've tried using UISceneActivationConditions(). Here is what I put in the scenes that can receive files:
var preferredActivatePredicates = [NSPredicate]()
var canActivatePredicates = [NSPredicate]()
// Any scene *can* be activated at any time
canActivatePredicates.append(NSPredicate(value: true))
if hasWriteAccess {
    // We are also the preferred scene for receiving files
    for fileExtension in ["png", "jpeg", "jpg", "pdf"] {
        preferredActivatePredicates.append(NSPredicate(format: "SELF ENDSWITH[c] '.%@'", fileExtension))
    }
}
let conditions = UISceneActivationConditions()
conditions.canActivateForTargetContentIdentifierPredicate = NSCompoundPredicate(orPredicateWithSubpredicates: canActivatePredicates)
conditions.prefersToActivateForTargetContentIdentifierPredicate = NSCompoundPredicate(orPredicateWithSubpredicates: preferredActivatePredicates)
scene.activationConditions = conditions
Unfortunately, this does not seem to have any effect. The system never activates one of the hasWriteAccess scenes when func scene(_ scene: UIScene, openURLContexts URLContexts: Set<UIOpenURLContext>) is called.
Is there any way to tell what content identifier the system is using when a file share activates an app?
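For reference, this is the only visibility I have into what arrives right now (a minimal logging sketch in my scene delegate; it shows the incoming URLs, but not whatever content identifier the system evaluates the predicates against):

func scene(_ scene: UIScene, openURLContexts URLContexts: Set<UIOpenURLContext>) {
    for context in URLContexts {
        // Logs which scene the system chose and which URL it delivered.
        print("Scene \(scene.session.persistentIdentifier) received: \(context.url.absoluteString)")
    }
}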
Thank you for your help!
My app uses UICollectionView to display a list of documents. I previously implemented UICollectionViewDragDelegate to support drag-and-drop of the documents to open new windows / window scenes. This was working fine in iPadOS 15, but when I upgraded to iPadOS 16 beta 4 it stopped working. I can still drag items, but they don't open new windows (and the + icon never appears). How can I debug this?
I can open new windows in my app using the Dock, so in general multiple windows still work in my app. The sample code for multiple windows, Gallery, works great so my iPad does support drag-and-drop multitasking. I can't figure out what is different between my app and Gallery...
This is the drag-and-drop code, which is very similar to the Gallery sample app. Are there other places in the app I should look to debug, like other delegate methods?
func dragInteraction(_ interaction: UIDragInteraction, itemsForBeginning session: UIDragSession) -> [UIDragItem] {
    let location = session.location(in: collectionView)
    guard let indexPath = collectionView.indexPathForItem(at: location) else {
        return []
    }

    let documentIndex = indexPath.row
    let documentInfo = documentsToDisplay[documentIndex]

    let itemProvider = NSItemProvider()
    // I've confirmed this NSUserActivity.activityType is registered in Info.plist NSUserActivityTypes
    itemProvider.registerObject(NSUserActivity.canvasUserActivity(documentId: documentInfo.documentId, accessLevel: documentInfo.accessLevel), visibility: .all)

    let dragItem = UIDragItem(itemProvider: itemProvider)
    dragItem.localObject = DocumentDragInfo(document: documentInfo, indexPath: indexPath)
    return [dragItem]
}
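For completeness, here is roughly what that user-activity helper does (a simplified sketch; canvasUserActivity, AccessLevel, and the activity type string are my own names/placeholders):

extension NSUserActivity {
    // Sketch of my helper; the real one also sets a title and other metadata.
    static func canvasUserActivity(documentId: String, accessLevel: AccessLevel) -> NSUserActivity {
        // Placeholder activity type; the real one is registered under NSUserActivityTypes in Info.plist.
        let activity = NSUserActivity(activityType: "com.example.canvas.open")
        activity.targetContentIdentifier = documentId
        activity.userInfo = ["documentId": documentId, "accessLevel": accessLevel.rawValue]
        return activity
    }
}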
Is there a Machine Learning API that can take handwriting (either as a bitmap or as a list of points) and convert it to text?
I know Scribble can be used to allow handwriting input into text fields, but in this API it is Scribble which controls the rendering of the handwriting. Is there an API where my app can render the handwriting and get information about the text content?
In the Keynote demo Craig was able to get text content from a photo of a whiteboard. Are there APIs which would allow an app developer to create something similar?
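To make it concrete, this is roughly the shape of API I'm hoping exists, sketched here with Vision's text recognition (I don't know whether VNRecognizeTextRequest handles handwriting well or accepts stroke input; rendering to a bitmap first is an assumption on my part):

import UIKit
import Vision

func recognizeHandwriting(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { completion([]); return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Take the most confident transcription for each detected line of text.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}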
Our app uses the iAd framework to measure whether people are installing our app using Apple Search Ads.
Is there documentation from Apple which explains how that attribution is done? What data does it collect?
I'd like to use this information to complete the Privacy Nutrition Label.
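For context, this is the only call we make into iAd (sketched from memory; I'm not asserting the exact contents of the attribution dictionary):

import iAd

ADClient.shared().requestAttributionDetails { attributionDetails, error in
    // attributionDetails describes the Search Ads attribution, or is nil with
    // an error (for example when Limit Ad Tracking is enabled).
    if let details = attributionDetails {
        print("Search Ads attribution:", details)
    } else if let error = error {
        print("Attribution lookup failed:", error)
    }
}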
Adding an App Clip has made build-and-run significantly slower, because it builds the App Clip even when I'm just working on the main app.
Is there a way to configure Xcode so it doesn't build the App Clip except when using 'Archive'?
Here is what I tried:
Removed the App Clip from the list of dependencies in my app target.
In the 'Embed App Clips' build phase of my app target, I selected 'Copy only when installing'.
Unfortunately, this didn't work. The App Clip is still built alongside the main app each time I run on the simulator or device.
I'm trying to support Scenes in my app. One library that I use has a UIWindow subclass for triggering debug gestures anywhere in the app. In my pre-Scenes app I had code like this to sometimes use that subclass:

func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    ...
    let window: UIWindow
    if ScribbleTweaks.enabled {
        window = TweakWindow(frame: UIScreen.main.bounds, gestureType: .shake, tweakStore: ScribbleTweaks.defaultStore)
    } else {
        window = UIWindow(frame: UIScreen.main.bounds)
    }
    window.rootViewController = navController
    window.makeKeyAndVisible()
    ...
}

Is there any way to do something similar with a UIWindowSceneDelegate? I can't find any API for configuring the UIWindow class.

As a follow-up, is it possible to use UIWindowScene without storyboards? Currently I don't use storyboards and it would be nice to port over the same code for configuring the initial view controllers.
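For what it's worth, this is the kind of thing I was hoping to write in the scene delegate (a sketch only; I'm assuming TweakWindow keeps its frame-based initializer and can be attached to the scene afterwards, which I haven't verified):

func scene(_ scene: UIScene, willConnectTo session: UISceneSession, options connectionOptions: UIScene.ConnectionOptions) {
    guard let windowScene = scene as? UIWindowScene else { return }

    let window: UIWindow
    if ScribbleTweaks.enabled {
        // Assumption: create the TweakWindow frame-first, then attach it to the
        // scene by assigning windowScene.
        let tweakWindow = TweakWindow(frame: windowScene.coordinateSpace.bounds, gestureType: .shake, tweakStore: ScribbleTweaks.defaultStore)
        tweakWindow.windowScene = windowScene
        window = tweakWindow
    } else {
        window = UIWindow(windowScene: windowScene)
    }
    window.rootViewController = navController
    window.makeKeyAndVisible()
    self.window = window
}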
The "Legal" label/button that shows in the bottom-left of the MKMapView is occluded by other content in my app. Previously I was able to move that label above my content by using the layoutMargins property. That doesn't work in iOS 11. I've also tried directionalLayoutMargins and additionalSafeAreaInsets with no luck. Is there a way to do this in iOS 11?Here is the code which I've tried:let occludedHeight = self.view.bounds.height - bottomCoveringView.frame.minY
let labelOffset = occludedHeight + 10
if #available(iOS 11, *) {
mapView.directionalLayoutMargins = NSDirectionalEdgeInsets(top: 0, leading: 0, bottom: labelOffset, trailing: 0)
self.view.directionalLayoutMargins = NSDirectionalEdgeInsets(top: 0, leading: 0, bottom: labelOffset, trailing: 0)
self.additionalSafeAreaInsets = UIEdgeInsets(top: 0, left: 0, bottom: labelOffset, right: 0)
mapView.preservesSuperviewLayoutMargins = true
} else {
mapView.layoutMargins = UIEdgeInsets(top: 0, left: 0, bottom: labelOffset, right: 0)
}
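In case it helps diagnose this, here is the logging I added to see what the map view actually resolves after layout (just a debugging sketch):

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    if #available(iOS 11, *) {
        // Compare what the map view actually resolved against the values set above.
        print("mapView.layoutMargins:", mapView.layoutMargins)
        print("mapView.safeAreaInsets:", mapView.safeAreaInsets)
        print("view.safeAreaInsets:", view.safeAreaInsets)
    }
}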