
UITableView: Adjust destination index path when dragging and dropping from another view
When reordering items within a table view, I can use tableView:targetIndexPathForMoveFromRowAtIndexPath:toProposedIndexPath: to adjust the destination index path. Is there a way to adjust the destination index path when dropping onto a table view if the drag was initiated in a different view? For example, if I'm dropping an object into a table view where the rows are sorted alphabetically, I want the gap to appear in the location where the item is going to end up, regardless of which row the item is hovering over while being dragged.
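For context, here's a rough sketch of the drop handling I'm describing. The names here (AlphabeticalDropDelegate, sortedTitles) are placeholders, not our real code; the point is that performDrop can compute its own alphabetical destination, but I don't see a delegate hook that moves the gap shown during the drag, since dropSessionDidUpdate only returns a proposal, not an index path.

import UIKit

final class AlphabeticalDropDelegate: NSObject, UITableViewDropDelegate {
    // Placeholder model: rows backed by an alphabetically sorted array of strings.
    // Assumes this object (or its owner) also backs the table's data source.
    var sortedTitles: [String] = []

    func tableView(_ tableView: UITableView,
                   dropSessionDidUpdate session: UIDropSession,
                   withDestinationIndexPath destinationIndexPath: IndexPath?) -> UITableViewDropProposal {
        // This only lets me choose the operation and intent; the gap still
        // follows the proposed destination index path.
        return UITableViewDropProposal(operation: .copy, intent: .insertAtDestinationIndexPath)
    }

    func tableView(_ tableView: UITableView, performDropWith coordinator: UITableViewDropCoordinator) {
        for item in coordinator.items {
            item.dragItem.itemProvider.loadObject(ofClass: NSString.self) { object, _ in
                guard let title = object as? String else { return }
                DispatchQueue.main.async {
                    // Ignore coordinator.destinationIndexPath and insert at the
                    // alphabetically correct row instead.
                    let row = self.sortedTitles.firstIndex { $0 > title } ?? self.sortedTitles.count
                    self.sortedTitles.insert(title, at: row)
                    tableView.insertRows(at: [IndexPath(row: row, section: 0)], with: .automatic)
                }
            }
        }
    }
}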
Replies: 1 · Boosts: 0 · Views: 234 · Last activity: Aug ’24
Regression: Autocorrect suggestions sometimes eat touch events
If Autocorrect is showing a potential correction, sometimes tapping elsewhere in the UI causes the suggestion to be accepted, but the touch event is not passed to the tapped view. A screen recording from a sample app demonstrating the issue is available here. I first type "Coffee", which does not show an autocorrect suggestion, and then I tap on a table row, which successfully shows an alert. I then type "Coffeesc", which shows an autocorrect suggestion. When I then tap on one of the table rows, the autocorrect suggestion is accepted, but our implementation of tableView:didSelectRowAtIndexPath: is not called and no alert is shown.

The issue began in iOS 17.1 and only occurred when the Predictive Text switch in Settings -> General -> Keyboard was OFF. However, the issue still occurs in iOS 18 beta 3, and now I'm able to reproduce it even when the Predictive Text switch is ON.

I filed FB13418977 about the issue back in November (including the sample project shown in the video that reproduces the issue), but there have been no comments from Apple in the bug report. I initiated a code-level support request in April, but was told DTS did not have a workaround and that I should continue to follow the bug report for updates.

This bug impacts the primary text field in our app, but since Predictive Text is enabled by default, it hasn't impacted a large number of our customers on iOS 17. For users who were impacted, we were able to tell them to enable Predictive Text as a workaround until the bug is fixed. With the issue occurring even when Predictive Text is ON in the iOS 18 betas, the text field is now nearly impossible to use. The only workaround I can think of is turning off autocorrect for that text field (see the sketch below), but that isn't a great solution, as it leads to a significant increase in typos when the user types multiple words.

We would desperately like for this to be fixed in iOS 18, but if someone at Apple could please look at FB13418977 and at least provide an update or a potential workaround, it would be greatly appreciated. Thank you, and please let me know if you need any additional information.
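For completeness, the per-field workaround I mentioned amounts to this (wrapped in a helper purely for illustration; textField stands in for our actual field):

import UIKit

func applyAutocorrectWorkaround(to textField: UITextField) {
    // Disabling autocorrection sidesteps the eaten-touch problem, but it
    // noticeably increases typos when the user types multiple words.
    textField.autocorrectionType = .no
}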
Replies: 1 · Boosts: 0 · Views: 312 · Last activity: Jul ’24
Cannot install watch app on watches running watchOS 7 & 8 after adding WidgetKit extension
In our most recent app update, we added a WidgetKit extension to our watch app. The WidgetKit extension has a minimum deployment target of watchOS 9, but the watch app has a minimum deployment target of watchOS 7. After releasing this update, we discovered that customers running watchOS 7 and 8 can no longer install the app on their watch. When they tap the Install button in the Watch app on the phone, it spins for a while, then stops and reverts to saying Install; no error message is displayed. If I build and run the iOS app through Xcode, I'm able to install the app on a watch running watchOS 8.8.1. However, I'm unable to install the app on the watch from TestFlight or the App Store. Has anyone else run into this, or have any thoughts on what project settings we should look at to fix the issue?
Replies: 4 · Boosts: 1 · Views: 909 · Last activity: Oct ’23
iOS 16.4: Siri Shortcut works within Shortcuts app, but not when activated by voice
My app has a Siri Shortcut with a single parameter that Siri asks for each time the shortcut is run (built using an .intentdefinition file). The shortcut works fine when I execute it from the Shortcuts app on my phone, but beginning with iOS 16.4, when I activate it by voice, Siri prompts me for the parameter and, after I respond, says: "Sorry, could you say that again?". If I repeat the input, it just does a web search. This issue has been reported to us by several users.

I was able to reproduce the issue with the build of our app from the App Store (built with Xcode 14.2) and with a local build of the app created with Xcode 14.3. When I run the intent in the debugger, the resolution method gets called with an empty array for the parameter, and we return [INStringResolutionResult needsValue]. Siri then asks for the input and, after I respond, says "Sorry, could you say that again?" and the line Program ended with exit code: 15 is printed in the console.

Is anyone else running into this? Is there anything else I can do to debug this?
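For reference, our resolution code is essentially the standard pattern sketched below. The intent and parameter names are placeholders (the real intent and its parameter are defined in the .intentdefinition file), and the types are simplified for the sketch:

import Intents

// Hypothetical stand-in for the class Xcode generates from the .intentdefinition file.
final class LookupItemIntent: INIntent {
    var itemName: String?
}

final class LookupItemIntentHandler: NSObject {
    // Mirrors the generated resolve method for the single string parameter.
    func resolveItemName(for intent: LookupItemIntent,
                         with completion: @escaping (INStringResolutionResult) -> Void) {
        if let name = intent.itemName, !name.isEmpty {
            completion(.success(with: name))
        } else {
            // This is the path we hit by voice on iOS 16.4: Siri prompts for the
            // value, then responds with "Sorry, could you say that again?".
            completion(.needsValue())
        }
    }
}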
Replies: 1 · Boosts: 0 · Views: 1.3k · Last activity: Apr ’23
UICollectionViewCompositionalLayout: How do I create an item in a UICollectionView that fills the UICollectionView's safe area?
I have a UICollectionViewController inside of a UINavigationController, and when the collection view does not contain any data, I want a single item that fills the visible area of the collection view. I'm using UICollectionViewCompositionalLayout and thought I would be able to do this with a group/item size that has a fractionalWidth and fractionalHeight of 1.0, like this:

let itemSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1.0), heightDimension: .fractionalHeight(1.0))
let item = NSCollectionLayoutItem(layoutSize: itemSize)
let groupSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1.0), heightDimension: .fractionalHeight(1.0))
let group = NSCollectionLayoutGroup.horizontal(layoutSize: groupSize, subitems: [item])
let section = NSCollectionLayoutSection(group: group)
let layout = UICollectionViewCompositionalLayout(section: section)

However, this does not respect the collection view's safe area insets, so the item is too tall and ends up scrolling. Is there a way to have the fractionalHeight take the collection view's safe area into account, or some other way to have a single item that fills the collection view's safe area?
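A sketch of one possible direction (not something I've verified): use the section-provider initializer and size the group from the layout environment instead of a fractional height. I don't know whether effectiveContentSize is guaranteed to exclude the safe area under every inset configuration, which is part of why I'm asking:

import UIKit

func makeFullHeightLayout() -> UICollectionViewLayout {
    return UICollectionViewCompositionalLayout { _, layoutEnvironment in
        // Size the single item from the environment rather than relying on
        // fractionalHeight(1.0), which ignores the safe area.
        let height = layoutEnvironment.container.effectiveContentSize.height
        let itemSize = NSCollectionLayoutSize(widthDimension: .fractionalWidth(1.0),
                                              heightDimension: .absolute(height))
        let item = NSCollectionLayoutItem(layoutSize: itemSize)
        let group = NSCollectionLayoutGroup.horizontal(layoutSize: itemSize, subitems: [item])
        return NSCollectionLayoutSection(group: group)
    }
}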
Replies: 1 · Boosts: 0 · Views: 755 · Last activity: Jan ’23
watchOS: Siri Shortcut works within Shortcuts app, but not when activated by voice
My app has a Siri Shortcut that works fine when I execute it from the Shortcuts app on my watch, but if I try to activate it by voice, Siri says: "Sorry, something's wrong. Please try again." and then a second later says "We've had a problem. Please try again." I'm currently on watchOS 8 beta 6. I last tested this in early July with beta 2 or beta 3 and it was working at that time. I haven't made any changes to the intent extension since then. The shortcut activates fine by voice on my phone, running iOS 15 beta 6. I'm also able to execute other Shortcuts by voice on my watch. Has anyone else run into this before? Is there a way to debug the intent extension on the watch, so I can see if any of my intent handler code is even being called?
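In case it helps, one minimal way to check whether the handler is being launched at all is to log from the INExtension entry point and watch the paired watch's logs in Console.app. A sketch (the subsystem string and class name are placeholders):

import Intents
import os

final class IntentHandler: INExtension {
    private let logger = Logger(subsystem: "com.example.watchapp.intents", category: "siri")

    override func handler(for intent: INIntent) -> Any {
        // If this never appears in Console.app with the watch selected as the
        // device, the intent extension is presumably not being launched at all.
        logger.info("handler(for:) called for \(String(describing: type(of: intent)), privacy: .public)")
        return self
    }
}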
Replies: 6 · Boosts: 0 · Views: 2.7k · Last activity: Aug ’21
Restoring the detail view for an offscreen row in a List with SwiftUI's state restoration
I have a NavigationView-based app and I've implemented support for SwiftUI's state restoration using @SceneStorage to remember which detail view the user has open, as described in Apple's documentation. It's working correctly if the relevant row in the List is visible when I launch the app. However, if the relevant row is initially scrolled offscreen, the detail view doesn't get pushed until I scroll down so that the row is onscreen. I see the same behavior in Apple's sample app, which uses a LazyVGrid (I had to add some additional data to the products.json file so that the main content view had to scroll). Restoring the detail view when the row is offscreen does work in both my app and in Apple's sample app if I replace the List / LazyVGrid with a plain VStack. Is this a known limitation of SwiftUI's state restoration with List and the lazy stack / grid views?
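For reference, the restoration pattern I'm using is essentially the one from the documentation, roughly like the sketch below (Product and the "selectedProductID" key are placeholders for my actual model):

import SwiftUI

struct Product: Identifiable {
    let id: String
    let name: String
}

struct ContentView: View {
    let products: [Product]

    // Persisted per scene and restored on the next launch.
    @SceneStorage("selectedProductID") private var selectedProductID: String?

    var body: some View {
        NavigationView {
            List(products) { product in
                // The restored selection only triggers a push once this row is
                // actually created, which doesn't happen while it's offscreen.
                NavigationLink(destination: Text(product.name).font(.largeTitle),
                               tag: product.id,
                               selection: $selectedProductID) {
                    Text(product.name)
                }
            }
            .navigationTitle("Products")
        }
    }
}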
Replies: 0 · Boosts: 1 · Views: 730 · Last activity: Jun ’21