Now that iOS 18.2 is out, and following the documentation and the WWDC example (limited to iOS 18.2+), I am attempting to use @AssistantIntent(schema: .system.search) alongside an AppIntent.
Questions:
Has anyone gotten this to work on a real device?
In my case (code below), when I run the intent from Shortcuts or Siri, it does NOT open the app; it only calls the perform method, and the app is not foregrounded. Changing openAppWhenRun has no effect. Strangely, if my app was backgrounded before the invocation and I foreground it manually afterwards, it has navigated to Search; it just was not brought to the foreground.
Am I doing anything wrong? (Adding @Parameter etc. doesn't change anything.)
Where is the intelligence here? The criteria parameter canNOT be used in the Siri phrase: you get a build error if you try, since only an AppEntity/AppEnum is permitted as a variable in a Siri phrase, not a StringSearchCriteria.
Put differently: what is the gain of @AssistantIntent(schema: .system.search) over a regular AppIntent in this case?
Some code:
@available(iOS 18.2, *)
@AssistantIntent(schema: .system.search)
struct MySearchIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    static let openAppWhenRun = true

    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        NavigationHandler().to(.search(.init(query: criteria.term)), from: .siri)
        return .result()
    }
}
Along with this AppShortcut in my AppShortcutsProvider:
AppShortcut(
    intent: MySearchIntent(),
    phrases: [
        "Search \(.applicationName)"
    ],
    shortTitle: "Search",
    systemImageName: "magnifyingglass"
)
Preparing an iPad app for a native macOS experience using Mac Catalyst (and AppKit where needed), built with SwiftUI.
We observe that horizontal scrolling is tricky with a mouse. Most users have the habit of click-and-drag to scroll horizontally, but this does not seem to be the default behavior of ScrollView.
Strangely, the same app scrolls horizontally with click-drag when built as "Mac (Designed for iPad)" (Apple silicon only), but not when built with Mac Catalyst.
Question: how can we enable general click-drag horizontal scrolling in SwiftUI on Mac Catalyst? I wouldn't mind dropping down to AppKit if needed.
Thanks in advance.
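To make the question concrete, this is the kind of workaround I have been experimenting with (a sketch only, not verified on Catalyst; the class name is mine): a UIScrollView subclass that translates a pan gesture, which Catalyst also fires for mouse click-drags, into horizontal contentOffset changes. It would be wrapped in a UIViewRepresentable for use from SwiftUI.

```swift
import UIKit

// Sketch: turn click-drag pans into horizontal scrolling.
// Caveat: this extra recognizer may conflict with gestures
// owned by the content inside the scroll view.
final class ClickDragScrollView: UIScrollView {
    private var startOffset: CGPoint = .zero

    override init(frame: CGRect) {
        super.init(frame: frame)
        let pan = UIPanGestureRecognizer(target: self,
                                         action: #selector(handlePan(_:)))
        addGestureRecognizer(pan)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        switch gesture.state {
        case .began:
            startOffset = contentOffset
        case .changed:
            // Drag right moves content left, clamped to the scrollable range.
            let translation = gesture.translation(in: self)
            let maxX = max(0, contentSize.width - bounds.width)
            contentOffset.x = min(max(0, startOffset.x - translation.x), maxX)
        default:
            break
        }
    }
}
```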
Xcode 14.0 introduced a workaround for a problem introduced in Swift 5.7: setting the OTHER_SWIFT_FLAGS build setting in Xcode to -Xfrontend -warn-redundant-requirements. (92092635)
The compiler crash and related discussions are on the Swift GitHub issues and have been pending resolution since.
This workaround/flag is no longer available after upgrading to Xcode 14.3 (14.2 is fine), and projects using the flag simply fail to build.
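For reference, this is how the flag from the release note is applied in an xcconfig (equivalently, in the target's Build Settings); it is this line that Xcode 14.3 rejects:

```
// Works through Xcode 14.2; the frontend flag is gone in 14.3.
OTHER_SWIFT_FLAGS = $(inherited) -Xfrontend -warn-redundant-requirements
```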
Any hints?
Since updating to Xcode 13.3 (13E113), our WatchKit target fails to build in the ValidateEmbeddedBinary step with the error message: “error: The value of CFBundleShortVersionString in your WatchKit app’s Info.plist (X.Y.Z) does not match the value in your companion app’s Info.plist ((null)). These values are required to match.”
All Info.plists look perfectly fine! This only happens since 13.3.
The 13.3 release notes mention this bug as fixed: https://developer.apple.com/documentation/xcode-release-notes/xcode-13_3-release-notes 🤷🏻‍♂️
Has anyone come across this?
For an app using the playAndRecord category, would entering a Group Activity session change the category and mode, given that FaceTime is ongoing?
Will we still have local access to the microphone, e.g. via AVAudioEngine?
If yes, will the presence of FaceTime force the voiceChat mode and Voice Processing I/O (VPIO) in our session?
Thanks in advance.
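For context, this is roughly how we configure the session today, plus a diagnostic observer (the function name is ours) to see whether joining a FaceTime/Group Activity session changes the category or mode out from under us:

```swift
import AVFoundation

// Sketch: configure playAndRecord, then log the session state on
// route changes to detect whether FaceTime forced .voiceChat on us.
func configureAndMonitorSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, options: [.allowBluetooth])
    try session.setActive(true)

    NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: session,
        queue: .main
    ) { _ in
        // Diagnostic only: has the category/mode changed mid-session?
        print("category: \(session.category), mode: \(session.mode)")
    }
}
```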
When trying to access kAudioOutputUnitProperty_OSWorkgroup, the compiler errors with:
'kAudioOutputUnitProperty_OSWorkgroup' is unavailable in Swift: Swift is not supported for use with audio realtime threads
This makes total sense in general! But I'm just fetching a property here; I am not in a real-time thread. In a mix-and-match world (C, ObjC++, and Swift) we should be able to fetch a property like this when we're not in a time-critical block.
Or am I missing something here?
I am trying to use PKToolPicker (from PencilKit) without a PKCanvasView, by adopting PKToolPickerObserver in my own class.
I am wondering how to deal with the first responder required for PKToolPicker visibility when driving it from a SwiftUI app, where the view hosting the PKToolPicker is a UIViewRepresentable embedded in a SwiftUI view that controls gestures.
Thanks in advance,
Arshia
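To illustrate the setup I have in mind (a sketch under my own naming; ToolPickerHostView and ToolPickerContainer are hypothetical): a plain UIView that can become first responder, so the tool picker has something to attach its visibility to, wrapped in a UIViewRepresentable. My question is how this responder interacts with the gesture-owning SwiftUI view around it.

```swift
import SwiftUI
import PencilKit

// A minimal view whose only job is to be a first responder
// for the tool picker (no PKCanvasView involved).
final class ToolPickerHostView: UIView {
    override var canBecomeFirstResponder: Bool { true }
}

struct ToolPickerContainer: UIViewRepresentable {
    let toolPicker: PKToolPicker

    func makeUIView(context: Context) -> ToolPickerHostView {
        let view = ToolPickerHostView()
        toolPicker.setVisible(true, forFirstResponder: view)
        toolPicker.addObserver(context.coordinator)
        // Becoming first responder here is what makes the picker appear,
        // and what may fight with SwiftUI's gesture handling.
        DispatchQueue.main.async { view.becomeFirstResponder() }
        return view
    }

    func updateUIView(_ uiView: ToolPickerHostView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator() }

    final class Coordinator: NSObject, PKToolPickerObserver {
        func toolPickerSelectedToolDidChange(_ toolPicker: PKToolPicker) {
            // React to tool changes without a canvas.
        }
    }
}
```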
Just saw the Audio Workgroups WWDC20 video. Thanks for bringing this up; it reduces uncertainty for most real-time audio apps.
I am wondering about the availability of workgroups. The speaker mentions fall 2020. Is this already part of iOS 14? Is it an iOS 14+ only feature, or is there backward compatibility?
thanks in advance,
Arshia Cont