Siri and Voice


Help users quickly accomplish tasks related to your app using just their voice.

Posts under Siri and Voice tag

61 Posts

On iOS 18, Mandarin is read aloud as Cantonese
Please include the line below in follow-up emails for this request. Case-ID: 11089799

When using AVSpeechUtterance with a Mandarin voice, if Siri is set to Cantonese on iOS 18, the text is spoken in Cantonese instead. There is no such issue on iOS 17 or 16.

Steps to reproduce:

1. Create an utterance with a Mandarin voice:

```swift
let utterance = AVSpeechUtterance(string: textView.text)
let voice = AVSpeechSynthesisVoice(language: "zh-CN")
utterance.voice = voice
```

2. In the phone's settings, set Siri's language to Cantonese.
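A minimal workaround sketch, assuming the regression is in how iOS 18 resolves a bare language code: enumerate the installed voices and assign a concrete zh-CN voice explicitly (the helper name is hypothetical, and whether this sidesteps the bug is untested):

```swift
import AVFoundation

// Hypothetical workaround sketch: rather than requesting a voice by language
// code alone, pick a concrete installed zh-CN voice and assign it explicitly.
func mandarinUtterance(for text: String) -> AVSpeechUtterance {
    let utterance = AVSpeechUtterance(string: text)
    // speechVoices() lists every installed voice; keep only Mandarin (zh-CN).
    let mandarinVoice = AVSpeechSynthesisVoice.speechVoices()
        .first { $0.language == "zh-CN" }
    // Fall back to the language-code initializer if no zh-CN voice is installed.
    utterance.voice = mandarinVoice ?? AVSpeechSynthesisVoice(language: "zh-CN")
    return utterance
}
```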
Replies: 1 · Boosts: 0 · Views: 113 · Activity: 1w
The "right" way to add parameters to Siri voice operations
In this thread, I asked about adding parameters to App Shortcuts. The conclusion I've drawn so far is that an App Shortcut prompt cannot contain any parameters, otherwise the system cannot find the AppShortcutsProvider. While this is fine for Shortcuts and non-voice interaction, I'd like to find a way to add parameters to the prompt.

Here is the scenario: my app controls a device that displays content on "pages." The pages are defined in an AppEnum, which I use for Shortcuts integration via App Intents. The App Intent functions as expected and can change the page based on the user's selection within Shortcuts (or a prompt, if using the App Shortcut). What I'd like is for the user to be able to say "Siri, open with ."

The closest I've come to understanding how this works is the .intentsdefinition file you can create (and SiriKit in general); however, the part that really confused me there is a button in the File Editor that says "Convert to App Intent." To me, this means I should be able to hook the App Intent I've already authored into Siri, rather than writing an entirely new function/code block that does exactly the same thing. Ideally, that's what I want to do. What's the right way to define this behavior?

P.S. If I had to pick an intent schema in the context of AssistantSchemas, I'd say it's closest to "Open File," if that helps. I'd ultimately like to make the "pages" user-customizable, so in the long run that's what I'd do.
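For reference, the AppIntents framework does allow an AppEnum or AppEntity parameter to appear inside an App Shortcut phrase via key-path interpolation. A minimal sketch under that assumption, with all type names hypothetical:

```swift
import AppIntents

// Hypothetical pages enum; the display representations become the spoken values.
enum Page: String, AppEnum {
    case home, settings

    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Page")
    static let caseDisplayRepresentations: [Page: DisplayRepresentation] = [
        .home: DisplayRepresentation(title: "Home"),
        .settings: DisplayRepresentation(title: "Settings")
    ]
}

struct OpenPageIntent: AppIntent {
    static let title: LocalizedStringResource = "Open Page"

    @Parameter(title: "Page")
    var page: Page

    func perform() async throws -> some IntentResult {
        // Drive the connected device to the requested page here.
        return .result()
    }
}

struct PageShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenPageIntent(),
            // An AppEnum parameter may be embedded directly in the phrase,
            // so "Siri, open Home in <app>" can resolve in one utterance.
            phrases: ["Open \(\.$page) in \(.applicationName)"],
            shortTitle: "Open Page",
            systemImageName: "doc.text"
        )
    }
}
```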
Replies: 2 · Boosts: 0 · Views: 311 · Activity: 6d
Conforming an existing AppIntent to the photos domain schema
I have an image-based app with albums, except in my app albums are known as galleries. When I conformed my existing OpenGalleryIntent with @AssistantIntent(schema: .photos.openAlbum), I had to rename my existing gallery parameter to target in order to fit the predefined shape of this domain.

Previously, my intent was configured to display as "Open Gallery" with the description "Opens the selected Gallery" in the Shortcuts app. After conforming to the photos domain, it displays as "Open Album" with the description "Opens the Provided Album". Shortcuts now ignores my configured title and description.

My code builds, but with the following warnings:

- Parameter argument title of a required Assistant schema intent parameter target should not be overridden
- Implementation of the property title of an AppIntent conforming to AssistantSchemaIntent should not be overridden
- Implementation of the property description of an AppIntent conforming to AssistantSchemaIntent should not be overridden

Is my only option to change the concept of a Gallery inside my app into an Album? I don't want to do this... Conceptually, my app aligns well with what this domain does, but I didn't consider that conforming to the shape of an AI schema intent would also dictate exactly how it's presented to the user. FB16283840
Replies: 0 · Boosts: 0 · Views: 221 · Activity: 1w
AssistantIntent system.search behaviour
Given that iOS 18.2 is out, and following the documentation and the WWDC example (limited to iOS 18.2+), I am attempting to use @AssistantIntent(schema: .system.search) along with an AppIntent.

Questions:

- Has anyone made this work on a real device? In my case (code below), when I run the intent from Shortcuts or Siri, it does NOT open the app but only calls the perform method (the app is not foregrounded) -- changing openAppWhenRun has no effect! Strangely, if my app was backgrounded before invocation and I foreground it afterwards, it has navigated to Search but just not foregrounded the app! Am I doing anything wrong? (Adding @Parameter etc. doesn't change anything.)
- Where is the intelligence here? The criteria parameter can NOT be used in the Siri phrase -- it's a build error if you try, since only an AppEntity/AppEnum is permitted as a variable in a Siri phrase, not a StringSearchCriteria. Put differently: what's the gain of @AssistantIntent(schema: .system.search) over a regular AppIntent in this case?

Some code:

```swift
@available(iOS 18.2, *)
@AssistantIntent(schema: .system.search)
struct MySearchIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    static let openAppWhenRun = true

    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        NavigationHandler().to(.search(.init(query: criteria.term)), from: .siri)
        return .result()
    }
}
```

Along with this shortcut in AppShortcutsProvider:

```swift
AppShortcut(
    intent: MySearchIntent(),
    phrases: [
        "Search \(.applicationName)"
    ],
    shortTitle: "Search",
    systemImageName: "magnifyingglass"
)
```
Replies: 0 · Boosts: 0 · Views: 119 · Activity: 1w
Is it possible to reply to a push notification from AirPods using voice?
We want to add the following to our iOS app. AirPods announce push notifications, which is working now. We want the user to be able to say "Reply to this" by voice and have a reply sent to that notification, but Siri says this is not supported in our app. So basically we need the "Listen and respond to messages with AirPods" feature.

- Do we need to add any integration inside the app for this, or does it work directly with the Siri settings?
- Is it possible in a non-messaging app?
- Is it possible without syncing contacts?
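For context, replying to announced notifications is generally tied to communication notifications (an INSendMessageIntent donation plus the communication-notifications entitlement, assumed to be in place elsewhere) together with a text-input action Siri can dictate into. A minimal sketch under those assumptions, with hypothetical identifiers:

```swift
import UserNotifications
import Intents

// Sketch: register a notification category whose "Reply" text-input action
// Siri can dictate into when announcing notifications on AirPods.
// The category and action identifiers here are hypothetical.
func registerReplyCategory() {
    let replyAction = UNTextInputNotificationAction(
        identifier: "REPLY_ACTION",
        title: "Reply",
        options: [],
        textInputButtonTitle: "Send",
        textInputPlaceholder: "Message"
    )
    let category = UNNotificationCategory(
        identifier: "INCOMING_MESSAGE",
        actions: [replyAction],
        // Associates the category with message intents for announcement.
        intentIdentifiers: [INSendMessageIntentIdentifier],
        options: []
    )
    UNUserNotificationCenter.current().setNotificationCategories([category])
}
```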
Replies: 0 · Boosts: 0 · Views: 147 · Activity: 3w
AppIntents: Shortcuts phrases with parameters are not resolving correctly
Exploring App Intents and Shortcuts. No matter what I try, Siri won't understand parameters in the initial spoken phrase. For example, if I ask "Start my planning for School in TestApp", Siri responds "What's plan?"; I say "School" and Siri responds "Ok, starting School plan". What am I missing so that it picks up parameters right away? The logs inside func entities(matching string: String) only appear after the "What's plan?" question and my answering "School"; there are no logs after the initial phrase. I tried using Apple's Trails example as a reference, but with no luck.
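A sketch of the pattern that is supposed to enable first-utterance resolution, assuming the parameter is backed by an AppEntity (all names here are hypothetical): the App Shortcut phrase must embed the parameter, and Siri's vocabulary must be refreshed whenever the entity values change.

```swift
import AppIntents

// Hypothetical plan entity; its display titles become the spoken values.
struct PlanEntity: AppEntity {
    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Plan")
    static let defaultQuery = PlanQuery()

    var id: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(id)")
    }
}

struct PlanQuery: EntityQuery {
    func entities(for identifiers: [PlanEntity.ID]) async throws -> [PlanEntity] {
        identifiers.map(PlanEntity.init(id:))
    }

    // Suggested entities seed Siri's vocabulary for phrase matching.
    func suggestedEntities() async throws -> [PlanEntity] {
        [PlanEntity(id: "School"), PlanEntity(id: "Work")]
    }
}

struct StartPlanIntent: AppIntent {
    static let title: LocalizedStringResource = "Start Planning"

    @Parameter(title: "Plan")
    var plan: PlanEntity

    func perform() async throws -> some IntentResult {
        return .result()
    }
}

struct PlanShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartPlanIntent(),
            // The parameter rides in the phrase itself.
            phrases: ["Start my planning for \(\.$plan) in \(.applicationName)"],
            shortTitle: "Start Planning",
            systemImageName: "calendar"
        )
    }
}

// After the set of plans changes at runtime, refresh Siri's vocabulary with:
// PlanShortcuts.updateAppShortcutParameters()
```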
Replies: 1 · Boosts: 0 · Views: 186 · Activity: Dec ’24
Making onscreen content available to Siri not requesting my Transferable
Howdy, I'm following along with this sample: https://developer.apple.com/documentation/appintents/making-onscreen-content-available-to-siri-and-apple-intelligence

I've got everything up and building. I can confirm that the userActivity modifier is associating my App Intent via EntityIdentifier, but my custom Transferable representation (text) is never called, and when Siri does the ChatGPT handoff, it just offers to send a screenshot, which is what it does when it has no custom representation. What could I be doing wrong? Where should I be looking?
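For comparison, a minimal sketch of the entity shape that sample describes, with hypothetical names: the entity advertises a plain-text representation that Siri can request instead of falling back to a screenshot.

```swift
import AppIntents
import CoreTransferable
import UniformTypeIdentifiers

// Hypothetical onscreen entity with a custom text transfer representation.
struct DocumentEntity: AppEntity, Transferable {
    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Document")
    static let defaultQuery = DocumentQuery()

    var id: String
    var text: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(id)")
    }

    static var transferRepresentation: some TransferRepresentation {
        // Export the entity's text as plain-text data for Siri to consume.
        DataRepresentation(exportedContentType: .plainText) { entity in
            Data(entity.text.utf8)
        }
    }
}

struct DocumentQuery: EntityQuery {
    func entities(for identifiers: [DocumentEntity.ID]) async throws -> [DocumentEntity] {
        identifiers.map { DocumentEntity(id: $0, text: "") }
    }
}
```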
Replies: 3 · Boosts: 0 · Views: 389 · Activity: Dec ’24
Issue with willResignActiveNotification and didBecomeActiveNotification Not Triggering on Siri Setup Screen in iOS 18
Hello, I have noticed an issue in iOS 18 where UIApplication.willResignActiveNotification and UIApplication.didBecomeActiveNotification are no longer triggered when the Siri setup screen is presented or dismissed. This behavior was working as expected in iOS 17 and earlier. This change impacts the logic in our app that relies on detecting app activation and deactivation to perform critical tasks. I would like to confirm whether this change in behavior is an intentional modification in iOS 18 or an unintended bug. Thank you for your assistance. Any additional information or guidance would be greatly appreciated.

Test environment:

- iOS 17: notifications triggered as expected
- iOS 18: willResignActiveNotification and didBecomeActiveNotification not triggered

Conditions: occurs when presenting or dismissing the Siri setup screen
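A minimal observation sketch for reproducing the report (where it is installed in the app is assumed): log both notifications, present the Siri setup screen, and watch whether either fires on iOS 18.

```swift
import UIKit

// Repro sketch: log both lifecycle notifications around the Siri setup screen.
final class LifecycleLogger {
    private var tokens: [NSObjectProtocol] = []

    func start() {
        let center = NotificationCenter.default
        tokens.append(center.addObserver(
            forName: UIApplication.willResignActiveNotification,
            object: nil, queue: .main) { _ in
                print("willResignActive")
            })
        tokens.append(center.addObserver(
            forName: UIApplication.didBecomeActiveNotification,
            object: nil, queue: .main) { _ in
                print("didBecomeActive")
            })
    }

    deinit {
        // Block-based observers must be removed explicitly.
        tokens.forEach(NotificationCenter.default.removeObserver(_:))
    }
}
```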
Replies: 1 · Boosts: 0 · Views: 236 · Activity: Dec ’24
Issue with handoff between Siri intent, iOS app & CarPlay extension
We are experiencing an infrequent issue with the handoff between our Siri intent, our iOS app, and our CarPlay extension. Siri correctly understands the request, and the handler(for intent: INIntent) method is called. In the final step, we respond using:

```swift
INStartCallIntentResponse(code: .continueInApp, userActivity: userActivity)
```

with an instance of NSUserActivity initialized as:

```swift
NSUserActivity(activityType: "our.unique.StartCallIntent")
```

This "our.unique.StartCallIntent" type is included in the app's NSUserActivityTypes attribute within the Info.plist. The callback is handled in the main view of the app through:

```swift
view.onContinueUserActivity("our.unique.StartCallIntent", perform: handleSiriIntent)
```

Additionally, we handle the callback in the CarPlay extension using:

```swift
func scene(_: UIScene, continue userActivity: NSUserActivity)
```

This is necessary because when Siri is invoked while CarPlay is active, the CarPlay extension should receive the callback. Most of the time, both callbacks are triggered as expected. However, on rare occasions the handoff fails, and neither onContinueUserActivity nor scene(_: UIScene, continue userActivity:) receives a callback from the Siri intent.

Is this a known issue? If so, are there any guidelines or best practices for ensuring that our Siri intent handoff consistently triggers the callbacks?
Replies: 1 · Boosts: 0 · Views: 178 · Activity: Nov ’24
Localized App Shortcuts phrases don't work with AppIntents consumed from a Framework
Imagine we have an Xcode workspace containing two projects:

- MyLibrary.xcodeproj, holding a framework target
- MyShortcutsApp.xcodeproj, holding an app target that consumes the MyLibrary framework

Both targets define App Intents, and the ones from MyLibrary are exposed via AppIntentsPackage accordingly. When I wrap the framework's App Intent as an App Shortcut and pass localized AppShortcutPhrases, I see the following compile error:

".../Resources/de.lproj/AppShortcuts.strings:11:1: error: This AppShortcut does not map to a known action (MyLibraryIntent specified). (in target 'MyShortcutsApp' from project 'MyShortcutsApp')"

If I use the same localized App Shortcut phrases for an App Intent defined locally in the app target, everything works fine; the framework-provided App Intent also works in an App Shortcut as long as I don't pass any localized phrases. This happens with Xcode 16.0 (16A242d), 16.1 (16B40), and 16.2 beta 2 (16C5013f). I already raised this issue as FB15701779, which contains a sample project to reproduce and further analyze the problem. Thanks for any hint on how to solve it. Frank
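For reference, a minimal sketch of the cross-target wiring the post describes, assuming types roughly like the ones it names:

```swift
import AppIntents

// In the MyLibrary framework target: a public intent plus an
// AppIntentsPackage that exposes the framework's intents.
public struct MyLibraryIntent: AppIntent {
    public static let title: LocalizedStringResource = "My Library Intent"

    public init() {}

    public func perform() async throws -> some IntentResult {
        .result()
    }
}

public struct MyLibraryPackage: AppIntentsPackage {}

// In the MyShortcutsApp app target: re-export the framework package so its
// intents are visible to the app's App Intents extraction step.
struct MyShortcutsAppPackage: AppIntentsPackage {
    static var includedPackages: [any AppIntentsPackage.Type] {
        [MyLibraryPackage.self]
    }
}
```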
Replies: 1 · Boosts: 4 · Views: 387 · Activity: Dec ’24
Siri phrase with multiple dynamic values
My requirement is to open a specific screen of my app when the user says "Start Sleep meditation for 10 minutes", where "Sleep" and "10 minutes" are dynamic values in the phrase. Is it possible to get both values from the phrase alone, or do I need Siri to ask "which meditation?" and then "how much time?"? I am planning to use AppIntent and AppShortcut, along with entities, but I'm unable to open the shortcut when Siri is invoked with the phrase above.
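One pattern worth sketching, under the assumption that one value (an AppEnum) rides in the App Shortcut phrase while the remaining value is collected through a follow-up dialog (all names here are hypothetical):

```swift
import AppIntents

// Hypothetical meditation enum; the phrase carries this value.
enum MeditationType: String, AppEnum {
    case sleep, focus

    static let typeDisplayRepresentation = TypeDisplayRepresentation(name: "Meditation")
    static let caseDisplayRepresentations: [MeditationType: DisplayRepresentation] = [
        .sleep: DisplayRepresentation(title: "Sleep"),
        .focus: DisplayRepresentation(title: "Focus")
    ]
}

struct StartMeditationIntent: AppIntent {
    static let title: LocalizedStringResource = "Start Meditation"

    @Parameter(title: "Meditation")
    var type: MeditationType

    // Siri prompts for the duration if it wasn't resolved from the utterance.
    @Parameter(title: "Minutes", requestValueDialog: IntentDialog("For how many minutes?"))
    var minutes: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Navigate to the meditation screen here.
        return .result(dialog: "Starting \(type.rawValue) meditation for \(minutes) minutes.")
    }
}

struct MeditationShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartMeditationIntent(),
            phrases: ["Start \(\.$type) meditation in \(.applicationName)"],
            shortTitle: "Start Meditation",
            systemImageName: "moon.zzz"
        )
    }
}
```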
Replies: 0 · Boosts: 0 · Views: 217 · Activity: Nov ’24
Failed to generate TargetContentIdentifier for criteria
I have implemented ShowInAppSearchResultsIntent and AppShortcutsProvider, but on iOS 18.1+ I am getting an error in the console: "Failed to generate TargetContentIdentifier for criteria." On iOS 18.0 it works fine. The code I have implemented:

```swift
@AssistantIntent(schema: .system.search)
struct SearchIntent: ShowInAppSearchResultsIntent {
    // static let title: LocalizedStringResource = "Search in Cineverse for"
    static let searchScopes: [StringSearchScope] = [.general]

    @Parameter(requestValueDialog: IntentDialog("What would you like to search for?"))
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        let searchString = criteria.term
        print("Searching for \(searchString)")
        return .result()
    }
}

class AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SearchIntent(),
            phrases: [
                "using \(.applicationName) search for",
                "search on \(.applicationName) app"
            ],
            shortTitle: "Search Movie",
            systemImageName: "magnifyingglass"
        )
    }
}
```
Replies: 1 · Boosts: 0 · Views: 343 · Activity: Dec ’24
AVSpeechUtterance Mandarin voice output replaced by Siri language setting after upgrading to iOS 18
Hi, Apple engineers. Hoping you can reply to this one. We're developing a text-to-speech app. Everything went well until iOS was upgraded to 18. AVSpeechSynthesisVoice(language: "zh-CN") runs well under iOS 16 and iOS 17; it speaks Mandarin correctly. On iOS 18, we noticed that Siri's language setting interferes with AVSpeechSynthesisVoice: it plays Cantonese instead of Mandarin. The buggy Siri language settings that affect AVSpeechSynthesisVoice are:

- Chinese (Cantonese - China mainland)
- Chinese (Cantonese - Hong Kong)
Replies: 3 · Boosts: 3 · Views: 412 · Activity: 1w
INPlayMediaIntentHandling, Unable to Trigger App to Play Specific Podcast
I have an app that's capable of playing podcasts via Siri requests, e.g. "Hey Siri, play [Podcast Name]". I'm using INPlayMediaIntentHandling, that is, the SiriKit domain intents, as opposed to the newer App Intents framework, for its ability to select my app for audio playback without requiring the app's name in the user's request to Siri.

This works great overall for the many podcasts I've tested the app with, with one exception. There's a podcast called "The Headlines", and when I test the app with the request "Hey Siri, play The Headlines", my app is never selected. Instead, Apple Podcasts begins playback of a show called "NPR News Now". Oddly, if the Apple Podcasts app is deleted, my app will still not be selected by the system; instead, Siri responds with "I don't see an app for that. You'll need to download one" and a button to open the App Store. Additionally, if I do add the app name to the request using this style of intent, Siri responds with "[App Name] hasn't added support for that with Siri." However, I'd still like to accomplish this without requiring the app name in the Siri request.

There's nothing complex in my setup:

- The target declares one supported intent, INPlayMediaIntent, with "Podcasts" selected as a supported media category.
- The Siri entitlement is enabled.
- My INSiriAuthorizationStatus is .authorized.

My intent handler is specified in my AppDelegate as follows:

```swift
func application(_ application: UIApplication, handlerFor intent: INIntent) -> Any? {
    return IntentHandler.shared
}
```

My intent handler is simple:

```swift
final class IntentHandler: NSObject, INPlayMediaIntentHandling {
    static let shared = IntentHandler()

    func handle(intent: INPlayMediaIntent) async -> INPlayMediaIntentResponse {
        print("IntentHandler: processing intent: \(intent)")
        /** code to start playback based on information found in `intent` **/
    }
}
```

When I ask Siri to "Play The Headlines", my handler code is not called at all. For all other supported shows, the print statement executes and playback begins as expected. Is there any way to get my app selected instead of Apple Podcasts for this request?
Replies: 1 · Boosts: 0 · Views: 278 · Activity: Nov ’24
Siri in iOS 18.1 only says "Done", "That's done" for the result with IntentDialog.
I'm returning the following result in one of my App Intents:

```swift
return .result(value: "Done!", dialog: IntentDialog("Speed limit \(speedLimit)"))
```

With iOS 18.0.1, it nicely confirmed the result of the user's command by saying e.g. "Speed limit 60" and showing it at the top of the screen. With iOS 18.1, it only shows/says "That's done" or "Done" at the bottom of the screen. Am I missing something that changed in the AppIntents API in iOS 18.1?
Replies: 2 · Boosts: 0 · Views: 314 · Activity: Nov ’24
VoiceOver needs to support CFBundleSpokenName
VoiceOver does not support the plist property CFBundleSpokenName. This is wrong and should be fixed.

Ultimately, the issue I am dealing with is that our app name is UWCU, and instead of VoiceOver pronouncing each letter, it tries to read it as a word and horribly butchers our organization's/app's name. Alternatives such as using U.W.C.U. and U W C U are not acceptable.

@Apple, I know your first response is going to be "no, it is working perfectly," but quite frankly you are wrong. I know you feel strongly about this, given your response in posts like this: https://forums.developer.apple.com/forums/thread/734545?answerId=760084022

HOWEVER, with iOS 18, the argument that "VoiceOver should read what's on the screen" doesn't hold water anymore. With iOS 18, Apple has added a new feature that lets users customize their home screens and completely remove the names of apps. Here's your own guide: https://support.apple.com/guide/iphone/customize-apps-and-widgets-on-the-home-screen-iph385473442/ios

Quoted from that guide: "Make the icons bigger: Tap Large. (In large size, the names of the apps disappear.)"

With large icons and VoiceOver turned on, VoiceOver still reads the app name even though it has disappeared from the screen. So the argument "VoiceOver should read the text as it appears on the screen" is invalid, because there is NO text on the screen.

If you can't tell, I'm pretty peeved about all this. There's a reason screen readers support ARIA attributes to help deliver the right accessible experience. It's a simple ask for VoiceOver to do the same thing.
Replies: 3 · Boosts: 0 · Views: 622 · Activity: Oct ’24