Siri and Voice

Help users quickly accomplish tasks related to your app using just their voice.

Posts under the Siri and Voice tag

52 Posts

Issue with handoff between Siri intent, iOS app & CarPlay extension
We are experiencing an infrequent issue with the handoff between our Siri intent, our iOS app, and our CarPlay extension. Siri correctly understands the request, and the handler(for intent: INIntent) method is called. In the final step, we respond using:

    INStartCallIntentResponse(code: .continueInApp, userActivity: userActivity)

with an instance of NSUserActivity initialized as:

    NSUserActivity(activityType: "our.unique.StartCallIntent")

This "our.unique.StartCallIntent" type is included in the app’s NSUserActivityTypes attribute within the Info.plist. The callback is handled in the main view of the app through:

    view.onContinueUserActivity("our.unique.StartCallIntent", perform: handleSiriIntent)

Additionally, we handle the callback in the CarPlay extension using:

    func scene(_: UIScene, continue userActivity: NSUserActivity)

This is necessary because when Siri is invoked while CarPlay is active, the CarPlay extension should receive the callback.

Most of the time, both callbacks are triggered as expected. However, on rare occasions the handoff fails, and neither onContinueUserActivity nor scene(_: UIScene, continue userActivity:) receives a callback from the Siri intent.

Is this a known issue? If so, are there any guidelines or best practices for ensuring that our Siri intent handoff consistently triggers the callbacks?
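
For reference, a minimal sketch of how this handoff is typically wired, assuming an Intents extension that returns .continueInApp and a SwiftUI view in the host app. The activity type string is the one from the post; the handler class, view, and userInfo keys are hypothetical.

```swift
import Intents
import SwiftUI

// Activity type from the post; everything else is an illustrative sketch.
let startCallActivityType = "our.unique.StartCallIntent"

// Intents extension: finish the Siri interaction by handing off to the app.
final class StartCallIntentHandler: NSObject, INStartCallIntentHandling {
    func handle(intent: INStartCallIntent,
                completion: @escaping (INStartCallIntentResponse) -> Void) {
        let activity = NSUserActivity(activityType: startCallActivityType)
        activity.userInfo = ["handle": intent.contacts?.first?.personHandle?.value ?? ""]
        // .continueInApp asks the system to deliver this activity to the app
        // (or, when CarPlay is active, to the CarPlay scene via scene(_:continue:)).
        completion(INStartCallIntentResponse(code: .continueInApp, userActivity: activity))
    }
}

// App side: the same type must be listed under NSUserActivityTypes in Info.plist.
struct ContentView: View {
    var body: some View {
        Text("Ready")
            .onContinueUserActivity(startCallActivityType) { activity in
                // Hypothetical handler; start the call flow here.
                print("Continuing activity:", activity.userInfo ?? [:])
            }
    }
}
```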
Replies: 1 · Boosts: 0 · Views: 51 · Activity: 2d
Localized App Shortcuts phrases don't work with AppIntents consumed from a Framework
Imagine we have an Xcode workspace containing two projects:

- MyLibrary.xcodeproj, holding a framework target
- MyShortcutsApp.xcodeproj, holding an app target which consumes the MyLibrary framework

Both targets define App Intents, and the ones from MyLibrary are exposed via AppIntentsPackage accordingly.

When trying to wrap the App Intent from the framework as an App Shortcut and passing localized AppShortcutPhrases, I see the following compile error:

    .../Resources/de.lproj/AppShortcuts.strings:11:1: error: This AppShortcut does not map to a known action (MyLibraryIntent specified). (in target 'MyShortcutsApp' from project 'MyShortcutsApp')

If I use the same localized App Shortcut phrases for an App Intent which is defined locally in the app target, everything works fine, and the framework-provided App Intent also works in an App Shortcut if I don't pass any localized phrases.

This is happening with Xcode 16.0 (16A242d), 16.1 (16B40), and 16.2 beta 2 (16C5013f). I already raised this issue via FB15701779, which contains a sample project to reproduce and further analyze the issue.

Thanks for any hint on how to solve that. Frank
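
For context, a minimal sketch of the AppIntentsPackage wiring this setup relies on, as I understand it. MyLibraryIntent comes from the post; the package and provider type names, phrase, and shortcut metadata are illustrative assumptions.

```swift
import AppIntents

// In the MyLibrary framework target: declare a package so the app target can
// re-export the framework's intents.
public struct MyLibraryPackage: AppIntentsPackage {}

public struct MyLibraryIntent: AppIntent {
    public static var title: LocalizedStringResource = "My Library Intent"
    public init() {}
    public func perform() async throws -> some IntentResult { .result() }
}

// In the MyShortcutsApp app target: re-export the framework's package and wrap
// the framework intent as an App Shortcut.
struct MyShortcutsAppPackage: AppIntentsPackage {
    static var includedPackages: [any AppIntentsPackage.Type] {
        [MyLibraryPackage.self]
    }
}

struct MyShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: MyLibraryIntent(),
            phrases: ["Run \(.applicationName) library intent"],
            shortTitle: "Library Intent",
            systemImageName: "bolt"
        )
    }
}
```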
Replies: 0 · Boosts: 0 · Views: 86 · Activity: 4d
Siri phrase with multiple dynamic values
My requirement is to open a specific screen of my app when the user says "Start Sleep meditation for 10 minutes", where "Sleep" and "10 minutes" are dynamic values in the phrase. Is it possible to get the values just from the phrase, or do I need Siri to ask "which meditation?" and then "how much time?"? I am planning to use AppIntent and AppShortcut, along with entities, but I am unable to open the shortcut when Siri is invoked with the phrase I described above. A sketch of one possible shape for this follows below.
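
As far as I know, App Shortcut phrase parameters must come from a fixed, pre-declared set (an AppEnum, or AppEntity values Siri already knows about), so an open-ended value like "10 minutes" generally can't be captured directly from the invocation phrase and has to be asked for as a follow-up parameter. A minimal sketch under that assumption, with all names hypothetical:

```swift
import AppIntents

// The meditation type can appear in the spoken phrase because it comes from a
// fixed set (an AppEnum); the free-form duration is a normal parameter that
// Siri requests separately.
enum MeditationType: String, AppEnum {
    case sleep, focus, calm

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Meditation")
    static var caseDisplayRepresentations: [MeditationType: DisplayRepresentation] = [
        .sleep: DisplayRepresentation(title: "Sleep"),
        .focus: DisplayRepresentation(title: "Focus"),
        .calm: DisplayRepresentation(title: "Calm")
    ]
}

struct StartMeditationIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Meditation"
    static var openAppWhenRun = true

    @Parameter(title: "Meditation")
    var meditation: MeditationType

    @Parameter(title: "Minutes")
    var minutes: Int

    @MainActor
    func perform() async throws -> some IntentResult {
        // Navigate to the meditation screen here (app-specific).
        return .result()
    }
}

struct MeditationShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: StartMeditationIntent(),
            phrases: ["Start \(\.$meditation) meditation in \(.applicationName)"],
            shortTitle: "Start Meditation",
            systemImageName: "moon.zzz"
        )
    }
}
```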
Replies: 0 · Boosts: 0 · Views: 92 · Activity: 6d
Failed to generate TargetContentIdentifier for criteria
I have implemented ShowInAppSearchResultsIntent and AppShortcutsProvider, but on iOS 18.1+ I'm getting an error in the console:

    Failed to generate TargetContentIdentifier for criteria.

On iOS 18.0 it's working fine. The code I have implemented:

    @AssistantIntent(schema: .system.search)
    struct SearchIntent: ShowInAppSearchResultsIntent {
        // static let title: LocalizedStringResource = "Search in Cineverse for"
        static let searchScopes: [StringSearchScope] = [.general]

        @Parameter(requestValueDialog: IntentDialog("What would you like to search for?"))
        var criteria: StringSearchCriteria

        @MainActor
        func perform() async throws -> some IntentResult {
            let searchString = criteria.term
            print("Searching for \(searchString)")
            return .result()
        }
    }

    class AppShortcuts: AppShortcutsProvider {
        static var appShortcuts: [AppShortcut] {
            AppShortcut(
                intent: SearchIntent(),
                phrases: [
                    "using \(.applicationName) search for",
                    "search on \(.applicationName) app"
                ],
                shortTitle: "Search Movie",
                systemImageName: "magnifyingglass"
            )
        }
    }
Replies: 0 · Boosts: 0 · Views: 148 · Activity: 1w
AVSpeechUtterance Mandarin voice output replaced by SIRI language setting after upgraded the IOS to 18
Hi, Apple engineers. Hoping you can reply to this one. We're developing a text-to-speech app. Everything went well until iOS was upgraded to 18. AVSpeechSynthesisVoice(language: "zh-CN") runs well under iOS 16 and iOS 17: it speaks Mandarin correctly. On iOS 18, we noticed that Siri's language setting interferes with AVSpeechSynthesisVoice, and it plays Cantonese instead of Mandarin. The Siri language settings that affect AVSpeechSynthesisVoice this way are: Chinese (Cantonese - China mainland) and Chinese (Cantonese - Hong Kong).
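
A possible workaround (purely an assumption, not a confirmed fix for this bug) is to pick a specific installed Mandarin voice explicitly instead of relying only on the zh-CN language-code lookup:

```swift
import AVFoundation

// Sketch: enumerate installed voices and prefer one whose language is exactly
// zh-CN (Mandarin, mainland China), falling back to the language-code lookup.
func mandarinVoice() -> AVSpeechSynthesisVoice? {
    let voices = AVSpeechSynthesisVoice.speechVoices()
    if let voice = voices.first(where: { $0.language == "zh-CN" }) {
        return voice
    }
    return AVSpeechSynthesisVoice(language: "zh-CN")
}

let synthesizer = AVSpeechSynthesizer()

func speakMandarin(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = mandarinVoice()
    synthesizer.speak(utterance)
}
```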
Replies: 1 · Boosts: 2 · Views: 193 · Activity: 1w
INPlayMediaIntentHandling, Unable to Trigger App to Play Specific Podcast
I have an app that's capable of playing podcasts via Siri requests, e.g. "Hey Siri, play [Podcast Name]". I'm using INPlayMediaIntentHandling, that is, the SiriKit domain intents, as opposed to the newer App Intents framework, for its ability to select my app for audio playback without the need to specify the name of the app in the user's request to Siri.

This works great overall for the many podcasts I've tested the app with, with the exception of one. There's a podcast called "The Headlines", and when I test the app with the request "Hey Siri, play The Headlines", my app is never selected. Instead, Apple Podcasts begins playback of a show called "NPR News Now". Oddly, if the Apple Podcasts app is deleted, my app will still not be selected by the system; instead, Siri responds with "I don’t see an app for that. You’ll need to download one" with a button to open the App Store. Additionally, if I do add the app name to the request using this style of intent, Siri responds with "[App Name] hasn’t added support for that with Siri." However, I’d still like to accomplish this without requiring the app name in the Siri request.

There's nothing complex in my setup:

- The target declares one supported intent, INPlayMediaIntent, with "Podcasts" selected as a supported media category.
- The Siri entitlement is enabled.
- My INSiriAuthorizationStatus is .authorized.

My intent handler is specified in my AppDelegate as follows:

    func application(_ application: UIApplication, handlerFor intent: INIntent) -> Any? {
        return IntentHandler.shared
    }

My intent handler is simple:

    final class IntentHandler: NSObject, INPlayMediaIntentHandling {
        static let shared = IntentHandler()

        func handle(intent: INPlayMediaIntent) async -> INPlayMediaIntentResponse {
            print("IntentHandler: processing intent: \(intent)")
            /** code to start playback based on information found in `intent` **/
        }
    }

When requesting Siri to "Play The Headlines", my handler code is not called at all. For all other supported shows, the print statement executes, and playback begins as expected. Is there any way I can get my app to be selected instead of Apple Podcasts for this request?
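
One thing that sometimes helps Siri map ambiguous media names to a third-party app is registering the show titles as app-specific vocabulary. This is an assumption, not a confirmed fix for this particular collision:

```swift
import Intents

// Sketch: tell Siri about the podcast titles the user follows so it has a
// better chance of associating a phrase like "The Headlines" with this app.
func registerShowVocabulary(_ showTitles: [String]) {
    let titles = NSOrderedSet(array: showTitles)
    INVocabulary.shared().setVocabularyStrings(titles, of: .mediaShowTitle)
}

// Example usage, e.g. whenever the user's subscription list changes:
// registerShowVocabulary(["The Headlines", "Another Show"])
```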
Replies: 1 · Boosts: 0 · Views: 123 · Activity: 2w
Siri in iOS 18.1 only says "Done", "That's done" for the result with IntentDialog.
I'm returning the following result in one of my App Intents:

    return .result(value: "Done!", dialog: IntentDialog("Speed limit \(speedLimit)"))

With iOS 18.0.1, it nicely confirmed the result of the user's command by saying e.g. "Speed limit 60" and showing it at the top of the screen. With iOS 18.1, it only shows/says "That's done" or "Done" at the bottom of the screen. Am I missing something that changed in the App Intents API since iOS 18.1?
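
For reference, a minimal intent of this shape, returning both a value and a dialog. The intent name and the hard-coded speed limit are illustrative; only the result line reflects the post:

```swift
import AppIntents

// Sketch of an intent that returns a value plus a spoken/displayed dialog.
struct SetSpeedLimitIntent: AppIntent {
    static var title: LocalizedStringResource = "Set Speed Limit"

    @Parameter(title: "Speed limit")
    var speedLimit: Int

    func perform() async throws -> some IntentResult & ReturnsValue<String> & ProvidesDialog {
        // The dialog is what Siri is expected to speak and show on success.
        return .result(
            value: "Done!",
            dialog: IntentDialog("Speed limit \(speedLimit)")
        )
    }
}
```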
Replies: 2 · Boosts: 0 · Views: 163 · Activity: 2w
VoiceOver needs to support CFBundleSpokenName
VoiceOver does not support the plist property CFBundleSpokenName. This is wrong and should be fixed.

Ultimately the issue I am dealing with is that our app name is UWCU, and instead of VoiceOver pronouncing each letter, it tries to read this as a word and horribly butchers our organization's/app's name. Alternatives such as using U.W.C.U. and U W C U are not acceptable.

@Apple, I know your first response is going to be "no, it is working perfectly," but quite frankly you are wrong. I know you feel strongly about this, given your response in posts like this: https://forums.developer.apple.com/forums/thread/734545?answerId=760084022

HOWEVER, with iOS 18, your argument that "VoiceOver should read what's on the screen" doesn't hold water anymore. With iOS 18, you, Apple, have added a new feature that lets users customize their home screens and completely remove the names of apps. Here's your own guide: https://support.apple.com/guide/iphone/customize-apps-and-widgets-on-the-home-screen-iph385473442/ios

Quoted from your guide: "Make the icons bigger: Tap Large. (In large size, the names of the apps disappear.)"

With large icons and VoiceOver turned on, VoiceOver still reads the app name even though it has disappeared from the screen. So your own argument, "VoiceOver should read the text as it appears on the screen," is invalid, because there is NO text on the screen.

If you can't tell, I'm pretty peeved about all this. There's a reason why screen readers support ARIA attributes to help deliver the right accessible experience. It's a simple ask for VoiceOver to do the same thing.
Replies: 3 · Boosts: 0 · Views: 474 · Activity: 3w
Siri with apple intelligence is not working IOS 18.2 Beta
After I updated, Siri doesn't work. If I try to use it by typing, I get no feedback or response. If I say "Siri" or hold down the side button, the animation on the screen starts but stops immediately, and again I have no response or feedback. When turning off Apple Intelligence and just using Siri, it works perfectly, but in conjunction with Apple Intelligence it is not possible to use Siri. iOS 18.2 beta, iPhone 15 Pro Max.
Replies: 5 · Boosts: 4 · Views: 694 · Activity: 3w
App Intents: Siri does not recognize currency amounts
Hello, I am implementing an App Intent which asks the user for a currency amount:

    private func loadAmountList(forNumber number: String) async throws -> [NSDecimalNumber] { ... }

    @MainActor
    func perform() async throws -> some IntentResult & ShowsSnippetView {
        let list = try await loadAmountList(forNumber: fixedNumber).compactMap { currencyFormatter.string(from: $0) }
        throw $amount.needsDisambiguationError(among: list, dialog: "app_intent_sim_amount_prompt")
    }

If I start this intent from Siri, the attached screenshot is shown, but no matter what I say ("10 EURO", "ten", "10", "10€"...), Siri never understands anything and keeps reshowing the dialog over and over again. If instead I tap any of the choices, the intent execution proceeds correctly. How can I solve the problem? Thanks
Replies: 1 · Boosts: 0 · Views: 160 · Activity: 3w
App Intents: requestConfirmation method not working with Siri invocation
Hello, I am implementing an App Intent which shows a confirmation dialog before proceeding with the operation execution. It works fine when the intent is started from a shortcut, but it always fails when started from Siri: I obtain the error message depicted in the attached screenshot ("An error occurred, try again"). That message appears as soon as the requestConfirmation method is called in the perform method of my App Intent:

    try await requestConfirmation(actionName: .do, dialog: "app_intent_sim_confirmation_message") {
        SIMRechargeIntentSummaryView(...)
    }
    ...

How can I solve the problem? Thanks
Replies: 0 · Boosts: 0 · Views: 142 · Activity: 3w
is there anyone else having issues with the AirPods Max 2024 refresh using Siri?
I bought the new refreshed AirPods Max 2024 on pre-release day, and ever since I got them, Siri has been messed up when using the AirPods Max. Sometimes it'll work, sometimes it won't. It started while I was on the 18.1 beta, which I figured was probably the issue, but now I'm on the 18.1 RC and it is still doing it. It's not the new Siri, because it's not working with the old Siri either most of the time. Sometimes it won't even let me use my microphone when connected, or read any of my notifications out loud.
Replies: 1 · Boosts: 0 · Views: 164 · Activity: 3w
App Intents is not able to start an outgoing call with CallKit when the app is backgrounded
I am currently working on integrating an app with Siri, adding support for starting VOIP calls and sending messages. Although I understand it is recommended to use SiriKit for calling and messaging, I would like to allow users to select a profile to use for calling. As far as I am aware, the notion of selecting a profile to call from is not something SiriKit supports; therefore, it was decided to go with App Intents to allow for more control over the parameters used to start calls.

After integrating VOIP calling with App Intents, I noticed CallKit is not able to start calls when the App Intent is invoked from the background. I get the following error:

    Error Domain=com.apple.CallKit.error.requesttransaction Code=6 "(null)"

This seems to correspond to the CXErrorCodeRequestTransactionError case invalidAction. This error only happens when the intent is invoked from the background. Changing the App Intent property openAppWhenRun to true solves the issue, as it brings the app to the foreground before running the intent. However, I would like to support starting calls from the background to avoid making users unlock their phones prior to starting a call with Siri, to make it a truly hands-free experience.

I suspect the desired behavior is possible, most likely with SiriKit, as some famous VOIP calling apps (i.e. WhatsApp, Messenger, etc.) exhibit the behavior I described. However, is there any way to start calls from the background with App Intents? Or is the desired behavior something exclusive to SiriKit?

I have pasted three code snippets below that can replicate the issue. At the moment I am on Xcode 15.3, macOS Sonoma 14.6.1, and I am testing on iOS 16.6.1.

To demonstrate the issue I have created the following CXProviderDelegate:

    class CallManager: NSObject, CXProviderDelegate {
        func startCall() {
            let callKitProvider = CXProvider(configuration: CXProviderConfiguration())
            callKitProvider.setDelegate(self, queue: nil)
            let callKitController = CXCallController()

            let recipient = CXHandle(type: .generic, value: "Demo Outgoing Call")
            let uuid = UUID()
            let startCallAction = CXStartCallAction(call: uuid, handle: recipient)
            let transaction = CXTransaction(action: startCallAction)

            callKitController.request(transaction) { error in
                if let error {
                    print(error)
                } else {
                    print("no errors")
                }
            }

            callKitProvider.reportOutgoingCall(with: uuid, connectedAt: nil)
        }

        func providerDidReset(_ provider: CXProvider) {
            // no-op, not required to demonstrate the issue
        }
    }

Then, I have a UIViewController that is the only screen of this example app:

    class ViewController: UIViewController {
        @IBOutlet weak var startCallButton: UIButton!

        override func viewDidLoad() {
            super.viewDidLoad()
            startCallButton.addTarget(self, action: #selector(buttonTapped), for: .touchUpInside)
        }

        @objc func buttonTapped() {
            let manager = CallManager()
            manager.startCall()
        }
    }

As for App Intents, I put together a very simple intent to trigger the start of an outgoing call:

    struct StartCall: AppIntent {
        static var title: LocalizedStringResource = "Start Call"
        static var openAppWhenRun = false

        func perform() async throws -> some IntentResult {
            let manager = CallManager()
            manager.startCall()
            return .result()
        }
    }

When the UIViewController is presented and I tap the button to start a call, I see the green call banner appear and "no errors" is printed to the console as intended. However, when I open the Shortcuts app and run the App Intent, the green banner does not appear and the message Error Domain=com.apple.CallKit.error.requesttransaction Code=6 "(null)" is printed to the console.
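
For comparison, a minimal sketch of the SiriKit route the post alludes to: an Intents-extension handler for INStartCallIntent that returns .continueInApp so the system can background-launch the app for the actual CXStartCallAction. Whether this satisfies the profile-selection requirement is a separate question; the class name and resolution logic are illustrative, not a confirmed solution.

```swift
import Intents

// Sketch of a SiriKit handler for "call <contact> using <AppName>" requests.
final class StartCallHandler: NSObject, INStartCallIntentHandling {

    func resolveContacts(for intent: INStartCallIntent,
                         with completion: @escaping ([INStartCallContactResolutionResult]) -> Void) {
        guard let person = intent.contacts?.first else {
            completion([.needsValue()])
            return
        }
        completion([.success(with: person)])
    }

    func handle(intent: INStartCallIntent,
                completion: @escaping (INStartCallIntentResponse) -> Void) {
        // Hand off to the app, which requests the CXStartCallAction itself.
        let activity = NSUserActivity(activityType: String(describing: INStartCallIntent.self))
        completion(INStartCallIntentResponse(code: .continueInApp, userActivity: activity))
    }
}
```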
Replies: 1 · Boosts: 0 · Views: 189 · Activity: 4w
How can I access audio attachment from INSendMessageIntent
I'm trying to add Siri support to my app for sending voice messages. I've implemented INSendMessageIntentHandling in my main app target. It looks like it's getting as far as recording the voice message and passing my intent handler an INSendMessageIntent with an audio attachment, but I'm not able to read the attachment file.

    func handle(
        intent: INSendMessageIntent,
        completion: @escaping (INSendMessageIntentResponse) -> Void
    ) {
        if let attachment = intent.attachments?.first,
           let audioFile = attachment.audioMessageFile,
           let fileURL = audioFile.fileURL {
            // This branch runs
            // fileURL is "file:///var/mobile/tmp/SiriMessages/89F738F7-6092-439A-B4FA-2DD9A99F0EED.caf"
            let result = processMessageAudio(url: fileURL)
            completion(result)
            return
        }

        // This line isn't reached
        completion(.init(code: .failure, userActivity: nil))
    }

    private func processMessageAudio(url: URL) -> INSendMessageIntentResponse {
        var fileRef: ExtAudioFileRef?

        if url.startAccessingSecurityScopedResource() {
            logDebug("File access allowed")
        } else {
            // This branch runs
            logDebug("File access not allowed")
        }
        defer {
            url.stopAccessingSecurityScopedResource()
        }

        let openStatus = ExtAudioFileOpenURL(url as CFURL, &fileRef)
        // openStatus is -54 (kAudio_FilePermissionError)

        return INSendMessageIntentResponse(code: .failure, userActivity: nil)
    }

I'm not sure what I'm missing. It looks like there should be an audio file, and Siri shows a preview of the audio for confirmation.
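
One workaround worth trying (purely an assumption, not a confirmed fix) is to copy the Siri-provided attachment into the app's own temporary directory before handing it to the audio APIs:

```swift
import Foundation

// Sketch: copy the attachment into a location the app owns, then open the
// local copy with ExtAudioFile/AVAudioFile instead of the original URL.
func copyAttachmentToLocalTemp(_ remoteURL: URL) throws -> URL {
    let localURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension(remoteURL.pathExtension)

    // startAccessingSecurityScopedResource may return false here, as in the
    // post; attempt the copy regardless and let it throw if access is denied.
    let accessed = remoteURL.startAccessingSecurityScopedResource()
    defer {
        if accessed { remoteURL.stopAccessingSecurityScopedResource() }
    }

    try FileManager.default.copyItem(at: remoteURL, to: localURL)
    return localURL
}
```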
Replies: 2 · Boosts: 0 · Views: 222 · Activity: Oct ’24
CarPlay picking up from car microphone and phone microphone simultaneously
Hello! When trying to use talk-to-text on CarPlay, Siri would trip out, as if it were receiving a bunch of concurrent audio inputs. When I tried to video the issue with my phone, it worked perfectly. My guess is that while I was videoing, the video app was using my phone's mic, so it wasn't available for CarPlay, which then only received audio input from the car microphone. It's weird because other voice commands and phone calls work just fine. This also happens on every iPhone we connect. My guess is there's some bug where the phone's microphone isn't being shut off properly when plugged in for CarPlay. I reformatted the Alpine iLX-W670 radio and updated the firmware. Got a new authentic Apple cable. Not sure what else to try.
Replies: 1 · Boosts: 0 · Views: 238 · Activity: Sep ’24
Siri's voice invocation to open App and pass the intent
Hi all. The requirement is "Search (placeholder) in (myApp)": when the user speaks this string, Siri should open the app and pass the placeholder. This worked for me only when I used an AppEnum (with a specific defined set of values) together with an AppEntity. I want the placeholder to be dynamic and not defined via the AppEnum. I have observed this feature working fine with the YouTube, Spotify, and WhatsApp apps. Is there anything else that these apps add specifically to make this work? Also, in these apps' Siri settings there is a toggle named "Use with Ask Siri". Could someone please help me understand how this option is enabled?
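
As far as I know, values spoken inside an App Shortcut phrase must be known to the system in advance, either AppEnum cases or AppEntity instances surfaced through a query, so truly free-form text in the invocation phrase is not supported, which is likely why only the enum-based version worked. A sketch of the entity-based variant, with all names hypothetical:

```swift
import AppIntents

// An AppEntity whose instances Siri learns ahead of time via the query's
// suggestedEntities(); only these known values can fill the phrase parameter.
struct SearchTermEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Search Term")
    static var defaultQuery = SearchTermQuery()

    var id: String
    var displayRepresentation: DisplayRepresentation { DisplayRepresentation(title: "\(id)") }
}

struct SearchTermQuery: EntityQuery {
    func entities(for identifiers: [SearchTermEntity.ID]) async throws -> [SearchTermEntity] {
        identifiers.map { SearchTermEntity(id: $0) }
    }
    func suggestedEntities() async throws -> [SearchTermEntity] {
        // Terms the app wants Siri to recognize inside the invocation phrase.
        ["news", "sports", "music"].map { SearchTermEntity(id: $0) }
    }
}

struct SearchInAppIntent: AppIntent {
    static var title: LocalizedStringResource = "Search"
    static var openAppWhenRun = true

    @Parameter(title: "Term")
    var term: SearchTermEntity

    func perform() async throws -> some IntentResult { .result() }
}

struct SearchShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SearchInAppIntent(),
            phrases: ["Search \(\.$term) in \(.applicationName)"],
            shortTitle: "Search",
            systemImageName: "magnifyingglass"
        )
    }
}
```

When the set of entities changes at runtime, the app would also call SearchShortcuts.updateAppShortcutParameters() so Siri refreshes the recognized phrases.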
Replies: 5 · Boosts: 0 · Views: 384 · Activity: Sep ’24
iOS 18: Siri not passing string parameters to AppIntents if the string is a question
Xcode Version 16.0 (16A242d), iOS 18, Swift.

There seems to be a behavior change on iOS 18 when using AppShortcuts and AppIntents to pass string parameters. After Siri prompts for a string property's requestValueDialog, if the user makes a statement, the string is passed. If the user's statement is a question, however, the string is not sent to the AppIntent and instead Siri attempts to answer that question.

Example code:

    struct MyAppNameShortcuts: AppShortcutsProvider {
        @AppShortcutsBuilder
        static var appShortcuts: [AppShortcut] {
            AppShortcut(
                intent: AskQuestionIntent(),
                phrases: [
                    "Ask \(.applicationName) a question",
                ]
            )
        }
    }

    struct AskQuestionIntent: AppIntent {
        static var title: LocalizedStringResource = .init(stringLiteral: "Ask a question")
        static var openAppWhenRun: Bool = false
        static var parameterSummary: some ParameterSummary {
            Summary("Search for \(\.$query)")
        }

        @Dependency
        private var apiClient: MockApiClient

        @Parameter(title: "Query", requestValueDialog: .init(stringLiteral: "What would you like to ask?"))
        var query: String

        // perform is not called if the user asks a question such as
        // "What color is the moon?" in response to requestValueDialog.
        // On iOS 17, the same string is passed through.
        @MainActor
        func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
            print("Query is: \(query)")
            let queryResult = try await apiClient.askQuery(queryString: query)
            let dialog = IntentDialog(
                full: .init(stringLiteral: queryResult.answer),
                supporting: .init(stringLiteral: "The answer to \(queryResult.question) is...")
            )
            let view = SiriAnswerView(queryResult: queryResult)
            return .result(dialog: dialog, view: view)
        }
    }

Given the above mock code:

iOS 17:
- Say "Hey Siri, ask (AppName) a question."
- Siri responds "What would you like to ask?"
- Say "What color is the moon?"
- The string "What color is the moon?" is passed to the AppIntent.

iOS 18:
- Say "Hey Siri, ask (AppName) a question."
- Siri responds "What would you like to ask?"
- Say "What color is the moon?"
- Siri answers the question "What color is the moon?" itself.
- Follow the steps above again and instead reply "Moon".
- "Moon" is passed to the AppIntent.

Basically, any interrogative string parameter seems to be intercepted and sent to Siri proper rather than to the provided AppIntent on iOS 18.
Replies: 1 · Boosts: 0 · Views: 474 · Activity: Oct ’24