In this thread, I asked about adding parameters to App Shortcuts. The conclusion that I've drawn so far is that for App Shortcuts, there cannot be any parameters in the prompt, otherwise the system cannot find the AppShortcutsProvider. While this is fine for Shortcuts and non-voice interaction, I'd like to find a way to add parameters to the prompt. Here is the scenario:
My app controls a device that displays some content on "pages." The pages are defined in an AppEnum, which I use for Shortcuts integration via App Intents. The App Intent functions as expected, and is able to change the page based on the user's selection within Shortcuts (or prompted for if using the App Shortcut). What I'd like to do is allow the user to say "Siri, open [page] with [app name]."
So far, the closest I've come to understanding how this works is through the .intentsdefinition file you can create (and SiriKit in general). However, the part that really confused me there is a button in the file editor that says "Convert to App Intent." To me, this means I should be able to use the App Intent I've already authored and hook it into Siri, rather than writing an entirely new function/code block that does exactly the same thing. Ideally, that's what I want to do.
What's the right way to define this behavior?
p.s. If I had to pick an intent schema in the context of AssistantSchemas, I'd say it's closest to the "Open File" one, if that helps. I'd ultimately like to make the "pages" user-customizable so in the long run, that would be what I'd do.
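For context, here's a minimal sketch of the shape I'm after, with hypothetical DisplayPage and OpenPageIntent names (the parameterized phrase is the part in question):

import AppIntents

// Hypothetical sketch, not my shipping code: an AppEnum-backed parameter
// can appear inside an App Shortcut phrase because its values are known
// to the system at build time.
enum DisplayPage: String, AppEnum {
    case home, weather, clock

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Page")
    static var caseDisplayRepresentations: [DisplayPage: DisplayRepresentation] = [
        .home: "Home", .weather: "Weather", .clock: "Clock"
    ]
}

struct OpenPageIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Page"
    static var openAppWhenRun = true

    @Parameter(title: "Page")
    var page: DisplayPage

    func perform() async throws -> some IntentResult {
        // Hypothetical device call: DeviceController.shared.show(page)
        return .result()
    }
}

struct PageShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenPageIntent(),
            phrases: ["Open \(\.$page) with \(.applicationName)"],
            shortTitle: "Open Page",
            systemImageName: "rectangle.on.rectangle"
        )
    }
}

As I understand it, this pattern only works because an AppEnum's values are fixed at compile time, which is exactly the limitation I'd like to get around for user-customizable pages.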
SiriKit
Handle requests for your app's services from users using Siri or Maps.
Posts under the SiriKit tag
When I create a list of six options for disambiguation, option 5 is ignored by Siri. For example, I have the following code to resolve one of my parameters for a custom intent. When this code executes, Siri reads the disambiguation prompt from the Intents Definition and then displays the options shown below. However, if you respond to the prompt with "No" or "No, change it", the response is completely ignored; the resolve method isn't called at all. But if you respond with "Yes, continue" or "Yes, I'm done", the resolve method is called and I can process the chosen option.
Does this seem like a bug, or am I missing something? Does it have anything to do with the Intent Definition for this parameter having "Paginate every ___ items" set to 6?
// Build the disambiguation options in order; behavior is identical to the
// original mutable-array version.
NSArray<NSString *> *options = @[
    [NSString stringWithFormat:@"%@ myList", intent.partsListName], // option 1
    @"option 1",
    @"option 2",
    @"option 3",
    @"Yes, continue",  // option 4
    @"No, change it",  // option 5
    @"Yes, I'm done"   // option 6
];
completion([INStringResolutionResult disambiguationWithStringsToDisambiguate:options]);
Hi,
I'm building an aftermarket solution to enable Apple Maps to support EV routing for any EV.
I am going through the documentation and found some gaps - does anyone know how the following properties work?
INGetCarPowerLevelStatusIntentResponse - consumptionFormulaArguments
INGetCarPowerLevelStatusIntentResponse - chargingFormulaArguments
Is there a working example that anyone has seen?
Many thanks
I have shortcuts up and running, and my custom response has been added to my completion handler since day one.
Recently I upgraded to iOS 18 and found that my app can no longer display the custom response.
When I test the app on iOS 17.6, the custom response displays without any problem.
The situation is exactly like the problem posted in 2018: https://forums.developer.apple.com/forums/thread/109324
Can anyone help, or is anyone seeing the same bug?
Thank you so much!
Happy 2025
We want to add the following to our iOS app:
AirPods announcing push notifications - this is working.
Now we want the user to be able to say "Reply to this" and have the reply sent to that notification, but Siri says this is not supported in our app.
So basically we need the "Listen and respond to messages with AirPods" feature.
Do we need to add any integration inside the app for this, or will it work directly through the Siri settings?
Is it possible to do this in a non-messaging app?
Is it possible to do it without syncing contacts?
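From what we have read so far (not yet confirmed for our case), announced replies require the notification to be a communication notification backed by a donated INSendMessageIntent, plus the Communication Notifications capability. A minimal sketch with placeholder identifiers:

import Intents
import UserNotifications

// Sketch: rewriting an incoming notification as a communication notification
// so AirPods can announce it and offer a spoken reply. Requires the
// "Communication Notifications" capability; the handles and identifiers
// below are placeholders.
func communicationContent(from content: UNNotificationContent,
                          senderName: String) throws -> UNNotificationContent {
    let sender = INPerson(
        personHandle: INPersonHandle(value: "sender-id", type: .unknown),
        nameComponents: nil,
        displayName: senderName,
        image: nil,
        contactIdentifier: nil,
        customIdentifier: nil
    )
    let intent = INSendMessageIntent(
        recipients: nil,
        outgoingMessageType: .outgoingMessageText,
        content: content.body,
        speakableGroupName: nil,
        conversationIdentifier: "conversation-id",
        serviceName: nil,
        sender: sender,
        attachments: nil
    )
    let interaction = INInteraction(intent: intent, response: nil)
    interaction.direction = .incoming
    interaction.donate(completion: nil)

    // Returns a copy of the content updated with the intent's metadata.
    return try content.updating(from: intent)
}

Handling the spoken reply itself would then presumably go through INSendMessageIntentHandling, which is part of what we're asking about.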
If you add a (SiriKit / Widget) intent definition file to an Xcode project and then translate it into another language, the build of the iOS app only works until you close the project. As soon as you open the project again, the next build fails with the following error message:
Unexpected duplicate tasks
A workaround for this bug is to convert the folder (where the intent file is located) into a group in Xcode. After that, everything works without problems.
Steps to reproduce:
Create a new iOS project
Add a localization to the project (German for example)
Add a SiriKit Intent Definition File
Localize the SiriKit Intent Definition File
Build the project (should work without errors)
Close the project
Open the project again
Build the project again
Expected result:
The project builds without problems
Current result:
The project doesn’t build and returns the error: Unexpected duplicate tasks
Is this a known problem? Is there a way to solve this without switching to Xcode groups (instead of folders)?
WWDC videos suggest that existing apps should continue using the old SiriKit domains, such as INPlayMediaIntent. But what about new apps for playing audio? Should we implement Siri functionality for audio playback using the old SiriKit domains, or should we create our own AppEntities and trigger them via custom AudioPlaybackIntent implementations?
Interactive widgets require an AppIntent and don’t support the old INPlayMediaIntent. To achieve the same functionality as the Music app widgets, it seems logical to adopt the new AudioPlaybackIntent. However, I can't find any information about this in the documentation.
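For concreteness, here's what I imagine adopting AudioPlaybackIntent would look like; this is a sketch only, assuming a hypothetical PlayerService wrapper. AudioPlaybackIntent (iOS 16+) signals that the intent starts or modifies audio playback and runs without opening the app:

import AppIntents

// Sketch only: PlayerService is a hypothetical audio engine wrapper.
struct PlayLatestEpisodeIntent: AudioPlaybackIntent {
    static var title: LocalizedStringResource = "Play Latest Episode"

    func perform() async throws -> some IntentResult {
        // Start playback in the background; no app launch required.
        await PlayerService.shared.playLatestEpisode()
        return .result()
    }
}

Whether this is the intended replacement for INPlayMediaIntent in new apps is exactly what I can't find documented.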
My requirement is to open a specific screen of my app when the user says "Start Sleep meditation for 10 minutes", where "Sleep" and "10 minutes" are dynamic values in the phrase. Is it possible to get the values from the phrase alone, or do I need Siri to ask "which meditation?" and then "for how long?"? I am planning to use AppIntent and AppShortcut, along with entities, but I'm unable to open the shortcut when Siri is invoked with the phrase above. A sketch of the shape I'm attempting is below.
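To illustrate (hedged: this is the shape I'm attempting, not verified working), my understanding is that a phrase can embed a parameter only when its values are known to the system, such as an AppEnum, while a free-form value like the duration is collected by a follow-up prompt:

import AppIntents

// Hypothetical sketch: the meditation name rides in the phrase as an AppEnum,
// while the free-form duration is collected by a follow-up Siri prompt.
enum MeditationType: String, AppEnum {
    case sleep, focus, calm

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Meditation")
    static var caseDisplayRepresentations: [MeditationType: DisplayRepresentation] = [
        .sleep: "Sleep", .focus: "Focus", .calm: "Calm"
    ]
}

struct StartMeditationIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Meditation"
    static var openAppWhenRun = true

    @Parameter(title: "Meditation")
    var meditation: MeditationType

    // Free-form values can't live in the phrase itself; Siri asks for them.
    @Parameter(title: "Minutes", requestValueDialog: "For how many minutes?")
    var minutes: Int

    func perform() async throws -> some IntentResult {
        // App-specific navigation to the meditation screen would go here.
        return .result()
    }
}

// Phrase for the AppShortcutsProvider, e.g.:
//   "Start \(\.$meditation) meditation in \(.applicationName)"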
I donate some INPlayMediaIntents to the system and can see them in Control Center. When I tap one of them to play media in the background, the handler doesn't execute the resolve method. I want to resolve some mediaItems in order to suggest a playlist.
I have an app that's capable of playing podcasts via Siri requests, e.g. "Hey Siri, play [Podcast Name]". I’m using INPlayMediaIntentHandling, that is, the SiriKit domain intents, as opposed to the newer AppIntents framework for its ability to select my app for audio playback without the need to specify the name of the app in the user's request to Siri.
This works great overall for the many podcasts I've tested the app with, with the exception of one. There's a podcast called "The Headlines", and when I test the app with the request "Hey Siri, play The Headlines", my app is never selected. Instead, Apple Podcasts begins playback of a show called "NPR News Now".
Oddly, if the Apple Podcasts app is deleted, my app will still not be selected by the system, and instead, Siri responds with "I don’t see an app for that. You’ll need to download one" with a button to open the App Store. Additionally, if I do add the app name to the request using this style of intent, Siri responds with "[App Name] hasn’t added support for that with Siri." However, I’d still like to accomplish this without requiring the app name in the Siri request.
There's nothing complex in my setup:
The target declares one supported intent, INPlayMediaIntent, with "Podcasts" selected as a supported media category.
The Siri entitlement is enabled.
My INSiriAuthorizationStatus is .authorized.
My intent handler is specified in my AppDelegate as follows:
func application(_ application: UIApplication,
                 handlerFor intent: INIntent) -> Any? {
    return IntentHandler.shared
}
My intent handler is simple:
final class IntentHandler: NSObject, INPlayMediaIntentHandling {
    static let shared = IntentHandler()

    func handle(intent: INPlayMediaIntent) async -> INPlayMediaIntentResponse {
        print("IntentHandler: processing intent: \(intent)")
        /** code to start playback based on information found in `intent` **/
    }
}
When requesting Siri to "Play The Headlines", my handler code is not called at all. For all other supported shows, the print statement executes, and playback begins as expected.
Is there any way I can get my app to be selected instead of Apple Podcasts for this request?
Hello everyone,
I'm currently working on an App Intent for my iOS app, and I’ve encountered a frustrating issue related to how Siri prompts for a category selection. Here’s an overview of what I’m dealing with:
extension Category: AppEntity, @unchecked Sendable {
    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Category")

    typealias DefaultQueryType = ShortcutsCategoryQuery
    static var defaultQuery: ShortcutsCategoryQuery = ShortcutsCategoryQuery()
}

struct ShortcutsCategoryQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [Category] {
        let context = await ModelContext(sharedModelContainer)
        let categories = try CategoryDataProvider(context: context).getItems()
        return categories.filter { identifiers.contains($0.id) }
    }

    // Note: entities(matching:) is only consulted by the system when the
    // query conforms to EntityStringQuery rather than plain EntityQuery.
    func entities(matching string: String) async throws -> [Category] {
        return try await suggestedEntities()
    }

    func suggestedEntities() async throws -> [Category] {
        let context = await ModelContext(sharedModelContainer)
        do {
            let categories = try CategoryDataProvider(context: context).getItems()
            if categories.isEmpty {
                print("No categories found.")
            }
            return categories.map { category in
                Category(
                    id: category.id,
                    name: category.name,
                    stringSymbol: category.stringSymbol,
                    symbol: category.symbol,
                    stringColor: category.stringColor,
                    color: category.color
                )
            }
        } catch {
            print(error)
            return []
        }
    }
}
The issue arises when I use Siri to invoke the intent. Siri correctly asks me to select a category but does not display any options unless I say something that Siri recognizes, like "Casa" (House) or "*****" (Test) in Portuguese. Only then does it show the list of available categories.
I would like the categories to appear immediately when Siri asks for a selection. I've already tried refining the ShortcutsCategoryQuery and debugging various parts of my code, but nothing seems to fix this behavior.
It would be good if we could talk to Siri with emojis as well. I'm pretty sure emojis are her native language.
func handle(intent: INSendPaymentIntent, completion: @escaping (INSendPaymentIntentResponse) -> Void) {
    if let _ = intent.payee, let currencyAmount = intent.currencyAmount {
        let userActivity = NSUserActivity(activityType: "com.rapipay.nye.test.failure")
        userActivity.userInfo = ["amount": Int(truncating: currencyAmount.amount!)]
        completion(INSendPaymentIntentResponse(code: .failureRequiringAppLaunch, userActivity: userActivity))
        return
    }
}

func application(_ application: UIApplication, continue userActivity: NSUserActivity, restorationHandler: @escaping ([any UIUserActivityRestoring]?) -> Void) -> Bool {
    print("ddddd")
    return true
}
This code works perfectly on the simulator, and even if I pass userActivity as nil, application(_:continue:restorationHandler:) is called; on a device, however, it is not called.
Siri works, asking for the name and amount; the handler is also called and opens the app, but it does not pass the intent or userActivity.
I am currently working on integrating an app with Siri, adding support for starting VOIP calls and sending messages. Although it is understood it is recommended to use SiriKit for calling and messaging, I would like to allow users to select a profile to use for calling. As far as I am aware the notion of selecting a profile to call from is not something SiriKit supports, therefore, it was decided to go with App Intents to allow for more control over the parameters utilized to start calls.
After integrating VOIP calling with App Intents, I noticed CallKit is not able to start calls when the App Intent is invoked from the background. I get the following error:
Error Domain=com.apple.CallKit.error.requesttransaction Code=6 "(null)"
This seems to correspond to the CXErrorCodeRequestTransactionError invalidAction. This error only happens when the intent is invoked from the background. Changing the App Intent property openAppWhenRun to true solves the issue as it brings the app to foreground before running the intent.
However, I would like to support starting calls from the background to avoid making users unlock their phones prior to starting a call with Siri to make it a truly hands-free experience. I suspect the desired behavior is possible, most likely with SiriKit, as some famous VOIP calling apps (i.e. WhatsApp, Messenger, etc) exhibit the behavior I described. However, is there any way to start calls from the background with App Intents? Or is the desired behavior something exclusive to SiriKit?
I have pasted three code snippets below that can replicate the issue. At the moment I am on Xcode Version 15.3, macOS Sonoma 14.6.1, and testing on iOS 16.6.1
To demonstrate the issue I have created the following CXProviderDelegate:
class CallManager: NSObject, CXProviderDelegate {
    func startCall() {
        let callKitProvider = CXProvider(configuration: CXProviderConfiguration())
        callKitProvider.setDelegate(self, queue: nil)

        let callKitController = CXCallController()
        let recipient = CXHandle(type: .generic, value: "Demo Outgoing Call")
        let uuid = UUID()
        let startCallAction = CXStartCallAction(call: uuid, handle: recipient)
        let transaction = CXTransaction(action: startCallAction)

        callKitController.request(transaction) { error in
            if let error {
                print(error)
            } else {
                print("no errors")
            }
        }

        callKitProvider.reportOutgoingCall(with: uuid, connectedAt: nil)
    }

    func providerDidReset(_ provider: CXProvider) {
        // no-op, not required to demonstrate the issue
    }
}
Then, I have a UIViewController that is the only screen of this example app:
class ViewController: UIViewController {
    @IBOutlet weak var startCallButton: UIButton!

    override func viewDidLoad() {
        super.viewDidLoad()
        startCallButton.addTarget(self, action: #selector(buttonTapped), for: .touchUpInside)
    }

    @objc func buttonTapped() {
        let manager = CallManager()
        manager.startCall()
    }
}
As for app intents, I put together a very simple intent to trigger the start of an outgoing call:
struct StartCall: AppIntent {
    static var title: LocalizedStringResource = "Start Call"
    static var openAppWhenRun = false

    func perform() async throws -> some IntentResult {
        let manager = CallManager()
        manager.startCall()
        return .result()
    }
}
When the UIViewController is presented and I tap the button to start a call, I see the green call banner appear and "no errors" is printed to the console as intended. However, when I open the Shortcuts app and run the app intent, the green banner does not appear and the message Error Domain=com.apple.CallKit.error.requesttransaction Code=6 "(null)" is printed to the console.
My app is no longer displayed as a target for Shortcuts app actions.
It was displayed until iOS 17.7.
I tried iOS 18.0 and 18.0.1, but it does not appear.
Shortcuts that were created earlier still work fine under iOS 18.
Only INPlayMediaIntent is supported, targeting iOS 15 and above, so no extension is used and the intent is handled directly in the app.
Is anyone else suffering from the same problem?
Hello,
I have written the following app intent and I can access it via Shortcuts, but I can't get Siri to pick it up. I want it to have a dynamic book title (which could be anything) so that the user can say "Add (book name) to my (app name)". I need it to work from iOS 17.1 onwards. I have added Siri as a capability for my iOS app.
import AppIntents

struct AddBookToReadingListIntent: AppIntent {
    static var title: LocalizedStringResource = "Add my Book"

    @Parameter(title: "Book Title", requestValueDialog: "What's the title of the book you want to add?")
    var bookTitle: String

    static var parameterSummary: some ParameterSummary {
        Summary("Add my '\(\.$bookTitle)'")
    }

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        return .result(value: "Added '\(bookTitle)' to your app")
    }
}

struct AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AddBookToReadingListIntent(),
            phrases: [
                "Add \(\.$bookTitle) in \(.applicationName)"
            ],
            shortTitle: "Add Book to app name",
            systemImageName: "book"
        )
    }
}
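One detail I'm unsure about, based on my reading of the docs rather than anything authoritative: phrase parameters like \(\.$bookTitle) seem to work only for types whose values the system already knows, i.e. an AppEnum or an AppEntity whose values are published via updateAppShortcutParameters(); a free-form String can be collected through requestValueDialog but apparently can't be embedded in the trigger phrase itself. A sketch of the entity-backed variant, with hypothetical names:

import AppIntents

// Hypothetical BookEntity: phrase parameters must resolve to values the
// system can recognize ahead of time.
struct BookEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Book")
    static var defaultQuery = BookQuery()

    var id: String
    var title: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }
}

struct BookQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [BookEntity] {
        // Hypothetical lookup into the app's book library.
        []
    }

    func suggestedEntities() async throws -> [BookEntity] {
        // Values returned here become recognizable in phrases once the
        // provider's updateAppShortcutParameters() has been called.
        []
    }
}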
I'm trying to add Siri support to my app for sending voice messages. I've implemented INSendMessageIntentHandling in my main app target.
It looks like it's getting as far as recording the voice message and passing my intent handler an INSendMessageIntent with an audio attachment, but I'm not able to read the attachment file.
func handle(
    intent: INSendMessageIntent,
    completion: @escaping (INSendMessageIntentResponse) -> Void
) {
    if let attachment = intent.attachments?.first,
       let audioFile = attachment.audioMessageFile,
       let fileURL = audioFile.fileURL
    {
        // This branch runs
        // fileURL is "file:///var/mobile/tmp/SiriMessages/89F738F7-6092-439A-B4FA-2DD9A99F0EED.caf"
        let result = processMessageAudio(url: fileURL)
        completion(result)
        return
    }

    // This line isn't reached
    completion(.init(code: .failure, userActivity: nil))
}
private func processMessageAudio(url: URL) -> INSendMessageIntentResponse {
    var fileRef: ExtAudioFileRef?

    if url.startAccessingSecurityScopedResource() {
        logDebug("File access allowed")
    } else {
        // This branch runs
        logDebug("File access not allowed")
    }

    defer {
        url.stopAccessingSecurityScopedResource()
    }

    let openStatus = ExtAudioFileOpenURL(url as CFURL, &fileRef)
    // openStatus is -54 (kAudio_FilePermissionError)

    return INSendMessageIntentResponse(code: .failure, userActivity: nil)
}
I'm not sure what I'm missing. It looks like there should be an audio file, and Siri shows a preview of the audio for confirmation.
Hi All,
Requirement: "Search (placeholder) in (myApp)".
When the user speaks this string, Siri should open the app and pass the placeholder.
This worked for me only when I used an AppEnum (with a specific, predefined set of values) together with an AppEntity.
I want the placeholder to be dynamic and not defined via the AppEnum.
I have observed this feature working fine with the YouTube, Spotify, and WhatsApp apps.
Is there anything else that these apps add specifically to make this work? (See the sketch below for the mechanism I suspect.)
Also, in these apps' Siri settings there is a toggle named "Use with Ask Siri".
Could someone please help me understand how this option is enabled?
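My current guess, based on the documented App Shortcuts flow rather than any knowledge of those apps: dynamic values can appear in a phrase when they come from an AppEntity whose values the app re-donates whenever they change. A fragment illustrating the re-donation step, where SearchShortcuts is a hypothetical AppShortcutsProvider whose phrase embeds an entity parameter (e.g. "Search \(\.$term) in \(.applicationName)"):

import AppIntents

// Re-donating after the underlying data changes is what lets Siri
// recognize new entity values inside the phrase.
func searchTermsDidChange() {
    SearchShortcuts.updateAppShortcutParameters()
}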
Xcode Version 16.0 (16A242d)
iOS 18 - Swift
There seems to be a behavior change on iOS 18 when using AppShortcuts and AppIntents to pass string parameters. After Siri prompts for a string property's requestValueDialog, if the user makes a statement, the string is passed. If the user's statement is a question, however, the string is not sent to the AppIntent; instead, Siri attempts to answer that question.
Example Code:
struct MyAppNameShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskQuestionIntent(),
            phrases: [
                "Ask \(.applicationName) a question"
            ]
        )
    }
}

struct AskQuestionIntent: AppIntent {
    static var title: LocalizedStringResource = .init(stringLiteral: "Ask a question")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Search for \(\.$query)")
    }

    @Dependency
    private var apiClient: MockApiClient

    @Parameter(title: "Query", requestValueDialog: .init(stringLiteral: "What would you like to ask?"))
    var query: String

    // perform is not called if the user asks a question such as
    // "What color is the moon?" in response to requestValueDialog.
    // On iOS 17, the same string is passed through.
    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("Query is: \(query)")
        let queryResult = try await apiClient.askQuery(queryString: query)
        let dialog = IntentDialog(
            full: .init(stringLiteral: queryResult.answer),
            supporting: .init(stringLiteral: "The answer to \(queryResult.question) is...")
        )
        let view = SiriAnswerView(queryResult: queryResult)
        return .result(dialog: dialog, view: view)
    }
}
Given the above mock code:
iOS 17:
Say "Hey Siri"
Say "Ask (AppName) a question"
Siri responds "What would you like to ask?"
Say "What color is the moon?"
The string "What color is the moon?" is passed to the AppIntent
iOS 18:
Say "Hey Siri"
Say "Ask (AppName) a question"
Siri responds "What would you like to ask?"
Say "What color is the moon?"
Siri answers the question "What color is the moon?" itself
Follow the above steps again and instead reply "Moon"
"Moon" is passed to the AppIntent
Basically, any interrogative string parameter seems to be intercepted and handled by Siri proper rather than passed to the provided AppIntent on iOS 18.
We were using a custom UI with the built-in sendPayment intent. Before iOS 17.0, the final handle function would trigger the custom UI's configureView method to show the final success UI. However, after iOS 17.0 this is not working: the final handle call no longer invokes configureView in the UI extension. Please suggest a fix for this issue, if one exists. A sketch of the hook in question is below.
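For reference, a minimal sketch of the Intents UI hook being described (the standard INUIHostedViewControlling shape; the response inspection is illustrative, not our production code):

import IntentsUI

// Sketch of the Intents UI extension entry point in question. On iOS 17,
// our report is that this is no longer called for the final handle response.
class IntentViewController: UIViewController, INUIHostedViewControlling {
    func configureView(
        for parameters: Set<INParameter>,
        of interaction: INInteraction,
        interactiveBehavior: INUIInteractiveBehavior,
        context: INUIHostedViewContext,
        completion: @escaping (Bool, Set<INParameter>, CGSize) -> Void
    ) {
        // interaction.intentHandlingStatus distinguishes the confirm phase
        // from the final handle phase; the success UI would be built here.
        let size = extensionContext?.hostedViewMaximumAllowedSize ?? .zero
        completion(true, parameters, size)
    }
}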