Apple Intelligence

Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.

Posts under Apple Intelligence tag

92 Posts

Spotlight results | AppShortcut with AppEntity parameter vs CSSearchableItem.associateAppEntity
I've been exploring the Trails Sample App from this session at WWDC24. The app has a TrailEntity of type AppEntity which is leveraged in multiple places throughout the app, including:

- The GetTrailInfo App Intent, with a trail parameter of type TrailEntity.
- A parameterized App Shortcut which calls the GetTrailInfo intent.

The TrailDataManager's init calls updateSpotlightIndex(), which creates a CSSearchableItem for each Trail in the app, along with an associateAppEntity call linking the corresponding TrailEntity to each item that gets added to the CSSearchableIndex.

If you build the app and search "trails" in Spotlight, the Trails Sample App section includes instances of TrailEntity as search results. But if you comment out the App Shortcut that takes a TrailEntity as a parameter and rebuild, there are no instances of TrailEntity in the search results. In both cases, the console prints [Spotlight] Trails indexed by Spotlight.

Is this expected behavior? Why do the TrailEntity instances only appear in Spotlight via the App Shortcut? Shouldn't the CSSearchableItem instances show up in Spotlight on their own regardless? If not, what is the purpose of adopting Core Spotlight with App Entities? Does this add the app entities to the semantic index for "new Siri", even though they're not user-facing in the Spotlight UI?
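For context, a minimal sketch of the indexing pattern described above, assuming a simplified TrailEntity and hypothetical field names (this is illustrative, not the sample's actual code):

    import AppIntents
    import CoreSpotlight
    import UniformTypeIdentifiers

    // Simplified stand-in for the sample's TrailEntity. IndexedEntity is the
    // App Intents protocol that ties an AppEntity to Core Spotlight.
    struct TrailEntity: IndexedEntity {
        static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Trail")
        static var defaultQuery = TrailQuery()

        var id: UUID
        var name: String

        var displayRepresentation: DisplayRepresentation {
            DisplayRepresentation(title: "\(name)")
        }
    }

    struct TrailQuery: EntityQuery {
        func entities(for identifiers: [TrailEntity.ID]) async throws -> [TrailEntity] { [] }
    }

    // Hypothetical re-creation of updateSpotlightIndex(): index each trail
    // and associate the app entity with its searchable item.
    func updateSpotlightIndex(trails: [TrailEntity]) {
        let items = trails.map { trail -> CSSearchableItem in
            let attributes = CSSearchableItemAttributeSet(contentType: .content)
            attributes.displayName = trail.name
            let item = CSSearchableItem(uniqueIdentifier: trail.id.uuidString,
                                        domainIdentifier: "trails",
                                        attributeSet: attributes)
            // Attach the app entity so the system can reason about the item.
            item.associateAppEntity(trail, priority: 0)
            return item
        }
        CSSearchableIndex.default().indexSearchableItems(items) { error in
            if let error { print("[Spotlight] indexing failed: \(error)") }
        }
    }
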
Replies: 0 · Boosts: 0 · Views: 106 · Activity: 3d

The "right" way to add parameters to Siri voice operations
In this thread, I asked about adding parameters to App Shortcuts. The conclusion I've drawn so far is that for App Shortcuts, there cannot be any parameters in the prompt, otherwise the system cannot find the AppShortcutsProvider. While this is fine for Shortcuts and non-voice interaction, I'd like to find a way to add parameters to the prompt.

Here is the scenario: My app controls a device that displays some content on "pages." The pages are defined in an AppEnum, which I use for Shortcuts integration via App Intents. The App Intent functions as expected, and is able to change the page based on the user selection within Shortcuts (or prompted if using the App Shortcut). What I'd like to do is allow the user to be able to say "Siri, open <page> with <app name>."

So far, the closest I've come to understanding how this works is through the .intentdefinition file you can create (and SiriKit in general); however, the part that really confused me there is a button in the file editor that says "Convert to App Intent." To me, this means that I should be able to use the App Intent I've already authored and hook that into Siri, rather than making an entirely new function/code block that does exactly the same thing. Ideally, that's what I want to do. What's the right way to define this behavior?

p.s. If I had to pick an intent schema in the context of AssistantSchemas, I'd say it's closest to the "Open File" one, if that helps. I'd ultimately like to make the "pages" user-customizable, so in the long run, that would be what I'd do.
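For what it's worth, parameters backed by an AppEnum can be embedded directly in App Shortcut phrases, which is the documented route to "Siri, open <page> with <app name>". A minimal sketch with hypothetical names (this is the pattern to compare against, not a guaranteed fix for the issue described above):

    import AppIntents

    // Hypothetical pages enum; each case becomes a spoken variant of the phrase.
    enum DevicePage: String, AppEnum {
        case home, settings, weather

        static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Page")
        static var caseDisplayRepresentations: [DevicePage: DisplayRepresentation] = [
            .home: DisplayRepresentation(title: "Home"),
            .settings: DisplayRepresentation(title: "Settings"),
            .weather: DisplayRepresentation(title: "Weather")
        ]
    }

    struct OpenPageIntent: AppIntent {
        static var title: LocalizedStringResource = "Open Page"

        @Parameter(title: "Page")
        var page: DevicePage

        func perform() async throws -> some IntentResult {
            // Hypothetical: tell the device to display the requested page here.
            return .result()
        }
    }

    struct MyAppShortcuts: AppShortcutsProvider {
        static var appShortcuts: [AppShortcut] {
            AppShortcut(
                intent: OpenPageIntent(),
                phrases: [
                    // AppEnum parameters are permitted inside phrases, so
                    // "open Settings with <app>" resolves without a follow-up prompt.
                    "Open \(\.$page) with \(.applicationName)"
                ],
                shortTitle: "Open Page",
                systemImageName: "doc.text"
            )
        }
    }
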
Replies: 2 · Boosts: 0 · Views: 321 · Activity: 1w

Conforming an existing AppIntent to the photos domain schema
I have an image-based app with albums, except in my app, albums are known as galleries. When I tried to conform my existing OpenGalleryIntent with @AssistantIntent(schema: .photos.openAlbum), I had to rename my existing gallery parameter to target in order to fit the predefined shape of this domain.

Previously, my intent was configured to display as “Open Gallery” with the description “Opens the selected Gallery” in the Shortcuts app. After conforming to the photos domain, it displays as “Open Album” with the description “Opens the Provided Album”. Shortcuts is ignoring my configured title and description now. My code builds, but with the following warnings:

    Parameter argument title of a required Assistant schema intent parameter target should not be overridden
    Implementation of the property title of an AppIntent conforming to AssistantSchemaIntent should not be overridden
    Implementation of the property description of an AppIntent conforming to AssistantSchemaIntent should not be overridden

Is my only option to change the concept of a Gallery inside of my app into an Album? I don't want to do this... Conceptually, my app aligns well with what this domain does, but I didn't consider that conforming to the shape of an AI schema intent would also dictate exactly how it's presented to the user. FB16283840
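For reference, a rough sketch of the kind of conformance being described, assuming an existing GalleryEntity elsewhere in the app that conforms to the photos album entity schema (the entity is omitted here, and the OpenIntent conformance is an assumption based on the WWDC24 examples):

    import AppIntents

    // The schema macro fixes the intent's shape: the parameter must be named
    // `target`, and the system supplies the "Open Album" title and description
    // shown in Shortcuts, overriding any custom ones.
    @available(iOS 18.1, *)
    @AssistantIntent(schema: .photos.openAlbum)
    struct OpenGalleryIntent: OpenIntent {
        var target: GalleryEntity

        @MainActor
        func perform() async throws -> some IntentResult {
            // Hypothetical: navigate to the selected gallery here.
            return .result()
        }
    }
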
Replies: 0 · Boosts: 0 · Views: 226 · Activity: 1w

AssistantIntent system.search behaviour
Given that iOS 18.2 is out, and following the documentation and WWDC example (limited to iOS 18.2+), I am attempting to use @AssistantIntent(schema: .system.search) along with an AppIntent.

Questions: Has anyone made this work on a real device? In my case (code below): when I run the intent from Shortcuts or Siri, it does NOT open the app but only calls the perform method (and the app is not foregrounded) -- changing openAppWhenRun has no effect! Strangely: if my app was backgrounded before invocation and I foreground it after, it has navigated to Search but just not foregrounded the app! Am I doing anything wrong? (Adding @Parameter etc. doesn't change anything.)

Where is the intelligence here? The criteria parameter can NOT be used in the Siri phrase -- build error if you try that, since only AppEntity/AppEnum is permitted as a variable in a Siri phrase, not a StringSearchCriteria. Said otherwise: what's the gain in using @AssistantIntent(schema: .system.search) vs a regular AppIntent in this case?

Some code:

    @available(iOS 18.2, *)
    @AssistantIntent(schema: .system.search)
    struct MySearchIntent: ShowInAppSearchResultsIntent {
        static let searchScopes: [StringSearchScope] = [.general]
        static let openAppWhenRun = true

        var criteria: StringSearchCriteria

        @MainActor
        func perform() async throws -> some IntentResult {
            NavigationHandler().to(.search(.init(query: criteria.term)), from: .siri)
            return .result()
        }
    }

Along with this shortcut in AppShortcutsProvider:

    AppShortcut(
        intent: MySearchIntent(),
        phrases: [
            "Search \(.applicationName)"
        ],
        shortTitle: "Search",
        systemImageName: "magnifyingglass"
    )
Replies: 0 · Boosts: 0 · Views: 121 · Activity: 1w

Error downloading the Predictive Code Completion Model
Hi, I'm setting up Xcode, and after updating macOS to 15.2 and Xcode to 16.2, I cannot download the Predictive Code Completion Model. I've done some research and I haven't found any solution, especially since most people suggest that it randomly fixed itself, which for me hasn't been the case. I've tried restarting my Mac and uninstalling and reinstalling Xcode multiple times; I've also reinstalled macOS from Recovery and haven't found any success (I'm also not running Xcode inside a VM, as I've seen this can cause problems in this case).

This is the error that I receive when I attempt to download the model, and here are the details:

    The operation couldn’t be completed. (IDELanguageModelKit.IDEModelDownloadAdapter.(unknown context at $11eba9a90).DownloadError error 3.)
    Domain: IDELanguageModelKit.IDEModelDownloadAdapter.(unknown context at $11eba9a90).DownloadError
    Code: 3
    User Info: {
        DVTErrorCreationDateKey = "2024-12-26 23:09:25 +0000";
    }
    --
    There was an error processing the asset.
    Domain: IDELanguageModelKit.IDEModelDownloadAdapter.(unknown context at $11eba9a90).DownloadError
    Code: 3
    --
    System Information
    macOS Version 15.2 (Build 24C101)
    Xcode 16.2 (23507) (Build 16C5032a)
    Timestamp: 2024-12-26T17:09:25-06:00

For the record, I'm a new developer and I'm still learning, so thank you very much!
Replies: 1 · Boosts: 1 · Views: 250 · Activity: 3w

Apple Intelligence crashed/stopped working
Hi everyone, I’m currently using macOS Version 15.3 Beta (24D5034f), and I’m encountering an issue with Apple Intelligence. The image generation tools seem to work fine, but everything else shows a message saying that it’s “not available at this time.” I’ve tried restarting my Mac and double-checked my settings, but the problem persists. Is anyone else experiencing this issue on the beta version? Are there any fixes or settings I might be overlooking? Any help or insights would be greatly appreciated! Thanks in advance!
Replies: 1 · Boosts: 1 · Views: 313 · Activity: Dec ’24

Crash with UITextView intelligenceCollectContent Apple Intelligence
Hello, I am using Xcode 15.3 and haven't added any support for Apple Intelligence yet, as it requires Xcode 16+. But I see many crash logs related to Apple Intelligence; I assume the crash is related to the UITextView element of UIKit. Can you please help me resolve this issue? How can I fix it? Here are a few lines of the crash log; they are all similar:

    Thread 0 name:
    Thread 0 Crashed:
    0    UIKitCore               0x0000000196692d74 specialized UITextView._intelligenceCollectContent(in:collector:) + 2748 (UITextView_IntelligenceSupport.swift:37)
    1    UIKitCore               0x0000000196691eb0 @objc UITextView._intelligenceCollectContent(in:collector:) + 60 (<compiler-generated>:17)
    2    UIIntelligenceSupport   0x000000026b91f1e4 UIIntelligenceElementCollector.performCollection(_:) + 424 (UIIntelligenceElementCollector.swift:46)
    3    UIKitCore               0x0000000196473b70 specialized UIView._intelligenceElement(in:using:transformToRoot:) + 1440 (UIView_IntelligenceSupport.swift:132)
    4    UIKitCore               0x000000019647a800 specialized UIView._intelligenceCollectElement(for:in:using:transformToRoot:) + 424 (UIView_IntelligenceSupport.swift:84)
    5    UIKitCore               0x00000001964792ec @objc UIView._intelligenceCollectElement(for:in:using:transformToRoot:) + 136 (<compiler-generated>:77)
    6    UIKitCore               0x0000000196472c20 closure #1 in UIView._intelligenceCollectSubelements(in:using:transformToRoot:) + 256 (UIView_IntelligenceSupport.swift:55)
    7    UIIntelligenceSupport   0x000000026b91f728 UIIntelligenceElementCollector.performElementCollection(_:) + 424 (UIIntelligenceElementCollector.swift:61)
    8    UIKitCore               0x00000001964728f4 UIView._intelligenceCollectSubelements(in:using:transformToRoot:) + 928 (UIView_IntelligenceSupport.swift:54)
    9    UIKitCore               0x0000000196473354 @objc UIView._intelligenceCollectSubelements(in:using:transformToRoot:) + 136 (<compiler-generated>:0)
    10   UIKitCore               0x0000000196479810 closure #3 in UIView._intelligenceElement(in:using:transformToRoot:) + 256 (UIView_IntelligenceSupport.swift:165)
    11   UIIntelligenceSupport   0x000000026b91fb54 UIIntelligenceElementCollector.performElementArrayCollection(_:) + 144 (UIIntelligenceElementCollector.swift:78)
    ...
    225  UIIntelligenceSupport   0x000000026b8ee630 closure #1 in IntelligenceCollectionListener.collectFragments(_:) + 192 (IntelligenceCollectionListener.swift:60)
    226  UIIntelligenceSupport   0x000000026b91b138 thunk for @escaping @callee_guaranteed () -> () + 36
    227  CoreFoundation          0x000000019376b6e4 __CFRUNLOOP_IS_CALLING_OUT_TO_A_BLOCK__ + 28 (CFRunLoop.c:1818)
    228  CoreFoundation          0x0000000193759910 __CFRunLoopDoBlocks + 356 (CFRunLoop.c:1860)
    229  CoreFoundation          0x00000001937595f4 __CFRunLoopRun + 2432 (CFRunLoop.c:3217)
    230  CoreFoundation          0x0000000193758830 CFRunLoopRunSpecific + 588 (CFRunLoop.c:3434)
    231  GraphicsServices        0x00000001df7381c4 GSEventRunModal + 164 (GSEvent.c:2196)
    232  UIKitCore               0x00000001962beeb0 -[UIApplication _run] + 816 (UIApplication.m:3844)
Replies: 3 · Boosts: 0 · Views: 344 · Activity: Dec ’24

Making onscreen content available to Siri not requesting my Transferable
Howdy, I'm following along with this sample: https://developer.apple.com/documentation/appintents/making-onscreen-content-available-to-siri-and-apple-intelligence

I've got everything up and building. I can confirm that the userActivity modifier is associating my App Intent via EntityIdentifier, but my custom Transferable representation (text) is never being called, and when Siri is doing the ChatGPT handoff, it's just offering to send a screenshot, which is what it does when it has no custom representation. What could I be doing wrong? Where should I be looking?
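For comparison, the pattern the sample is built around is an AppEntity that also conforms to Transferable, so Siri can request a custom representation instead of a screenshot. A minimal sketch with hypothetical names (the NSUserActivity/EntityIdentifier association is assumed to be wired up separately, as in the sample):

    import AppIntents
    import CoreTransferable
    import UniformTypeIdentifiers

    // Hypothetical document entity whose text should reach Siri/ChatGPT.
    struct DocumentEntity: AppEntity, Transferable {
        static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Document")
        static var defaultQuery = DocumentQuery()

        var id: UUID
        var title: String
        var fullText: String

        var displayRepresentation: DisplayRepresentation {
            DisplayRepresentation(title: "\(title)")
        }

        // The custom representation Siri should ask for during the handoff.
        static var transferRepresentation: some TransferRepresentation {
            DataRepresentation(exportedContentType: .plainText) { entity in
                Data(entity.fullText.utf8)
            }
        }
    }

    struct DocumentQuery: EntityQuery {
        func entities(for identifiers: [DocumentEntity.ID]) async throws -> [DocumentEntity] { [] }
    }
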
Replies: 3 · Boosts: 0 · Views: 396 · Activity: Dec ’24

iOS 18.2 Heavy Delays in Notifications with AI enabled
I am experiencing heavy delays with all push notifications when Apple Intelligence is enabled. Authenticator notifications take minutes to show up, and the doorbell app, which needless to say requires real-time notifications, fails to deliver them in time. I have had to turn off Apple Intelligence in order to have a functioning phone... (I'm on a 16 Pro.) Any resolution on this? Thanks
Replies: 2 · Boosts: 1 · Views: 416 · Activity: Dec ’24

Does Apple Intelligence Extensions Have an API?
Hi everyone, on the "Apple Intelligence & Siri" settings screen there's a section titled "Extensions" that specifically mentions ChatGPT. This got me curious: does Apple provide an API or SDK for developers to create custom integrations or use Apple Intelligence Extensions? Or is this currently limited to the Apple/OpenAI partnership? I appreciate any insights or links to relevant documentation. Here's a screenshot of what I mean: https://imgur.com/a/4MuQkIJ
Replies: 0 · Boosts: 1 · Views: 370 · Activity: Dec ’24

Image Playground not available for "Designed for iPad" apps?
I'm currently trying to add support for Image Playground to our apps. It seems that it's not working in an app that is "Designed for iPad" and runs on a Mac. The modal just shows a spinner and the following is logged to the console:

    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    GP extension could not be loaded: Extension (platform: 2) could not be found (in update)
    dealloc Query controller [C32BA176-6A3E-465D-B3C5-0F8D91068B89]

ImagePlaygroundViewController.isAvailable returns true, however. In a "real" Mac Catalyst app, it's working. Just not when the app is actually an iPad app. Is this a bug?
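For anyone comparing setups, a minimal availability-guarded presentation looks like the sketch below (function and parameter names are hypothetical). Note that the report above says isAvailable returns true even in the failing "Designed for iPad" case, so this check alone evidently doesn't catch it:

    import UIKit
    import ImagePlayground

    @available(iOS 18.1, *)
    func presentImagePlayground(from presenter: UIViewController) {
        // Guard on system support before presenting the sheet.
        guard ImagePlaygroundViewController.isAvailable else {
            // Fall back to another image source on unsupported configurations.
            return
        }
        let controller = ImagePlaygroundViewController()
        presenter.present(controller, animated: true)
    }
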
Replies: 3 · Boosts: 2 · Views: 656 · Activity: Dec ’24

sourceImageURL in imagePlaygroundSheet isn't optional
I can't shake the "I don't think I did this correctly" feeling about a change I'm making for Image Playground support.

When you create an image via an Image Playground sheet, it returns a URL pointing to where the image is temporarily stored. Just like the Image Playground app, I want the user to be able to decide to edit that image more. The Image Playground sheet lets you pass in a source URL for an image to start with, which is perfect, because I could pass in the URL of that temp image. But the URL is NOT optional. So what do I populate it with when the user is starting from scratch?

A friendly AI told me to use URL(string: "")!, but that crashes when it gets force-unwrapped. URL(string: "about:blank")! seems to work, in that it is ignored (and doesn't crash) when I have the user create the initial image (which shouldn't have a source image). This feels super clunky to me. Am I overlooking something?
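One possible workaround is to branch on whether a source image exists and only use the sourceImageURL overload when it does. This is a sketch under the assumption that the sheet modifier also has an overload without a source image; all names are hypothetical and the exact signatures may differ:

    import SwiftUI
    import ImagePlayground

    // Chooses between the two sheet overloads so no placeholder URL is needed.
    struct PlaygroundPresenter: ViewModifier {
        @Binding var isPresented: Bool
        @Binding var sourceImageURL: URL?   // last generated image, if any

        func body(content: Content) -> some View {
            if let url = sourceImageURL {
                // Editing pass: seed the sheet with the previous result.
                content.imagePlaygroundSheet(isPresented: $isPresented,
                                             sourceImageURL: url) { newURL in
                    sourceImageURL = newURL
                }
            } else {
                // First pass: no source image at all, no dummy URL needed.
                content.imagePlaygroundSheet(isPresented: $isPresented) { newURL in
                    sourceImageURL = newURL
                }
            }
        }
    }
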
Replies: 1 · Boosts: 0 · Views: 252 · Activity: 2w

Image Playground Error: Cannot find protocol declaration for 'ImageGenerationViewControllerDelegate'
    @available(macCatalyst 18.1, *)
    @available(iOS 18.1, *)
    extension CKImageSelectionManager: ImagePlaygroundViewController.Delegate {
        public func imagePlaygroundViewController(_ imagePlaygroundViewController: ImagePlaygroundViewController, didCreateImageAt imageURL: URL) {
        }

        func presentImagePlayground() {
            let imagePlaygroundVC = ImagePlaygroundViewController()
            // Set delegate to self to receive the callback
            imagePlaygroundVC.delegate = self
            imagePlaygroundVC.isModalInPresentation = true // Prevents dismissal with swipe if needed
            self.delegate?.presentImageSelectionViewController(imagePlaygroundVC)
        }
    }

This generates an error in the Xcode-generated Swift header.
Replies: 3 · Boosts: 0 · Views: 623 · Activity: Dec ’24