I tried this:
struct CarShortcutsProvider: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: LockCarIntent(),
            phrases: ["Lock my car with \(.applicationName)", "Lock my \(\.$car) with \(.applicationName)"],
            shortTitle: LocalizedStringResource("Lock Car"),
            systemImageName: "lock.fill"
        )
        AppShortcut(
            intent: UnlockCarIntent(),
            phrases: ["Unlock my car with \(.applicationName)", "Unlock my \(\.$car) with \(.applicationName)"],
            shortTitle: LocalizedStringResource("Unlock Car"),
            systemImageName: "lock.open.fill"
        )
    }
}
but Siri only understands "Unlock my car", not the phrase with the placeholder. Siri then asks me for the car and understands my answer, but it doesn't work in one sentence. Is there something wrong with my code?
Also, I tried it without \(.applicationName) first, and then it didn't work with Siri at all. Is this a general limitation of App Intents? I thought the goal was to reduce friction; if the user has to mention the app name every time, that adds friction.
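For context, this is roughly how I understand the intent and its car parameter need to be declared for the parameterized phrase to work (a trimmed sketch; CarEntity, CarQuery, and the suggested values are illustrative stand-ins, not my exact code):

import AppIntents

struct CarEntity: AppEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Car"
    static var defaultQuery = CarQuery()

    var id: String
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct CarQuery: EntityQuery {
    func entities(for identifiers: [CarEntity.ID]) async throws -> [CarEntity] {
        identifiers.map { CarEntity(id: $0, name: $0) }
    }

    // Siri can only match "\(\.$car)" in a phrase against values it already
    // knows about, so the query suggests them up front.
    func suggestedEntities() async throws -> [CarEntity] {
        [CarEntity(id: "model3", name: "Model 3"), CarEntity(id: "golf", name: "Golf")]
    }
}

struct LockCarIntent: AppIntent {
    static var title: LocalizedStringResource = "Lock Car"

    @Parameter(title: "Car")
    var car: CarEntity

    func perform() async throws -> some IntentResult {
        // Lock the selected car here.
        .result()
    }
}

From what I've read, if the set of cars changes at runtime you also need to call CarShortcutsProvider.updateAppShortcutParameters() so Siri picks up the current values, though I'm not sure whether that's related to my issue.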
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
It's been 4-5 days now and my Image Playground has been showing "Downloading Support for Image Playground".
Hey, has anyone figured out how the “Persons” list in Genmoji/Playground actually works?
I’ve had a strange experience so far. When I first got access during Beta 2, the list randomly included about 10–15 people, even though my photo library contains many more recognizable faces. To try fixing this, I started naming faces in the Photos app, hoping they’d be added to the Genmoji/Playground list, but nothing changed.
Then, after updating to Beta 3, it added just 2–3 of the people I had named. Encouraged, I spent about an hour naming all the faces in my library. But a few hours later, the list unexpectedly removed around 10 people, leaving me with fewer than I had initially.
I’ve also read that leaving the phone locked and plugged into power should help sort people in the library, but that hasn’t worked for me yet.
Anyone else experienced this or found a way to make it work? Thanks!
Hi everyone,
Could someone confirm if it's currently possible, or if there are any plans, to restrict users from enabling Apple Intelligence altogether?
I understand that we can block individual features using MDM, but I'm interested in knowing if we can prevent users from toggling Apple Intelligence on and off in System Settings entirely.
Thanks!
Kind Regards,
Filipe Nogueira
I noticed that the ChatGPT is listed as an "Extension" in the Apple Intelligence settings on iOS 18.2 beta. Does this mean developers will be able to create their own extensions? Or will this be limited to larger companies to incorporate their own models into Apple Intelligence?
Hello, we're investigating an option to disable writing tools for some customers in our app. I'm aware of the writingToolsBehavior property for UITextView etc, but we would like to have a way to set this globally without having to update all UITextView instances (or future instances). Is there any API to do this?
We tried using UITextView.appearance.writingToolsBehavior = .none and it seemed promising on 18.2 beta, however it introduced crashes on devices running 18.1.
The crashes look like:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Have you sent -setWritingToolsBehavior: to <UITextView: 0x14462c000; frame = (0 0; 0 0); text = ''; userInteractionEnabled = NO; gestureRecognizers = <NSArray: 0x30067cb40>; backgroundColor = UIExtendedGrayColorSpace 0 0; layer = <CALayer: 0x3009b1ba0>; contentOffset: {0, 0}; contentSize: {0, 0}; adjustedContentInset: {0, 0, 0, 0}> off the main thread? To verify, look for a complaint in the logs: "Unsupported use of UIKit…", and fix the problem if you find it. If your use is main-thread only please file a radar on UIKit, and attach this log. exercisedImplementations = {
"setWritingToolsBehavior:" = (
);
}'
Similarly, even on 18.2 beta if we used UITextField.appearance.writingToolsBehavior = .none we would see crashes for any search fields like:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Have you sent -setWritingToolsBehavior: to <UISearchBarTextField: 0x141c04a00; frame = (0 0; 0 0); text = ''; opaque = NO; gestureRecognizers = <NSArray: 0x301fe15c0>; placeholder = Search Leads; borderStyle = RoundedRect; background = <_UITextFieldSystemBackgroundProvider: 0x3015de960: backgroundView=<_UISearchBarSearchFieldBackgroundView: 0x141c60200; frame = (0 0; 0 0); opaque = NO; autoresize = W+H; userInteractionEnabled = NO; layer = <CALayer: 0x3015de8e0>>, fillColor=(null), textfield=<UISearchBarTextField: 0x141c04a00>>; layer = <CALayer: 0x3015de240>> off the main thread? To verify, look for a complaint in the logs: "Unsupported use of UIKit…", and fix the problem if you find it. If your use is main-thread only please file a radar on UIKit, and attach this log. exercisedImplementations = {
"setWritingToolsBehavior:" = (
);
}'
Is it possible to set this globally?
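In the meantime, the direction we're leaning toward is gating the appearance call by OS version so devices that crash on the selector never receive it (a sketch only, assuming the 18.1 crash really is version-specific; we haven't verified this covers every case):

import UIKit

func applyGlobalWritingToolsWorkaround() {
    // Only touch the appearance proxy on versions where it didn't crash for us.
    if #available(iOS 18.2, *) {
        UITextView.appearance().writingToolsBehavior = .none
        // Deliberately not touching UITextField.appearance() because of the
        // UISearchBarTextField crash described above.
    }
}

That still leaves search fields uncovered, so a supported global API would be preferable if one exists.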
My Playground app has been stuck at "Downloading support for Image Playground" for about 4-5 days now. I've reinstalled the app, restarted the phone multiple times, and reset network settings 2-3 times. I've tried LTE/5G and different Wi-Fi networks, and it still sits showing "Downloading Support".
Anyone else having this issue?
How long does it usually take to get access to Image Playground? It's been about a week since I got the iOS 18.2 public beta and I'm still waiting for access. When I got Apple Intelligence, it only took a few hours.
I requested early access a few days ago, and when I check it still says "we will notify you when it is ready". Can Apple please fix this problem with Image Playground?
I got access, but then I opened the app and it said it was still waiting. I was very sad.
Woke up to a notification saying Playground, Genmoji, etc. were ready, but every time I try to use them it says early access was requested. Has anyone else had this issue? If so, how did you fix it?
I would like to explore the option of restricting system prompts to share information with ChatGPT on my company's laptops. Is this possible, and if so, how can it be implemented? Should this be centralized, or should each user handle it individually?
I have been on the 18.2 beta since it launched and I cannot seem to get the Playground; it keeps saying I've requested access. Why? Please help.
Apple Intelligence is not working at all. It was working OK under 18.1, but nothing under 18.2?
It has been hours and there is still no access to Image Playground.
I have been attempting to get early access to image creation (iPad, iOS 18.2) and it has taken days. I still don't have it and feel I never will. Someone help!
I am working to add Spotlight indexing for my app entities as discussed in WWDC24's video "What's New in App Intents".
That video goes over the IndexedEntity protocol and the integration with Spotlight via CSSearchableItemAttributeSet.
What I'm seeing though does not match the video. In the video, the presenter goes through the sort of progressive approach you can take to getting this data into Spotlight starting with the basics and then expanding to include more support depending on how much the developer wants to do.
What I'm seeing is that if you conform to IndexedEntity, your entities will appear in Spotlight using the name derived from
public var displayRepresentation: DisplayRepresentation
So, that works. Name appears... BUT the next part of the video goes into how to expand your implementation with more metadata for Spotlight via CSSearchableItemAttributeSet. The issue I'm seeing is that once that's implemented, the items disappear from Spotlight, almost like that implementation is overriding the base implementation in a way that no longer functions.
My expectation is that an item with custom attributes would use them in Spotlight as appropriate, not disappear from search, i.e. what's shown in the video should work.
I've got a sample project here:
https://hanchor.s3.amazonaws.com/misc/IndexingTest.zip
To reproduce with the sample:
1. Build and run. Indexing is set up in the init() method, so it will just run.
2. Go to Spotlight and search for 'Huntersblau', a string included in the content set. At this point you should see a result - good!
3. Stop the app, go back, and uncomment the var attributeSet: CSSearchableItemAttributeSet implementation in IndexingTestApp.swift. This will provide custom attributes to Spotlight.
4. Repeat steps 1 and 2 - you'll see that it no longer appears in the search results: when CSSearchableItemAttributeSet is implemented, the item drops out of Spotlight.
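For anyone who doesn't want to grab the zip, the shape of the conformance is roughly this (a trimmed sketch; the entity name, properties, and attribute values are illustrative rather than copied from the sample, and the actual indexing call in init() is not shown):

import AppIntents
import CoreSpotlight
import UniformTypeIdentifiers

struct TrailEntity: AppEntity, IndexedEntity {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Trail"
    static var defaultQuery = TrailQuery()

    var id: UUID
    var name: String
    var region: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    // Step 3 above: uncommenting the equivalent of this property is what
    // makes the items drop out of Spotlight in my testing.
    var attributeSet: CSSearchableItemAttributeSet {
        let attributes = CSSearchableItemAttributeSet(contentType: .content)
        attributes.displayName = name
        attributes.keywords = [region]
        return attributes
    }
}

struct TrailQuery: EntityQuery {
    func entities(for identifiers: [TrailEntity.ID]) async throws -> [TrailEntity] { [] }
}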
Image Playground Error: Cannot find protocol declaration for 'ImageGenerationViewControllerDelegate'
@available(macCatalyst 18.1, *)
@available(iOS 18.1, *)
extension CKImageSelectionManager: ImagePlaygroundViewController.Delegate {
    public func imagePlaygroundViewController(_ imagePlaygroundViewController: ImagePlaygroundViewController, didCreateImageAt imageURL: URL) {
    }

    func presentImagePlayground() {
        let imagePlaygroundVC = ImagePlaygroundViewController()
        // Set delegate to self to receive the callback
        imagePlaygroundVC.delegate = self
        imagePlaygroundVC.isModalInPresentation = true // Prevents dismissal with swipe if needed
        self.delegate?.presentImageSelectionViewController(imagePlaygroundVC)
    }
}
This generates an error in the Xcode-generated Swift header.
Before I updated to 18.2, I was missing Apple Intelligence features such as ChatGPT integration. Now that I've updated to 18.2, I'm still missing Apple Intelligence and ChatGPT integration altogether, and my Siri has turned back into the old Siri. What can I do?
I updated to iOS 18.2 beta 2, and with it I am able to see behind the "early access requested" screen just by swiping down, but when I do swipe down, the app closes. Does this mean I'm close to getting access?