Apple Intelligence is now available in macOS 15.1 and iOS 18.1 (beta). However, it is currently not supported on visionOS, even though Vision Pro runs on M2 silicon with 16GB of unified memory. I want to enable Apple Intelligence for my visionOS app.
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
Posts under Apple Intelligence tag
I'm trying to cast the error thrown by TranslationSession.translations(from:) as Translation.TranslationError. However, the app crashes at runtime whenever Translation.TranslationError is used in the project.
Environment:
iOS Version: 18.1 beta
Xcode Version: 16 beta
dyld[14615]: Symbol not found: _$s11Translation0A5ErrorVMa
Referenced from: <3426152D-A738-30C1-8F06-47D2C6A1B75B> /private/var/containers/Bundle/Application/043A25BC-E53E-4B28-B71A-C21F77C0D76D/TranslationAPI.app/TranslationAPI.debug.dylib
Expected in: /System/Library/Frameworks/Translation.framework/Translation
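For reference, a workaround sketch that avoids referencing the concrete error type at runtime, assuming the missing symbol is limited to Translation.TranslationError's type metadata: bridge the thrown error to NSError and inspect its domain and code instead of casting.

import Translation

// Sketch: handle translation failures without naming
// Translation.TranslationError, whose type metadata appears to be
// missing on this beta. Bridging to NSError avoids demangling it.
@available(iOS 18.0, *)
func translate(_ requests: [TranslationSession.Request],
               in session: TranslationSession) async {
    do {
        let responses = try await session.translations(from: requests)
        for response in responses {
            print(response.targetText)
        }
    } catch {
        // Inspect the bridged error instead of casting to the concrete type.
        let nsError = error as NSError
        print("Translation failed: \(nsError.domain) code \(nsError.code)")
    }
}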
I installed macOS 15.1 on my 14" MacBook Pro (M2 Pro), but Apple Intelligence doesn't appear.
Hello Everyone,
I want to get into the Apple Intelligence space, working on LLMs and other AI models that can run on the edge. I am en route to getting a Mac for this work, but until then, is there any way I can do development and contribute using a non-Apple machine? I have an NVIDIA-equipped laptop running Windows for other dev purposes.
I'm not comfortable yet installing the iOS beta on my iPhone 15 - is it possible to play with Apple Intelligence in the simulator, or is it on-device only?
Hello everyone,
I am reaching out because (as you know) Apple recently sent out an email about an upcoming in-person event in Cupertino. I purchased my plane ticket and tried to reserve my spot by August 9th (as requested), but I was unable to do so. I reached out to support but have not heard anything back.
I'm trying to test an Assistant Schema intent with the Shortcuts app. However, unlike in the WWDC video (https://developer.apple.com/videos/play/wwdc2024/10133/), my intent conforming to the Assistant Schema does not appear when I search for 'AssistantSchema'. If it doesn't appear here, does that mean the intent will not work with Siri?
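For reference, a minimal intent conforming to the system search schema, modeled on that session; the struct name and the perform body are hypothetical placeholders:

import AppIntents

// Minimal sketch of an Assistant Schema intent, modeled on the WWDC
// session; the perform body is a placeholder.
@available(iOS 18.0, *)
@AssistantIntent(schema: .system.search)
struct SearchInAppIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Route criteria.term into the app's existing search UI here.
        return .result()
    }
}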
I need to show usage of AI to pass beta app review, but as I understand it, AI is still not available to everyone. Also, I have an iPhone 12, on which AI features are not supported at all, and my game, made in the Unity engine, doesn't even have anything to summarize. What am I supposed to do?
I'm using beta 2 of Xcode 16 on an M1 MacBook with 32 GB of memory, running macOS 15 beta 2. It didn't appear that predictive code completion was working as exhibited in the developer videos, so I tried to figure out what's going on. The Xcode documentation mentions that you can disable predictive code completion in Settings, so I checked there. The checkbox to turn it on is disabled. When I click the "I" button to its right, it tells me that "Predictive code completion is not supported in this region." I am in the US with my system set to US English. What do I have to do to be able to experience this feature?
As per the subject: is it possible to include Genmoji in push notifications at all? The WWDC video only talks about communication notifications. There doesn't seem to be any update to UNNotificationContent to allow setting attributed strings.
If your app makes use of communication notifications, you can even include Genmoji and other image glyphs in your notifications with the new UNNotificationAttributedMessageContext API. For push notifications, the payload just needs to contain a rich text representation that may contain image glyphs.
We recommend that you use a Notification Service Extension to parse the rich text, download assets, create the attributed body, and update the notification content.
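A sketch of that recommended flow, assuming a hypothetical "rich-text" payload key; the UNNotificationAttributedMessageContext initializer shown here is inferred from the description above, not confirmed against the SDK:

import UserNotifications
import Intents

// Sketch of the recommended Notification Service Extension flow. The
// "rich-text" payload key is hypothetical, and the context initializer
// is assumed from the session description.
class NotificationService: UNNotificationServiceExtension {
    override func didReceive(
        _ request: UNNotificationRequest,
        withContentHandler contentHandler: @escaping (UNNotificationContent) -> Void
    ) {
        guard let richText = request.content.userInfo["rich-text"] as? String else {
            contentHandler(request.content)
            return
        }
        // 1. Parse the rich text and download any image-glyph assets,
        //    then build the attributed body from them.
        let attributedBody = NSAttributedString(string: richText)

        // 2. Populate the message intent as for any communication
        //    notification (sender, conversation identifier, and so on).
        let intent = INSendMessageIntent()

        // 3. Assumed API shape: wrap the attributed body in the new
        //    context and update the notification content from it.
        let context = UNNotificationAttributedMessageContext(
            sendMessageIntent: intent,
            attributedContent: attributedBody
        )
        if let updated = try? request.content.updating(from: context) {
            contentHandler(updated)
        } else {
            contentHandler(request.content)
        }
    }
}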
IIRC, the new Photos highlight videos pick your best photos using Apple Intelligence.
Is there an API or metadata available to use AI to pick the best photo out of, e.g., 5 similar photos?
iOS 18 adds a specific macro for exposing your search app intent, app entities, etc., to Siri, but how are you meant to adopt it on your existing objects without removing them entirely for pre-iOS 18 users?
For example, I get the following error:
AssistantIntent(schema:) is only available in iOS 18 or newer. Add @available attribute to enclosing struct.
I don't want to do that, since I still want to support iOS 17 users with my existing shortcuts. Do I need to duplicate my entire shortcuts model to add the new macro?
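One workaround sketch (an assumption on my part, not a documented pattern): keep the existing intent for iOS 17 and add a thin, availability-gated duplicate that adopts the schema and forwards to the same logic, since the macro can't be applied conditionally. All names here are hypothetical.

import AppIntents

// Existing intent that keeps serving iOS 17 users unchanged.
struct FindItemsIntent: AppIntent {
    static let title: LocalizedStringResource = "Find Items"

    @Parameter(title: "Search Text")
    var searchText: String

    func perform() async throws -> some IntentResult {
        // Existing shared search implementation goes here.
        return .result()
    }
}

// Availability-gated duplicate that adopts the assistant schema and
// forwards to the same shared implementation.
@available(iOS 18.0, *)
@AssistantIntent(schema: .system.search)
struct AssistantFindItemsIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    var criteria: StringSearchCriteria

    func perform() async throws -> some IntentResult {
        // Forward to the same shared search implementation.
        return .result()
    }
}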
From https://www.apple.com/newsroom/2024/06/introducing-apple-intelligence-for-iphone-ipad-and-mac/:
Powered by Apple Intelligence, Siri becomes more deeply integrated into the system experience. With richer language-understanding capabilities, Siri is more natural, more contextually relevant, and more personal, with the ability to simplify and accelerate everyday tasks.
From https://developer.apple.com/apple-intelligence/:
Siri is more natural, more personal, and more deeply integrated into the system. Apple Intelligence provides Siri with enhanced action capabilities, and developers can take advantage of pre-defined and pre-trained App Intents across a range of domains to not only give Siri the ability to take actions in your app, but to make your app’s actions more discoverable in places like Spotlight, the Shortcuts app, Control Center, and more. SiriKit adopters will benefit from Siri’s enhanced conversational capabilities with no additional work. And with App Entities, Siri can understand content from your app and provide users with information from your app from anywhere in the system.
Based on this, as well as the video at https://developer.apple.com/videos/play/wwdc2024/10133/, my understanding is that in order for Siri to be able to execute tasks in applications, those applications must adopt the App Intents (or SiriKit) APIs.
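For concreteness, adoption along those lines would look roughly like this minimal sketch (all names here are hypothetical):

import AppIntents

// Hypothetical example of the App Intents adoption described above:
// an action that Siri and Shortcuts can invoke directly.
struct OpenOrderStatusIntent: AppIntent {
    static let title: LocalizedStringResource = "Check Order Status"
    static let openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Navigate to the order-status screen in the app here.
        return .result(dialog: "Here's your latest order status.")
    }
}

// Registering phrases makes the action discoverable to Siri, Spotlight,
// and the Shortcuts app.
struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenOrderStatusIntent(),
            phrases: ["Check my order in \(.applicationName)"],
            shortTitle: "Order Status",
            systemImageName: "shippingbox"
        )
    }
}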
Can someone at Apple please clarify: will it be possible for Siri or some other aspect of Apple Intelligence / Core ML / Create ML to take actions in applications which do not support these APIs (e.g. web apps, Citrix apps, legacy apps)?
Thank you!