I'm excited about the development possibilities presented by Apple Intelligence and have begun imagining retrieval-augmented generation (RAG) use cases. Writing Tools suggests that this is possible, but I have not seen any direct statements by Apple regarding the use of AFMs (Apple foundation models) for RAG applications. Have any references to APIs or sample code for RAG applications been published?
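As far as I can tell, nothing RAG-specific has been published for third-party developers yet. The retrieval half can be prototyped today with the public NaturalLanguage framework, though. A minimal sketch, using NLEmbedding's built-in sentence embeddings as a stand-in for any future AFM embedding API (rankPassages is a made-up name):

import NaturalLanguage

// Rank candidate passages against a query by cosine similarity of
// sentence embeddings. This is ordinary on-device NLP, not an
// Apple Intelligence API; it only sketches the retrieval step of RAG.
func rankPassages(query: String, passages: [String]) -> [(String, Double)] {
    guard let embedding = NLEmbedding.sentenceEmbedding(for: .english),
          let queryVector = embedding.vector(for: query) else { return [] }

    func cosine(_ a: [Double], _ b: [Double]) -> Double {
        let dot = zip(a, b).map(*).reduce(0, +)
        let norm = (a.map { $0 * $0 }.reduce(0, +) * b.map { $0 * $0 }.reduce(0, +)).squareRoot()
        return norm > 0 ? dot / norm : 0
    }

    return passages
        .compactMap { passage in
            embedding.vector(for: passage).map { (passage, cosine(queryVector, $0)) }
        }
        .sorted { $0.1 > $1.1 }
}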
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
Can vector embeddings be used in a SwiftData Model?
If yes, are there resources available to learn more about it, or at least a guided walkthrough of how to make it work?
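One approach that should work today, since SwiftData persists Codable value types: store the vector as a plain [Float] attribute and compute similarity in memory after fetching. A minimal sketch; Document is a hypothetical model:

import SwiftData

// A minimal sketch, assuming the embedding is stored as a [Float]
// attribute. SwiftData persists Codable value types, so the vector can
// live alongside the text it represents.
@Model
final class Document {
    var text: String
    var embedding: [Float]

    init(text: String, embedding: [Float]) {
        self.text = text
        self.embedding = embedding
    }
}

// SwiftData has no native vector index, so similarity must be computed
// in memory over fetched candidates.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    let dot = zip(a, b).map(*).reduce(0, +)
    let normProduct = (a.map { $0 * $0 }.reduce(0, +) * b.map { $0 * $0 }.reduce(0, +)).squareRoot()
    return normProduct > 0 ? dot / normProduct : 0
}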
Every time I click "Join the waitlist," there is no reaction.
My phone just redirects me back to this page.
I have Apple Intelligence working fine with all the features on my iPhone 16 Pro Max, except for the summary feature and email summarizing; they don't work unless I turn a VPN on. Why is that? I am not living in the EU, and all the other Apple Intelligence features work; only this one won't work unless I have my VPN on.
Can anyone help me with this?
Why is it taking so long to download Apple Intelligence on my iPhone 15 Pro Max?
How long will it take to download?
I use 5G; it has been downloading for 4 weeks now and is still going.
Why does Apple sell the iPhone 16 with AI that won't work in Europe? You do not sell a car without an engine. Would it not have been better to wait before bringing it to Europe?
I don't see the Apple Intelligence tab, even with iOS 18.1, language and region set to United States, and Siri in English (US)… what can I do?
Hey guys,
If your download is still stuck at 99% even after switching Wi-Fi on and off, I believe I have a fix.
Turn your cellular data completely off. It will force the device to use Wi-Fi only.
Then the download will restart and actually complete.
What are the possible KPI requirements set by Apple Intelligence for cellular networks, e.g. regarding latency, throughput, or jitter?
What is the expected effect on iPhone energy consumption?
Trying to experiment with Genmoji per the WWDC documentation and samples, but I don't seem to get the Genmoji keyboard.
I see this error in my log:
Received port for identifier response: <(null)> with error:Error Domain=RBSServiceErrorDomain Code=1 "Client not entitled" UserInfo={RBSEntitlement=com.apple.runningboard.process-state,
NSLocalizedFailureReason=Client not entitled, RBSPermanent=false}
elapsedCPUTimeForFrontBoard couldn't generate a task port
Is anything presently supported for developers? All I have done here is a simple app with a UITextView and this code:
textView.supportsAdaptiveImageGlyph = true
Any thoughts?
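For comparison, this is the minimal UIKit setup I would expect to surface Genmoji; a sketch assuming an editable text view on a device with Apple Intelligence enabled. Note the entry point lives inside the emoji keyboard rather than as a separate keyboard:

import UIKit

// A minimal sketch: Genmoji arrives through the emoji keyboard as
// NSAdaptiveImageGlyph content, so the text view must be editable and
// must opt in via supportsAdaptiveImageGlyph (iOS 18+).
class GenmojiViewController: UIViewController {
    let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.isEditable = true
        textView.supportsAdaptiveImageGlyph = true // opt in to Genmoji insertion
        view.addSubview(textView)
    }
}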
I’m currently developing an app that features a main view with a UITableView. When users select a row, they are navigated to a detail view that contains a UITextField. This UITextField already supports Writing Tools.
My question is: when a user long-presses a UITableView cell, is it possible to add a Writing Tools option to the context menu, allowing users to interact with Writing Tools more conveniently? For example, to summarize the cell's detail text.
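I'm not aware of a public API that presents Writing Tools directly from a context menu. The closest workaround I can sketch is a custom action that routes the user to the detail view and focuses its text field, where the system affordance is available; MyTableViewController and openDetailAndBeginEditing(at:) below are hypothetical names:

import UIKit

// A hedged workaround sketch: the custom "Summarize" action does not
// invoke Writing Tools itself; it navigates to the editable field where
// the system Writing Tools UI can be used.
extension MyTableViewController {
    func tableView(_ tableView: UITableView,
                   contextMenuConfigurationForRowAt indexPath: IndexPath,
                   point: CGPoint) -> UIContextMenuConfiguration? {
        UIContextMenuConfiguration(identifier: nil, previewProvider: nil) { _ in
            let summarize = UIAction(title: "Summarize",
                                     image: UIImage(systemName: "text.append")) { _ in
                // Hypothetical helper: push the detail view and begin editing
                // so the user can invoke Writing Tools on the UITextField.
                self.openDetailAndBeginEditing(at: indexPath)
            }
            return UIMenu(children: [summarize])
        }
    }
}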
I'm trying to disable Writing Tools for a specific TextField using .writingToolsBehavior(.disabled), but when running the app on my iPhone 16 Pro with Apple Intelligence enabled, I can still use Writing Tools on the text box. I also see no difference with .writingToolsBehavior(.limited).
Is there something I'm doing wrong or is this a bug?
Sample code below:
import SwiftUI

struct ContentView: View {
    @State var text = ""

    var body: some View {
        VStack {
            TextField("Enter Text", text: $text)
                .writingToolsBehavior(.disabled)
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
I signed up for Apple Intelligence on my iPad Air M1 and then updated my phone today. It tells me that I've already been put on a waitlist I didn't even join, and it's been stuck on that for 2 days now.
I have an iPhone 15 Pro Max on iOS 18.1. I tried to join Apple Intelligence about three weeks ago, but I'm still stuck on the waitlist. I've already tried everything recommended by Apple, including changing my region and Siri's language to English (US). Can anyone help me figure out how to solve this issue?
Xcode Version 16.0 (16A242d)
iOS 18 - Swift
There seems to be a behavior change on iOS 18 when using AppShortcuts and AppIntents to pass string parameters. After Siri prompts with a string parameter's requestValueDialog, if the user makes a statement, the string is passed. If the user's statement is a question, however, the string is not sent to the AppIntent; instead, Siri attempts to answer that question.
Example Code:
import AppIntents

struct MyAppNameShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskQuestionIntent(),
            phrases: [
                "Ask \(.applicationName) a question",
            ]
        )
    }
}

struct AskQuestionIntent: AppIntent {
    static var title: LocalizedStringResource = .init(stringLiteral: "Ask a question")
    static var openAppWhenRun: Bool = false

    static var parameterSummary: some ParameterSummary {
        Summary("Search for \(\.$query)")
    }

    @Dependency
    private var apiClient: MockApiClient

    @Parameter(title: "Query", requestValueDialog: .init(stringLiteral: "What would you like to ask?"))
    var query: String

    // perform is not called if the user asks a question such as
    // "What color is the moon?" in response to requestValueDialog.
    // On iOS 17, the same string is passed through.
    @MainActor
    func perform() async throws -> some IntentResult & ProvidesDialog & ShowsSnippetView {
        print("Query is: \(query)")
        let queryResult = try await apiClient.askQuery(queryString: query)
        let dialog = IntentDialog(
            full: .init(stringLiteral: queryResult.answer),
            supporting: .init(stringLiteral: "The answer to \(queryResult.question) is...")
        )
        let view = SiriAnswerView(queryResult: queryResult)
        return .result(dialog: dialog, view: view)
    }
}
Given the above mock code:
iOS 17:
1. Say "Hey Siri."
2. Say "Ask (AppName) a question."
3. Siri responds "What would you like to ask?"
4. Say "What color is the moon?"
5. The string "What color is the moon?" is passed to the AppIntent.

iOS 18:
1. Say "Hey Siri."
2. Say "Ask (AppName) a question."
3. Siri responds "What would you like to ask?"
4. Say "What color is the moon?"
5. Siri answers the question "What color is the moon?" itself.
6. Follow the above steps again and instead reply "Moon."
7. "Moon" is passed to the AppIntent.
Basically, on iOS 18 any interrogative string parameter seems to be intercepted and handled by Siri proper rather than passed to the provided AppIntent.
Yesterday, after updating to iOS 18.1, I joined the Apple Intelligence waitlist on my iPhone 15 Pro. About an hour later I noticed the message "Support for processing Apple Intelligence on device is downloading." A day later it is still displaying the same message. I have strong Wi-Fi, I'm plugged in to power with a full battery, and there are 750 GB available in storage. From what I have been able to find online, this isn't the typical user experience, and the process probably isn't going to complete at this point. Any advice on how to proceed and get Apple Intelligence installed and working would be greatly appreciated.
I just updated to iOS 18.1 beta 4, and the only Apple Intelligence features I can use are the new Siri and photo cleanup. The rest are unavailable. Apple Intelligence takes up 589 MB on my iPhone.
Hey, just curious if Apple Intelligence will be available on the iPhone 15 Plus as well in October, or is there a way that iPhone 15 Plus owners can join the Apple Intelligence waitlist or something? Please let me know! 😫
I need to add the Image Playground to my iOS app with UIKit. WWDC 2024 introduced a new Image Playground API, but I haven't found any official documentation yet, so how can I add it?
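For what it's worth, the UIKit entry point appears to be ImagePlaygroundViewController in the ImagePlayground framework (iOS 18.1+). A sketch, with the exact delegate names being my best reading of the SDK rather than documented usage, so verify against the headers:

import UIKit
import ImagePlayground

// A hedged sketch of presenting the system Image Playground sheet.
@available(iOS 18.1, *)
class PlaygroundHostViewController: UIViewController, ImagePlaygroundViewController.Delegate {

    func presentImagePlayground() {
        let playground = ImagePlaygroundViewController()
        playground.delegate = self
        playground.concepts = [.text("a cat wearing a space suit")] // optional seed concept
        present(playground, animated: true)
    }

    func imagePlaygroundViewController(_ imagePlaygroundViewController: ImagePlaygroundViewController,
                                       didCreateImageAt imageURL: URL) {
        // The generated image is delivered as a temporary file URL.
        dismiss(animated: true)
    }

    func imagePlaygroundViewControllerDidCancel(_ imagePlaygroundViewController: ImagePlaygroundViewController) {
        dismiss(animated: true)
    }
}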
As a user, when viewing a photo or image, I want to be able to tell Siri, "add this to [app]," similar to the example from the WWDC presentation where a photo is added to a note in the Notes app.
Is this... possible with app domains as they are documented?
I see domains like open-file and open-photo, but I don't know if those are appropriate for this kind of functionality.
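I don't know either whether the documented domains cover this, but for comparison, here is what a plain AppIntent accepting an image looks like. CollectionStore is a hypothetical helper, and whether Siri can route the currently viewed photo into such an intent is exactly the open question:

import AppIntents
import UniformTypeIdentifiers

// A hedged sketch of a plain AppIntent that accepts an image file.
// This does not use the assistant schema domains, so it doesn't answer
// whether open-file/open-photo apply.
struct AddPhotoToCollectionIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Photo to Collection"

    @Parameter(title: "Photo", supportedContentTypes: [.image])
    var photo: IntentFile

    func perform() async throws -> some IntentResult {
        try await CollectionStore.shared.add(imageData: photo.data) // hypothetical persistence
        return .result()
    }
}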