Hi everyone,
I was wondering how accurate the Hand Pose Classification ML is. For example: is it possible to recognize the different letters of the sign language alphabet, or is it only capable of recognizing simple poses like a thumbs-up?
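For context on how this usually fits together: Vision detects the hand joint positions, and a Create ML hand pose (or hand action) classifier is then trained on those keypoints, so whether it can separate the letters of a fingerspelling alphabet depends mostly on how distinct the poses are and on your training data rather than on a fixed list of built-in poses. Below is a minimal sketch of the detection step; the MyHandPoseClassifier model referenced in the trailing comment is hypothetical, not something the SDK provides.

import Vision
import CoreGraphics

// Detect hand landmarks in a single image. A Create ML hand pose classifier
// would then consume the observation's keypoints to decide which pose
// (e.g. a fingerspelled letter) is being shown.
func detectHandPose(in cgImage: CGImage) throws -> VNHumanHandPoseObservation? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    return request.results?.first
}

// Hypothetical follow-up step with a custom Create ML model:
// let poses = try observation.keypointsMultiArray()
// let label = try MyHandPoseClassifier(configuration: .init()).prediction(poses: poses).label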
The documentation for translationTask(source:target:action:) says it should translate when content appears, but this isn't happening. I’m only able to translate when I manually associate that task with a configuration, and instantiate the configuration.
Here’s the complete source code:
import SwiftUI
import Translation

struct ContentView: View {
    @State private var originalText = "The orange fox jumps over the lazy dog"
    @State private var translationTaskResult = ""
    @State private var translationTaskResult2 = ""
    @State private var configuration: TranslationSession.Configuration?

    var body: some View {
        List {
            // THIS DOES NOT WORK
            Section {
                Text(translationTaskResult)
                    .translationTask { session in
                        Task { @MainActor in
                            do {
                                let response = try await session.translate(originalText)
                                translationTaskResult = response.targetText
                            } catch { print(error) }
                        }
                    }
            }
            // THIS WORKS
            Section {
                Text(translationTaskResult2)
                    .translationTask(configuration) { session in
                        Task { @MainActor in
                            do {
                                let response = try await session.translate(originalText)
                                translationTaskResult2 = response.targetText
                            } catch { print(error) }
                        }
                    }
                Button(action: {
                    if configuration == nil {
                        configuration = TranslationSession.Configuration()
                        return
                    }
                    configuration?.invalidate()
                }) { Text("Translate") }
            }
        }
    }
}
How can I automatically translate a given text when it appears using the new translationTask API?
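One workaround, based on the pattern that does work above: create the TranslationSession.Configuration eagerly (for example as the initial value of the @State property), so the configured translationTask fires as soon as the view appears instead of waiting for a button press. A minimal sketch; treat it as an assumption rather than documented behavior of the no-configuration variant.

import SwiftUI
import Translation

struct AutoTranslateView: View {
    @State private var originalText = "The orange fox jumps over the lazy dog"
    @State private var translatedText = ""
    // Created up front, so the task below runs when the view appears.
    @State private var configuration: TranslationSession.Configuration? = TranslationSession.Configuration()

    var body: some View {
        Text(translatedText.isEmpty ? originalText : translatedText)
            .translationTask(configuration) { session in
                Task { @MainActor in
                    do {
                        let response = try await session.translate(originalText)
                        translatedText = response.targetText
                    } catch { print(error) }
                }
            }
    }
}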
iOS 18 App Intents while supporting iOS 17
Hello,
I have an existing app that supports iOS 17. I already have three App Intents but would like to add some of the new iOS 18 app intents like ShowInAppSearchResultsIntent.
However, I am having a hard time using #available or @available to limit this ShowInAppSearchResultsIntent to iOS 18 only while still supporting iOS 17.
Obviously, the ShowInAppSearchResultsIntent needs to use @AssistantIntent, which is iOS 18 only, so I mark that struct as @available(iOS 18, *). That works as expected. The trouble starts when I need to add this "SearchSnippetIntent" intent to the AppShortcutsProvider. See the code below:
struct SnippetsShortcutsAppShortcutsProvider: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        // iOS 17+
        AppShortcut(intent: SnippetsNewSnippetShortcutsAppIntent(), phrases: [
            "Create a New Snippet in \(.applicationName) Studio",
        ], shortTitle: "New Snippet", systemImageName: "rectangle.fill.on.rectangle.angled.fill")
        AppShortcut(intent: SnippetsNewLanguageShortcutsAppIntent(), phrases: [
            "Create a New Language in \(.applicationName) Studio",
        ], shortTitle: "New Language", systemImageName: "curlybraces")
        AppShortcut(intent: SnippetsNewTagShortcutsAppIntent(), phrases: [
            "Create a New Tag in \(.applicationName) Studio",
        ], shortTitle: "New Tag", systemImageName: "tag.fill")
        // iOS 18 only
        AppShortcut(intent: SearchSnippetIntent(), phrases: [
            "Search \(.applicationName) Studio",
            "Search \(.applicationName)"
        ], shortTitle: "Search", systemImageName: "magnifyingglass")
    }

    let shortcutTileColor: ShortcutTileColor = .blue
}
The iOS 18-only AppShortcut produces the following error, but none of the suggested fixes seem to work. Maybe I am going about it the wrong way.
'SearchSnippetIntent' is only available in iOS 18 or newer
Add 'if #available' version check
Add @available attribute to enclosing static property
Add @available attribute to enclosing struct
Thanks in advance for your help.
When using App Shortcuts with App Intents, Siri only responds to the first shortcut defined in the AppShortcutsProvider below.
struct MementoShortcuts: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: SaveLinkIntent(),
            phrases: [
                "Add a link to \(.applicationName)",
                "Add \(\.$url) to \(.applicationName)",
                "Make a new link in \(.applicationName)",
                "Create a new link in \(.applicationName) from \(\.$url)"
            ],
            shortTitle: "Add Link",
            systemImageName: "link.badge.plus"
        )
        AppShortcut(
            intent: LinkViewedIntent(),
            phrases: [
                "Mark a link I saved in \(.applicationName) as viewed",
                "Mark \(\.$link) as viewed in \(.applicationName)",
                "Set link in \(.applicationName) to viewed",
                "Change status of \(\.$link) to viewed in \(.applicationName)",
            ],
            shortTitle: "Mark Link as Viewed",
            systemImageName: "book"
        )
    }
}
I have tried switching the order, and Siri always uses the one that comes first. Both show up in the Shortcuts app as app shortcuts, but only one is recognized by Siri, even when I say the other one's phrase.
I am very new to App Intents and I am trying to add them to my On Device LLM ChatBot app so my users can get answers to any questions anywhere in iOS.
I have the following code and it is working wonderfully in the Shortcuts app.
import AppIntents

struct AskAi: AppIntent {
    static var openAppWhenRun: Bool = false
    static let title: LocalizedStringResource = "Ask Ai About"
    static let description = "Gets an answer from Ai for your question."

    @Parameter(title: "Question")
    var question: String

    static var parameterSummary: some ParameterSummary {
        Summary("Ask Ai About \(\.$question)")
    }

    @MainActor
    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        let bot: Bot = Bot()
        await bot.respond(to: self.question)
        return .result(
            value: bot.output
        )
    }
}

class AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskAi(),
            phrases: [
                "Ask \(.applicationName) \(\.$question)",
                "Get \(.applicationName) answer for \(\.$question)",
                "Open \(\.$question) using \(.applicationName) ",
                "Using \(.applicationName) get help with \(\.$question)"
            ],
            shortTitle: "Ask Ai",
            systemImageName: "sparkles"
        )
    }
}
I can create a shortcut for this AppIntent, and that lets me have the response spoken.
I can invoke my shortcut on iOS 18 Beta 1 by the shortcut name I set in the Shortcuts app, and that works.
It does not work at all when I simply ask Siri using any of the phrases I have defined.
The Info.plist has an app name alias defined, just to be sure.
I even added the Siri capability in Xcode-beta.
I also tried using the ProvidesDialog return type.
Whatever I do, the AppIntent is invisible to Siri.
Siri tries to search the web, looks for my app name in Contacts, or gives an error about Apple Cash, which has nothing to do with what I was asking.
Is there anything else I am missing for setting up iOS AppIntents to work with Siri?
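One thing worth checking, offered as an assumption rather than a confirmed diagnosis: App Shortcut phrases can generally only interpolate parameters whose values Siri knows ahead of time (AppEnum or AppEntity backed), so having the free-form String parameter \(\.$question) in every phrase may keep them from being recognized. A sketch with plain phrases follows; Siri can still prompt for the Question parameter once the intent runs.

class AppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: AskAi(),
            // No free-form parameter inside the phrases themselves.
            phrases: [
                "Ask \(.applicationName)",
                "Ask \(.applicationName) a question"
            ],
            shortTitle: "Ask Ai",
            systemImageName: "sparkles"
        )
    }
}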
I'm playing with the new Vision API for iOS 18, specifically with the new CalculateImageAestheticsScoresRequest API.
When I try to perform the image observation request I get this error:
internalError("Error Domain=NSOSStatusErrorDomain Code=-1 \"Failed to create espresso context.\" UserInfo={NSLocalizedDescription=Failed to create espresso context.}")
The code is pretty straightforward:
if let image = image {
    let request = CalculateImageAestheticsScoresRequest()
    Task {
        do {
            let cgImg = image.cgImage!
            let observations = try await request.perform(on: cgImg)
            let description = observations.description
            let score = observations.overallScore
            print(description)
            print(score)
        } catch {
            print(error)
        }
    }
}
I'm running it on an M2 using the simulator.
Is it a bug? What's wrong?
When in Safari, you can say something like, "Siri, text this link to mom" or "Siri, save this link to reminders" and it will do it with the currently viewed link. Shortcuts also has a "Get what's on screen" action that can be added. How do I expose the user's current context to my App Intent?
I consistently receive corrupted results from tf.signal.fft3d() when it is within a function that has a @tf.function decorator. The results are all zero (0.) for entries after a certain x index (see image). Surprisingly, the issue depends on the matrix size. For example, (1023, 1023, 287) works but (1023, 1023, 575) does not. The issue is problematic because it occurs silently and not for all matrix sizes, i.e. can easily slip through tests.
The error occurs only when tensorflow-metal is installed. The TensorFlow version is 2.16.1. My hardware is a MacBook Pro M3 Max with 40 GPU cores and 128 GB RAM, running macOS Sonoma 14.5 (23F79). A Python environment to reproduce the bug can be created as follows:
conda create --name tfmetalbug python=3.11.9
conda activate tfmetalbug
pip install tensorflow tensorflow-metal
conda install matplotlib
The following code reproduces the issue:
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
# Wrap fft3d with tf.function
@tf.function
def fft3d_wrapper_function(x):
    return tf.signal.fft3d(x)
# Generate a 3D image
img = tf.random.normal(shape=(1023, 1023, 575), stddev=1., dtype=float) # generate random 3d image
img = tf.dtypes.cast(img, tf.complex64) # convert to complex values
# Compute the 3D FFT
img_fft = fft3d_wrapper_function(img)
# Visualize the 3D FFT
plt.imshow(np.real(img_fft)[:, img_fft.shape[1]//2+10, :], cmap="gray", vmin=-0.001, vmax=0.001)
plt.savefig("fft3d_wrapper_function.png")
For me, removing the @tf.function decorator has resolved the issue.
Hey Siri is not working in the iOS 18 beta.
Hello,
I'm looking to implement an OpenAI assistant using their APIs, but I want to run it locally over a group of files.
I want to be able to train a GPT on the contents of a folder, for example.
Does anyone have any experience with this?
After playing around with their API, it seems OpenAI requires a lot of uploading on each request if I were to do it that way (but it feels like I'm missing something). It's also quite costly to use.
I was hoping to use local machine learning and models, but these are quite limited in what they can do (e.g. Lumachain).
Hello. Where can I find some examples of creating custom Genmoji in Swift and reusing them in an app?
My app has several resources that I'd like to open through App Intents, for example a series of dictionaries. In the app, however, these resources are behind a login (for security) and are purchased entitlements; a user may own, say, 4 of 7 dictionaries.
If I want to have an intent that says, "Open Dictionary: (Dict Name)" how do I best handle situations where the user may no longer be logged in or have the entitlement for that specific dictionary?
Thanks
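One pattern that may fit here, sketched under the assumption of a hypothetical AccountStore that knows the login state and purchased dictionaries: validate everything at the top of perform() and throw a descriptive error, since App Intents surfaces the thrown error's localized message to the user in Siri and Shortcuts.

import AppIntents

// Hypothetical error type; when perform() throws, the localized message
// is what Siri/Shortcuts shows the user.
enum DictionaryIntentError: Error, CustomLocalizedStringResourceConvertible {
    case notLoggedIn
    case notPurchased

    var localizedStringResource: LocalizedStringResource {
        switch self {
        case .notLoggedIn: return "Please log in before opening a dictionary."
        case .notPurchased: return "You haven't purchased this dictionary."
        }
    }
}

struct OpenDictionaryIntent: AppIntent {
    static let title: LocalizedStringResource = "Open Dictionary"
    static var openAppWhenRun: Bool = true

    @Parameter(title: "Dictionary Name")
    var dictionaryName: String

    @MainActor
    func perform() async throws -> some IntentResult {
        // Hypothetical checks against whatever store tracks login state
        // and purchased entitlements in the app.
        guard AccountStore.shared.isLoggedIn else {
            throw DictionaryIntentError.notLoggedIn
        }
        guard AccountStore.shared.ownsDictionary(named: dictionaryName) else {
            throw DictionaryIntentError.notPurchased
        }
        // Navigate to the requested dictionary here.
        return .result()
    }
}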
The Translation API introduced at Session 10117 is impressive, but limiting it to SwiftUI is restrictive.
This API works great in the demo, but for more complex apps, it lacks flexibility because it is bound to SwiftUI Views.
Please consider making it available in non-SwiftUI environments.
After watching the What's new in App Intents session, I'm attempting to create an intent conforming to URLRepresentableIntent. The video states that as long as my AppEntity conforms to URLRepresentableEntity, I should not have to provide a perform method; my application will be launched automatically and passed the appropriate URL.
This seems to work in that my application is launched and is passed a URL, but the URL is in the form: FeatureEntity/{id}.
Am I missing something, or is there a trick that enables it to pass along the URL specified in the AppEntity itself?
struct MyExampleIntent: OpenIntent, URLRepresentableIntent {
    static let title: LocalizedStringResource = "Open Feature"

    static var parameterSummary: some ParameterSummary {
        Summary("Open \(\.$target)")
    }

    @Parameter(title: "My feature", description: "The feature to open.")
    var target: FeatureEntity
}

struct FeatureEntity: AppEntity {
    // ...
}

extension FeatureEntity: URLRepresentableEntity {
    static var urlRepresentation: URLRepresentation {
        "https://myurl.com/\(.id)"
    }
}
Are there going to be any sessions on Image Playgrounds API for iOS?
"Explore machine learning on Apple platforms" mentions the writing and points to sessions, but only mentions Image Playground without pointing to sessions.
iOS 18 adds a specific macro for exposing your search app intent, app entities, etc., to Siri, but how are you meant to adopt it on your existing types without removing them entirely for users below iOS 18?
For example, i get the following error:
AssistantIntent(schema:) is only available in iOS 18 or newer. Add @available attribute to enclosing struct.
I don't want to do that, since I still want to support iOS 17 users with my existing shortcuts. Do I need to duplicate my entire shortcuts model to add the new macro?
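One reading of the fix-it that keeps iOS 17 support: leave the existing intent exactly as it is, and put everything that needs iOS 18 (the assistant schema macro included) on a separate type gated with @available(iOS 18, *); entities and queries can still be shared between the two. A rough sketch with placeholder names, and with the macro itself left in a comment so the snippet compiles on its own; whether this avoids enough duplication for your model is a judgment call.

import AppIntents

// Unchanged: iOS 17 users keep this intent and its shortcuts.
struct SearchItemsIntent: AppIntent {
    static let title: LocalizedStringResource = "Search Items"

    @Parameter(title: "Search Text")
    var searchText: String

    func perform() async throws -> some IntentResult {
        // existing search behavior
        return .result()
    }
}

// iOS 18-only adoption is isolated here; this is where @AssistantIntent(schema:)
// would be applied, so the attribute never touches the iOS 17 code path.
@available(iOS 18, *)
struct AssistantSearchItemsIntent: AppIntent {
    static let title: LocalizedStringResource = "Search Items"

    @Parameter(title: "Search Text")
    var searchText: String

    func perform() async throws -> some IntentResult {
        // delegate to the same search code the iOS 17 intent uses
        return .result()
    }
}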
I am trying to make a VoIP CarPlay app using Siri.
let assistant = CPAssistantCellConfiguration(position: .top, visibility: .always, assistantAction: .startCall)
let siriTmeplate = CPListTemplate(title: "Siri", sections: [sectionItems, loadingSection], assistantCellConfiguration: assistant)
siriTmeplate.tabSystemItem = .recents
siriTmeplate.showsTabBadge = false
Using the above code gives me the error
"Error: Intent of type INStartCallIntent is not supported for this app category"
on app launch.
I have INStartCallIntent in my app's Info.plist, I have all the entitlements, and I have "business" as the app category.
I can find zero help online with this. What does this error really mean, and how can I fix it?
I am developing an iOS app that supports INPlayMediaIntent.
We are trying to increase the recognition rate of content names, which are song titles, using AppIntentVocabulary.
As a sample, some extracts are shown below.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>IntentPhrases</key>
<array>
<dict>
<key>IntentName</key>
<string>INPlayMediaIntent</string>
<key>IntentExamples</key>
<array>
<string>Mezamashi Appで湖畔の朝を再生</string>
<string>湖畔の朝をMezamashi Appで再生して</string>
</array>
</dict>
</array>
<key>ParameterVocabularies</key>
<array>
<dict>
<key>ParameterNames</key>
<array>
<string>INPlayMediaIntent.playlistTitle</string>
</array>
<key>ParameterVocabulary</key>
<array>
<dict>
<key>VocabularyItemIdentifier</key>
<string>ID1</string>
<key>VocabularyItemSynonyms</key>
<array>
<dict>
<key>VocabularyItemPronunciation</key>
<string>aogamagaeru</string>
<key>VocabularyItemPhrase</key>
<string>青ガマガエル</string>
</dict>
</array>
</dict>
<dict>
<key>VocabularyItemIdentifier</key>
<string>ID2</string>
<key>VocabularyItemSynonyms</key>
<array>
<dict>
<key>VocabularyItemPronunciation</key>
<string>kohon no asa</string>
<key>VocabularyItemPhrase</key>
<string>湖畔の朝</string>
</dict>
</array>
</dict>
<dict>
<key>VocabularyItemIdentifier</key>
<string>ID3</string>
<key>VocabularyItemSynonyms</key>
<array>
<dict>
<key>VocabularyItemPronunciation</key>
<string>kumageratachi no uta</string>
<key>VocabularyItemPhrase</key>
<string>クマゲラたちの歌</string>
</dict>
</array>
</dict>
</array>
</dict>
</array>
</dict>
</plist>
When running on the iOS 17.5 simulator in Xcode 15.4, the results are as follows.
mediaName = VocabularyItemIdentifier
mediaIdentifier = nil
<INMediaSearch: 0x6000026212c0> {
    reference = 0;
    mediaType = 0;
    sortOrder = 0;
    albumName = <null>;
    mediaName = ID1;
    genreNames = (
    );
    artistName = <null>;
    moodNames = (
    );
    releaseDate = <null>;
    mediaIdentifier = <null>;
}
However, when running on an iOS 17.5 device, the following applies.
mediaName = VocabularyItemPhrase
mediaIdentifier = VocabularyItemIdentifier
<INMediaSearch: 0x301efd9e0> {
    reference = 0;
    mediaType = 5;
    sortOrder = 0;
    albumName = <null>;
    mediaName = 青ガマガエル;
    genreNames = (
    );
    artistName = <null>;
    moodNames = (
    );
    releaseDate = <null>;
    mediaIdentifier = ID1;
}
The results are not stable; for example, sometimes everything else returns null.
I have tried everything, but it is taking a lot of time.
Does anyone have any advice on this?
From https://www.apple.com/newsroom/2024/06/introducing-apple-intelligence-for-iphone-ipad-and-mac/:
Powered by Apple Intelligence, Siri becomes more deeply integrated into the system experience. With richer language-understanding capabilities, Siri is more natural, more contextually relevant, and more personal, with the ability to simplify and accelerate everyday tasks.
From https://developer.apple.com/apple-intelligence/:
Siri is more natural, more personal, and more deeply integrated into the system. Apple Intelligence provides Siri with enhanced action capabilities, and developers can take advantage of pre-defined and pre-trained App Intents across a range of domains to not only give Siri the ability to take actions in your app, but to make your app’s actions more discoverable in places like Spotlight, the Shortcuts app, Control Center, and more. SiriKit adopters will benefit from Siri’s enhanced conversational capabilities with no additional work. And with App Entities, Siri can understand content from your app and provide users with information from your app from anywhere in the system.
Based on this, as well as the video at https://developer.apple.com/videos/play/wwdc2024/10133/ , my understanding is that in order for Siri to be able to execute tasks in applications, those applications must implement the Siri Intents API.
Can someone at Apple please clarify: will it be possible for Siri or some other aspect of Apple Intelligence / Core ML / Create ML to take actions in applications which do not support these APIs (e.g. web apps, Citrix apps, legacy apps)?
Thank you!
I have a Shortcuts action via an App Intent that I want only active subscribers to be able to use.
I have a shared class that handles all the subscription-related things. But for some reason my code only works if the app is still active in the background. Once the app is quit and the user runs the shortcut, the not-subscribed error is thrown, even though the user is subscribed.
How can I ensure that my subscription check is done correctly if the app isn't open in the background?
My Code
App Intent excerpt:
@MainActor
func perform() async throws -> some IntentResult & ReturnsValue<MeterIntentEntity> {
    // Validate that the user is subscribed.
    // Cancels action with error message if not subscribed.
    if SubscriptionManager.shared.userIsSubscribed == false {
        throw IntentError.notSubscribed
    }

    // More Code …

    // Finish and pass created value as result.
    return .result(value: something)
}
Subscription Manager excerpt:
class SubscriptionManager: ObservableObject {
    // A singleton for our entire app to use
    static let shared = SubscriptionManager()

    let productIds = ["my_sub1", "my_sub2"]

    @Published private(set) var availableSubscriptions: [Product]
    @Published private(set) var purchasedSubscriptions: [Product] = []

    public var userIsSubscribed: Bool {
        return !self.purchasedSubscriptions.isEmpty
    }

    init() {
        // Initialize empty products, and then do a product request asynchronously to fill them in.
        availableSubscriptions = []
        Task {
            await updatePurchasedProducts()
        }
    }

    @MainActor
    func updatePurchasedProducts() async {
        for await result in Transaction.currentEntitlements {
            do {
                let transaction = try checkVerified(result)
                if let subscription = availableSubscriptions.first(where: { $0.id == transaction.productID }) {
                    purchasedSubscriptions.append(subscription)
                }
            } catch {
                Logger.subscription.error("Error loading the user's purchased products.")
            }
        }
    }
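One approach worth trying, under the assumption that the cold-launch failure comes from the intent reading userIsSubscribed before the async updatePurchasedProducts() has finished: have perform() await an entitlement check directly instead of relying on the cached array. A sketch using the same StoreKit 2 API the manager already uses:

import StoreKit

extension SubscriptionManager {
    // Queries StoreKit directly, so the result does not depend on
    // purchasedSubscriptions having been populated yet.
    func currentlySubscribed() async -> Bool {
        for await result in Transaction.currentEntitlements {
            if case .verified(let transaction) = result,
               productIds.contains(transaction.productID) {
                return true
            }
        }
        return false
    }
}

// In the App Intent, await the check instead of reading the cached flag:
// guard await SubscriptionManager.shared.currentlySubscribed() else {
//     throw IntentError.notSubscribed
// }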