Explore the power of machine learning within apps. Discuss integrating machine learning features, share best practices, and explore the possibilities for your app.

tensorflow-metal problems (tf.random.normal) and disappointments
"Last year, I upgraded to an M2 Max laptop, expecting that tensorflow-metal would facilitate effective local prototyping utilizing the Apple Silicon's capabilities. It has been quite some time since tensorflow-metal was last updated, and there appear to be several unresolved issues noted by the community here. I've personally observed the following behavior with my setup: Without tensorflow-metal: import tensorflow as tf for _ in range(10): print(tf.random.normal((3,)).numpy()) [-1.4213976 0.08230731 -1.1260201 ] [ 1.2913705 -0.47693467 -1.2886043 ] [ 0.09144169 -1.0892165 0.9313669 ] [ 1.1081179 0.9865657 -1.0298151] [ 0.03328908 -0.00655857 -0.02662632] [-1.002391 -1.1873596 -1.1168724] [-1.2135247 -1.2823236 -1.0396363] [-0.03492929 -0.9228362 0.19147137] [-0.59353966 0.502279 0.80000925] [-0.82247525 -0.13076428 0.99579334] With tensorflow-metal: import tensorflow as tf for _ in range(10): print(tf.random.normal((3,)).numpy()) [ 1.0031303 0.8095635 -0.0610961] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] [-1.3544159 0.7045493 0.03666191] Given these observations, it seems there may be an issue with the randomness of tf.random.normal when using tensorflow-metal. My current setup includes MacOS 14.5, tensorflow 2.14.1, and tensorflow-macos 2.14.1. I am interested in understanding if there are known solutions or workarounds for this behavior. Furthermore, could anyone provide an update on whether tensorflow-metal is still being actively developed, or if alternative approaches are recommended for utilizing the GPU capabilities of this hardware?
Replies: 1 · Boosts: 0 · Views: 683 · Jul ’24
Div calculation issue in Metal
Hi, all. I've been writing various computational functions in Metal. However, unlike + and *, the / operation in the function below has an accuracy issue. The function divides a matrix of shape [n, x, y] by a scalar [1]. Compared to numpy or torch, I get exactly the same results if I change the operator to * or +, but with / the mean differs by more than 1e-5. (For reference, this was written with reference to the Metal kernel code in llama.cpp.)

    kernel void kernel_div_single_f16(
            device const half    * src0,
            device const half    * src1,
            device       half    * dst,
            constant     int64_t & ne00,
            constant     int64_t & ne01,
            constant     int64_t & ne02,
            constant     int64_t & ne03,
            uint3 tgpig[[threadgroup_position_in_grid]],
            uint3 tpitg[[thread_position_in_threadgroup]],
            uint3 ntg[[threads_per_threadgroup]]) {
        const int64_t i03 = tgpig.z;
        const int64_t i02 = tgpig.y;
        const int64_t i01 = tgpig.x;
        const uint offset = i03*ne02*ne01*ne00 + i02*ne01*ne00 + i01*ne00;
        for (int i0 = tpitg.x; i0 < ne00; i0 += ntg.x) {
            dst[offset + i0] = src0[offset + i0] / *src1;
        }
    }

My machine is a MacBook Pro (16-inch, 2021) / macOS 12.5 / Apple M1 Pro. Are there any known issues related to division? Thanks in advance for your reply.
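One possible explanation, offered as an assumption rather than a diagnosis: half-precision division loses more accuracy than half multiplication or addition, and with fast math enabled (the default for Metal libraries) the compiler may also replace the division with a reciprocal multiply. A sketch that carries out the arithmetic in float and rounds back to half could help isolate that; disabling fast math via MTLCompileOptions when building the library is another variable worth testing.

    // Hypothetical variant of the loop above: divide in float, store as half.
    for (int i0 = tpitg.x; i0 < ne00; i0 += ntg.x) {
        const float a = (float) src0[offset + i0];
        const float b = (float) (*src1);
        dst[offset + i0] = (half)(a / b);
    }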
Replies: 1 · Boosts: 0 · Views: 429 · Jul ’24
Training a Segmentation model
Hi there, I'm a Computer Science student with a MacBook Pro 2019, and I'm thinking of buying a new Mac, either a Mac Studio or a MacBook Pro, to use for ML. I'm currently working on a segmentation model and wondering whether I could use Core ML or the Apple Neural Engine in the new M3 chips to train it. Right now I'm using Colab and TensorFlow to create the model, but it isn't doing the job: I'm running out of CUDA memory. Thanks :)
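If local training on Apple silicon is the goal, one commonly suggested route, sketched here as an assumption rather than a hardware recommendation, is TensorFlow with the tensorflow-metal plugin, which runs training on the GPU (the Neural Engine is not exposed for training third-party models). A quick check that the Metal GPU is visible:

    # Assumes: pip install tensorflow-macos tensorflow-metal
    import tensorflow as tf

    print(tf.config.list_physical_devices("GPU"))  # should list one Apple GPU device

    # Tiny sanity check that ops actually run on the GPU
    with tf.device("/GPU:0"):
        x = tf.random.uniform((2, 128, 128, 3))
        y = tf.keras.layers.Conv2D(8, 3, padding="same")(x)
        print(y.shape, y.device)

Because the GPU shares the machine's unified memory, the available memory scales with how much RAM the Mac has, which is relevant to the CUDA out-of-memory problem described above.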
Replies: 1 · Boosts: 0 · Views: 432 · Jul ’24
Chaining app intents in code
I would like to split my intents into smaller intents with more atomic pieces of functionality, so that one intent can call another. For example:

    struct SumValuesIntent: AppIntent {
        static var title: LocalizedStringResource { "Sum Values" }

        let a: Int
        let b: Int

        init(a: Int, b: Int) {
            self.a = a
            self.b = b
        }

        init() {
            self.init(a: 0, b: 0)
        }

        func perform() async throws -> some IntentResult {
            let sum = a + b
            print("SumValuesIntent:", sum)
            return .result(value: sum)
        }
    }

    struct PrintValueIntent: AppIntent {
        static var title: LocalizedStringResource { "Print Value" }

        let string: String

        init(string: String) {
            self.string = string
        }

        init() {
            self.init(string: "")
        }

        func perform() async throws -> some IntentResult {
            print("PrintValueIntent:", string)
            return .result()
        }
    }

What is the best way to chain intents like these? I tried returning .result(opensIntent: PrintValueIntent(string: String(describing: sum))) from SumValuesIntent.perform, but that doesn't seem to work. Then I tried returning try await PrintValueIntent(string: String(describing: sum)).perform(), and that works, but I'm not sure it's the correct way to do it.
Replies: 0 · Boosts: 1 · Views: 535 · Jul ’24
EntityPropertyQuery with property from related entity
Hi, I am working on creating an EntityPropertyQuery for my app entity. I want the user to be able to use Shortcuts to search by a property of a related entity, but I'm struggling with the syntax for that. I know the documentation for EntityPropertyQuery suggests this should be possible with the QueryProperty initializer that takes an entityProvider, but I can't figure out how it works. For example, my CJPersonAppEntity has 'emails', which is of type CJEmailAppEntity, which has a property 'emailAddress'. I want the user to be able to find the person by looking up an email address. When I try to provide this as a Property to filter by inside CJPersonAppEntityQuery, I get a syntax error:

    static var properties = QueryProperties {
        Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
            person.emails // error
        }) {
            EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
            ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
        }
    }

The error says "Cannot convert value of type '[CJPersonEmailAppEntity]' to closure result type 'CJPersonEmailAppEntity'", so it's expecting an individual email item, not an array. But how do I provide that without running the predicate query specified in the closure? I then tried something like this, just returning something without worrying about correctness:

    Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
        person.emails.first ?? CJPersonEmailAppEntity() // satisfy compiler
    }) {
        EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
        ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
    }

That built the app, but it failed at the "Extracting app intents metadata" step with:

    error: Entity CJPersonAppEntity does not contain a property named emailAddress. Ensure that the property is wrapped with an @Property property wrapper

So I'm not sure what the correct syntax for this case is, and I can't find any other examples of how it's done. Would love some feedback on this.
Replies: 3 · Boosts: 0 · Views: 682 · Jul ’24
Muting Siri via AppIntents
We're using App Intents to launch and control our app via Siri. Siri's responses have been fairly random: some come with a "Done" popup, others with a verbal confirmation, and others say "I'm sorry, there's been a problem." The last one is bogus and doesn't look good to potential investors when the app is actually working fine. I haven't found any way in code to tell Siri to STFU and let us handle our own errors. Failing that, is there a way to supply Siri with a dictionary of stored messages that could be triggered from inside the app?
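One lever that might help, sketched under the assumption that returning dialog is acceptable here: have perform() supply its own IntentDialog, and throw errors whose user-visible message the app controls via CustomLocalizedStringResourceConvertible. This puts the wording in the app's hands, though it is an illustration, not a confirmed way to suppress every Siri response; the intent and helper names below are invented.

    import AppIntents

    struct StartWorkoutIntent: AppIntent {          // hypothetical intent
        static var title: LocalizedStringResource = "Start Workout"

        func perform() async throws -> some IntentResult & ProvidesDialog {
            do {
                try await startWorkout()            // hypothetical app logic
                // Siri speaks/shows this instead of a generic confirmation.
                return .result(dialog: "Workout started.")
            } catch {
                // A custom error keeps the failure message under our control.
                throw AppIntentError.message("Couldn't start the workout.")
            }
        }

        private func startWorkout() async throws { /* app logic goes here */ }
    }

    enum AppIntentError: Error, CustomLocalizedStringResourceConvertible {
        case message(String)

        var localizedStringResource: LocalizedStringResource {
            switch self {
            case .message(let text): return "\(text)"
            }
        }
    }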
Replies: 1 · Boosts: 0 · Views: 583 · Jul ’24
DisplayRepresentation.Image and custom SF Symbol
Hi, I have an AppEnum and I'm trying to use a custom SF Symbol as its DisplayRepresentation.Image, but it's not working: I get a blank image when the AppEnum picker appears. The custom SF Symbol is stored in the target's asset catalog and works elsewhere in the target (e.g. Image(named: "custom_sfsymbol")). The issue occurs when I try to use it in a DisplayRepresentation:

    static var caseDisplayRepresentations: [Self: DisplayRepresentation] = [
        .sample: DisplayRepresentation(
            title: "sample_title",
            image: DisplayRepresentation.Image(named: "custom_sfsymbol")
        ),
    ]

Can we use a custom SF Symbol in a DisplayRepresentation.Image?
Replies: 1 · Boosts: 0 · Views: 424 · Jul ’24
SiriKit Extension still needed with AppIntents?
I have followed the SoupChef example in migrating Custom Intents from SiriKit to App Intents. However, we only support one iOS release back, so we can require iOS 17. Thus I eliminated everything that was strictly for backwards compatibility, most notably the SiriKit Extension, which required enormous amounts of code to coordinate with the real app and worked poorly anyway. I tested, for example, that an NFC tag Automation created in Shortcuts executes an AppIntent while the app is backgrounded.

I am now receiving a beta report that one of our migrated AppIntents, triggered from a HomePod, is not working; the tester says it used to work sometimes (not all the time). I'm sure most such cases used to require the SiriKit Extension in the old SiriKit world. I am terrified that I may need to rebuild that monster once again, when the new (to me) App Intents API seemed so beautiful without it and seemed to work without it. The App Intents documentation seems to indicate that SiriKit Extensions are no longer related or required.

What is the truth here? Do I need to re-implement everything twice in the SiriKit Extension like a barbarian, or can we live in the new world with AppIntents? Thank you.
Replies: 1 · Boosts: 0 · Views: 744 · Jul ’24
App Intents "Text" parameters seem broken in iOS 17.6 beta
I'm getting widespread reports from users trialling the iOS 17.6 public beta that Siri Shortcuts are failing whenever they enter any text that looks like a URL. It's getting reported to me because my app happens to have an app intent with a string parameter which can contain a URL in some circumstances. However, it's easily reproducible outside of my app: just create a two-line shortcut containing the text "This is some text". If you change "This is some text" to "https://www.apple.com", the shortcut fails, whereas in iOS 17.5 entering "https://www.apple.com" works fine. I've raised feedback on this (FB14206088), but can anyone confirm that this is indeed a bug, and not some weird new feature of Shortcuts where the contents of a variable can somehow change the type of the variable? It would be very, very bad if that were so.
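For context, a minimal, hypothetical example of the kind of intent involved (names invented for illustration): an app intent with a plain string parameter. It is shortcuts that feed URL-like text into a parameter like this that reportedly fail on the 17.6 beta.

    import AppIntents

    struct EchoTextIntent: AppIntent {
        static var title: LocalizedStringResource = "Echo Text"

        // A plain string parameter; per the report above, values that look like
        // URLs (e.g. "https://www.apple.com") trip the failure on iOS 17.6 beta.
        @Parameter(title: "Text")
        var text: String

        func perform() async throws -> some IntentResult & ReturnsValue<String> {
            return .result(value: text)
        }
    }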
Replies: 1 · Boosts: 0 · Views: 572 · Jul ’24
FactoryInstall Unable to query results, error: 5
I am developing an iPhone application. When I start testing in the simulator or on an actual device, I get the following messages, depending on the model. I don't see any problem with the actual operation of the app, but I don't know how to resolve this error.

    #FactoryInstall Unable to query results, error: 5
    Unable to list voice folder
    Unable to list voice folder
    Unable to list voice folder
    Unable to list voice folder
    Unable to list voice folder

I have tried to resolve the problem with the steps below, but none of them has had any effect on the error. Is it OK to leave this error as it is? If you know how to resolve it, please let me know.

- Clear the Xcode cache: go to the DerivedData folder and delete all of its contents.
- Clean the build folder: select Product -> Clean Build Folder from the Xcode menu.
- Restart: restart Xcode and the simulator.
- Software update: make sure you are using the latest versions of Xcode and macOS.
- Reinstallation: uninstall Xcode and reinstall it.
- Reset the simulator.
Replies: 1 · Boosts: 4 · Views: 747 · Jul ’24
openAppWhenRun makes AppIntent crash when launched from Control Center.
Adding the openAppWhenRun property to an AppIntent used by a ControlWidgetButton causes the following error when the control is tapped in Control Center:

    Unknown NSError The operation couldn’t be completed. (LNActionExecutorErrorDomain error 2018.)

Here’s the full ControlWidget and AppIntent code that causes the error:

Should controls be able to open apps after the AppIntent runs, or is this a bug?
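A purely hypothetical sketch of the setup being described, with all names invented here (this is not the code from the report above): a Control Center button whose action intent sets openAppWhenRun.

    import AppIntents
    import SwiftUI
    import WidgetKit

    // Hypothetical intent: asks the system to open the app when it runs.
    struct OpenFromControlIntent: AppIntent {
        static var title: LocalizedStringResource = "Open From Control"
        static var openAppWhenRun: Bool = true

        func perform() async throws -> some IntentResult {
            return .result()
        }
    }

    // Hypothetical control: a single button driving the intent above.
    struct OpenAppControl: ControlWidget {
        var body: some ControlWidgetConfiguration {
            StaticControlConfiguration(kind: "com.example.open-app-control") {
                ControlWidgetButton(action: OpenFromControlIntent()) {
                    Label("Open App", systemImage: "arrow.up.forward.app")
                }
            }
        }
    }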
Replies: 5 · Boosts: 2 · Views: 1.7k · Jul ’24
Can you match a new photo with existing images?
I'm looking for a way to take a picture of (or point the camera at) a piece of clothing and match that image against images the user has stored in my app. I'm storing the data in a Core Data database as a Binary Data attribute. Since the user also takes the pictures that are stored in the database, I don't think I can use pre-trained Core ML models. I would like the matching to be done on device if possible, rather than going to an external service; a service would probably just describe the item based on what the AI sees, and then I still couldn't match it against the stored images in the app. Does anyone know if this is possible with frameworks such as Vision or VisionKit?
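One on-device approach that might fit, sketched under the assumption that Vision's image feature prints are suitable for this kind of similarity matching: compute a VNFeaturePrintObservation for the new photo and for each stored image, then pick the stored image with the smallest distance. Function names and structure here are illustrative.

    import Vision
    import CoreGraphics

    // Compute a feature print (a compact image descriptor) for one image.
    func featurePrint(for image: CGImage) throws -> VNFeaturePrintObservation? {
        let request = VNGenerateImageFeaturePrintRequest()
        let handler = VNImageRequestHandler(cgImage: image, options: [:])
        try handler.perform([request])
        return request.results?.first as? VNFeaturePrintObservation
    }

    // Return the index of the stored image most similar to the query photo.
    func bestMatch(query: CGImage, stored: [CGImage]) throws -> Int? {
        guard let queryPrint = try featurePrint(for: query) else { return nil }
        var best: (index: Int, distance: Float)? = nil
        for (i, candidate) in stored.enumerated() {
            guard let candidatePrint = try featurePrint(for: candidate) else { continue }
            var distance: Float = 0
            try queryPrint.computeDistance(&distance, to: candidatePrint)
            if best == nil || distance < best!.distance {
                best = (i, distance)
            }
        }
        return best?.index  // smaller distance means more similar
    }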
Replies: 2 · Boosts: 0 · Views: 760 · Jun ’24
Code missing from WWDC session video 10210
I noticed the code snippets are missing from the WWDC24 session 10210 video, "Bring your app’s core features to users with App Intents" (https://developer.apple.com/videos/play/wwdc2024/10210/). It would be useful if those could be added. I also noticed the transcript is missing from the web version, though it is present in the Developer app, which is odd.
Replies: 1 · Boosts: 1 · Views: 571 · Jun ’24