Posts

Post not yet marked as solved
8 Replies
I'm getting a steadily increasing number of user complaints about restore of in-app purchases failing. (I have two apps, and the complaints are about both.) I've done a deep dive into the affected users' in-app digital receipt bundles, and they appear to be missing past purchases. Hasn't this problem been fixed, or is this likely to be a new one?
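For context, the restore in both apps is triggered with the standard StoreKit call, roughly along these lines (a simplified sketch, not the exact code; observer registration and the UI handling are omitted):

#import <StoreKit/StoreKit.h>

// Sketch of the standard StoreKit 1 restore flow. The class containing these
// methods is assumed to be registered as an SKPaymentTransactionObserver at launch.
- (void)restorePurchases {
    [[SKPaymentQueue defaultQueue] restoreCompletedTransactions];
}

- (void)paymentQueue:(SKPaymentQueue *)queue
 updatedTransactions:(NSArray<SKPaymentTransaction *> *)transactions {
    for (SKPaymentTransaction *transaction in transactions) {
        if (transaction.transactionState == SKPaymentTransactionStateRestored) {
            // Re-unlock the restored product, then finish the transaction.
            [[SKPaymentQueue defaultQueue] finishTransaction:transaction];
        }
    }
}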
Post marked as solved
2 Replies
Thanks! I just needed someone to confirm that my assumptions about the options were correct and that there wasn't anything, new or old, that I had missed.
Post not yet marked as solved
2 Replies
It really boils down to securely passing the private key for decryption to a device, where it is then stored in the keychain or similar. Apple appears to have a solution for this, but it is limited to their own CoreML models. I'm tempted to (re)ask this question framed around how to get a private key into an app's keychain securely using iCloud or similar. A good summary of CoreML's new encryption solution is given in the blog on the website machinethink dot net. For some unexplained reason I couldn't post a link to it, or to any site, here in my reply. If you want a quick overview, go read their latest blog entry, titled "Apple machine learning in 2020: What’s new?"
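To be clear, the keychain half isn't the hard part. Once the key has somehow reached the device, storing it would look something like this (a minimal sketch; the service and account names are purely illustrative, and the secure delivery of the key is the part I still don't have an answer for):

#import <Foundation/Foundation.h>
#import <Security/Security.h>

// Store the decryption key bytes in the keychain, restricted to this device
// and only available while the device is unlocked.
OSStatus StoreModelDecryptionKey(NSData *keyData) {
    NSDictionary *attributes = @{
        (__bridge id)kSecClass: (__bridge id)kSecClassGenericPassword,
        (__bridge id)kSecAttrService: @"com.example.model-key",     // hypothetical identifier
        (__bridge id)kSecAttrAccount: @"model-decryption-key",      // hypothetical identifier
        (__bridge id)kSecValueData: keyData,
        (__bridge id)kSecAttrAccessible: (__bridge id)kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
    };
    return SecItemAdd((__bridge CFDictionaryRef)attributes, NULL);
}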
Post marked as solved
11 Replies
I thought I'd update this thread to let you know that we succeeded in running that large JS model on a 1 GB device (iPad Mini 2) by splitting the data tables out into individual JSON files, which we load into the JSContext one at a time. The app is written in ObjC, and this didn't work initially until we wrapped the code that loads each JSON file into iOS data and then into JS in @autoreleasepool {...}:

for (NSString *file in dataFiles) {
    if ([[[file pathExtension] lowercaseString] isEqualToString:@"json"] == NO) {
        continue;
    }
    NSString *variableName = [file stringByDeletingPathExtension];
    @autoreleasepool {
        self.jsContext[variableName] = [FileUtils loadJSONFile:[dataFilesPath stringByAppendingPathComponent:file]];
    }
}

The JS code itself benefited from using var instead of const, which made a significant difference to memory usage. Load time is slow on an iPad Mini 2 at around 76 seconds, but if the context is preserved, execution time is only a few seconds on each analysis run. The iPad Pro loads in 14 seconds, which isn't far from what we got when loading the single-file JS model. So we appear to have a very acceptable solution.
Post marked as solved
11 Replies
The data scientist has confirmed that the data in the JS is just plain ASCII text with very long lines. From the smaller, readable models I've been able to look at, I can add that they appear to be dictionaries of real numbers, e.g.

var coefs_ = {"1":-3.601963153,"2":-0.817495724,"3":-0.0812173088, ......
Post marked as solved
11 Replies
To the "Frameworks Engineer", sorry I missed your first reply. For some reason I only saw Eskimo's replies to me. I think your initial hunch is probably correct so I'm going to ask the data scientist to build the JS code without data and pass it as a parameter to be parsed by JSON.parse at run time. I'm assuming that's what you meant? I'll need to seek permission to share any data as each AI model is proprietary IP of the company's clients. The App is a B2B enterprise app and not a consumer app. One of the other things we are looking into is how to secure these models within iOS as they do represent significant IP but that's a separate issue. I'll find out tomorrow from the data scientist if there's anything non-ASCII in the string. Thanks!
Post marked as solved
11 Replies
I've done some more testing and it looks like I was wrong about the crash point. It isn't during execution that it crashes but at line 15 below, at evaluateScript, when the code is loaded.

self.jsContext = [[JSContext alloc] initWithVirtualMachine:[JSVirtualMachine new]];
self.jsContext[@"console"][@"log"] = ^(NSString *message){
    NSLog(@"JS Console: %@", message);
};
self.jsContext.exceptionHandler = ^(JSContext *context, JSValue *exception) {
    NSLog(@"JS Error: %@", exception);
};
NSString *model = [[NSString alloc] initWithContentsOfFile:pathToModel
                                                  encoding:NSUTF8StringEncoding
                                                     error:&error];
if (error) {
    NSLog(@"setUpJSContextWithModel: model load failed: %@", error);
    return;
}
JSValue *result = [self.jsContext evaluateScript:model withSourceURL:[NSURL fileURLWithPath:pathToModel]];

Sorry for misleading in my initial post. Also, Xcode's debug memory profiler seems to suggest that the upper limit is about 60% of the device memory. For example, an iPhone X with 3 GB of memory crashes after the memory profile reaches 59.4% of memory.

The data scientist who is building these models sent this to me today:

"I profiled the memory usage when running the models on my Mac, in JavaScriptCore and in Node.js. The full 357 MB model used 2.42 GB in JavaScriptCore vs. 799 MB in Node.js. I think some of that excess usage might come from the way the model parameters are being defined in the JS file (we start with them in a separate JSON and use a tool to automatically pack everything into one file) - basically, we might start with them in string form and then also have to JSON.parse that string, almost doubling the footprint. There's still an overhead from ~700 MB (roughly double the 357 MB source) to 799 MB unaccounted for, but I don't think this huge memory usage is (solely) down to poor coding by me!"

Is there anything we can do to improve memory efficiency in JavaScriptCore? Thanks!
Post marked as solved
11 Replies
TestFlight reports one user as having seen the crashes and includes their feedback, but no stack trace. I also used Xcode's Organizer to check for crashes that could be downloaded, but nothing was found.
Post marked as solved
11 Replies
Yes, sorry, I should also have mentioned that I don't know how it crashes, as the crash is so spectacular that Xcode reports nothing. I set breakpoints to make sure it was the execution it crashed in, because there is no stack trace reported. In the function below the crash occurs at line 12, so I know the VM loads the JS but then dies on execution.

Running on my iPhone X, I selected the debug navigator at the top left of Xcode and watched memory usage shoot up to 1.74 GB before it ***** out. That was loading the 357 MB JS file. The device is cleared from Xcode and is no longer selectable until I recompile and reload. (Xcode 11.5, iOS 12 on the iPad Mini and iOS 13.5 on all other devices.)

We've run additional tests using models of varying sizes on all the platforms that we have, so we can figure out the limits. Devices with 2 GB of memory can run JS files up to 150 MB. It may be higher, but that was as high as we went, other than the 357 MB file that only my iPad Pro will run. For now we plan to split the processing over multiple files, passing the result from each execution between them.

- (NSDictionary *)processBatchFile:(NSString *)filePath {
    NSError *error;
    NSString *jsonData = [NSString stringWithContentsOfFile:filePath
                                                   encoding:NSUTF8StringEncoding
                                                      error:&error];
    if (error) {
        NSLog(@"processBatchFile: batch load failed. %@", error);
        return nil;
    }
    NSLog(@"processBatchFile: %@", [filePath lastPathComponent]);
    JSValue *jsProcessor = self.jsContext[@"processBatch"];
    JSValue *result = [jsProcessor callWithArguments:@[jsonData, self.configs]];
    NSLog(@"processBatchFile: result = %@", [result toDictionary]);
    return [result toDictionary];
}
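For what it's worth, the splitting plan mentioned above is only roughly sketched out so far. Something along these lines is what we have in mind, where batchFilePaths and the way a result feeds into the next run via self.configs are still hypothetical:

// Rough sketch of chaining batches: run them one at a time and
// carry each run's output forward into the next run.
NSDictionary *lastResult = nil;
for (NSString *batchPath in batchFilePaths) {    // batchFilePaths is illustrative
    if (lastResult) {
        self.configs = lastResult;               // exact hand-off still to be decided
    }
    lastResult = [self processBatchFile:batchPath];
}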
Post not yet marked as solved
3 Replies
Testing on my iPhone X with an S4 Apple Watch (iOS 13.2.1 and watchOS 6.1), it appears to work fine. So this must be a breakage in the simulator (11.2.1) and not in the live hardware versions.