After reading a bunch of related posts, it looks like the issue is tied to compute units and flexible shapes, which the ANE doesn't support. I bet disabling the MLE5Engine flag is coincidentally the equivalent of
config.computeUnits = .cpuAndGPU
I bet the arch running on the Neural Engine is producing the nonsensical outputs, whereas constrained to CPU/GPU it returns normal behavior. Going to test and see if I can confirm.
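In context, that would look something like this when loading the model (a minimal sketch; the model path is a placeholder for whatever compiled model you're loading):

import CoreML

// Force Core ML off the Neural Engine to test the theory.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndGPU

// Placeholder path to a compiled .mlmodelc
let modelURL = URL(fileURLWithPath: "/path/to/MyModel.mlmodelc")
let model = try MLModel(contentsOf: modelURL, configuration: config)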
Directly related: https://developer.apple.com/forums/thread/771911
https://github.com/ultralytics/ultralytics/issues/5165
Somehow related, as it's really the only other mention of this flag I can find: https://developer.apple.com/forums/thread/735698
https://developer.apple.com/search/?q=MLE5Engine
Dec 2024 update:
Same, randomly. No clue how to reproduce.
I don't understand how it's being added during TorchScript conversion. I can add the new softmax with a validating lossLayer, but I can't figure out how to swap/pop the original SoftmaxND. I thought del builder.layer[-1] would be enough, but it's a NOOP. So now I'm stuck with protobuf validation errors complaining that multiple layers are consuming the same outputs.
Same, randomly. No clue how to reproduce.
This is happening everywhere.
This is the only useful information I could find; no clue how they came up with that temp workaround, though. Seems like they might have used the red phone.
https://github.com/googleads/swift-package-manager-google-mobile-ads/issues/62#issuecomment-1981219533
Re-tooling commits that branch on platform for path differences: https://github.com/firebase/firebase-ios-sdk/pull/12517/files
https://github.com/facebook/facebook-ios-sdk/issues/2353
https://github.com/google/GoogleAppMeasurement/issues/62
etc, etc.
I target iOS 17; there is no runtime that can produce 5.5" screenshots, besides literally faking it, which seems very anti-spirit and nonsensical. Apple, you have sent me 3+ emails on preparing my app for visionOS, including how to properly "frame" screenshots. I'll get on that as soon as I'm done uploading iPhone 8 Plus assets.
xoxoxo
Somehow got it to export by composing the transformer explicitly, which gave a ComposedTransformer<ImageFeaturePrint, FullyConnectedNetworkMultiLabelClassifierModel>, so I'm guessing exporting the chaining used in the documentation was never attempted, since its return type isn't representable.
let annotatedFeatures = detectionFiles.map {
    AnnotatedFeature(
        feature: directoryURL.appending(component: $0.filename),
        annotation: $0.labels
    )
}

// Keep the ImageReader out of the composed task so it never ends up
// in the exported transformer.
let reader = ImageReader()
let (training, validation) = annotatedFeatures.randomSplit(by: 0.8)
let featurePrint = ImageFeaturePrint(revision: 2)
let classifier = FullyConnectedNetworkMultiLabelClassifier<Float, String>(labels: labels)
let task = featurePrint.appending(classifier)

Task {
    // Read the images up front, then fit the feature-print + classifier task.
    let trainingImages = try await reader.applied(to: training)
    let validationImages = try await reader.applied(to: validation)
    let model = try await task.fitted(
        to: trainingImages,
        validateOn: validationImages,
        eventHandler: { event in
            debugPrint(event)
        }
    )
    try! model.export(to: modelFile)
}
I'm getting a failure on export using the exact transformer from this guide: https://developer.apple.com/documentation/createml/creating-a-multi-label-image-classifier
The fitted result is a ComposedTransformer<ComposedTransformer<ImageReader, ImageFeaturePrint>, FullyConnectedNetworkMultiLabelClassifierModel>, and export fails with "The transformer is not representable as a CoreML model (ImageReader)."
I'm guessing this is just a misleading error?
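For contrast, this is roughly the shape of the pipeline the guide builds (reconstructed from memory, not verbatim): the ImageReader is chained into the task itself, so it ends up inside the fitted transformer type and the export trips over it.

// Sketch of the guide-style composition; labels, training, validation and
// modelFile are the same context as above.
let chainedTask = ImageReader()
    .appending(ImageFeaturePrint(revision: 2))
    .appending(FullyConnectedNetworkMultiLabelClassifier<Float, String>(labels: labels))

Task {
    // Fitting works, but export fails because ImageReader isn't representable in Core ML.
    let chainedModel = try await chainedTask.fitted(to: training, validateOn: validation)
    try chainedModel.export(to: modelFile)
}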
In this intro with no runnable examples (https://developer.apple.com/videos/play/wwdc2022/10019/), I also noticed this line:
try estimator.write(transformer, to: parametersURL)
which writes a .pkg file that I'm guessing is for instantiating the estimator later.
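If that guess is right, the round trip presumably looks something like this (a sketch; the parametersURL location and the read(from:) counterpart are my assumption about the API, not something shown in the session):

// Persist the fitted parameters, then rebuild the transformer later from
// the same estimator definition. parametersURL is a hypothetical location.
let parametersURL = URL(fileURLWithPath: "/tmp/parameters.pkg")
try estimator.write(transformer, to: parametersURL)

// Later, with the same estimator definition:
let restored = try estimator.read(from: parametersURL)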
Where are this extension's entitlements documented? https://developer.apple.com/documentation/bundleresources/entitlements
So the official toolchain showcased at https://developer.apple.com/augmented-reality/tools/ and all over RealityKit requires prebuilt x86_64 binaries and Python 3.7.9, even further back than the shipped system /usr/bin/python3?
Apple should donate some hardware to whoever is maintaining the dependency that is preventing dogfooding on flagship hardware.
Same. I've tried every possible x265 config combination I could think of.
Any internal employee who acts as a somewhat-advocate for developers, please pass this feedback along: make sure developers are able to build, submit, and develop against software being released to the public before I get an OTA delivery to my own device.
That should include confirmation on the latest toolchain software, including the Beta release train as well. It is a reasonable request to not have to perform this traditional rain dance. Before I get lectured about 11E801a, I'll leave you with some inspirational quotes.
"You can't build for the future if you can't compile against the past." - Plato
"Some people want to go to Mars. I just want to unbrick my phone without sudo xcode-select -s when my phone auto-installs a publicly released COVID-19 hot patch." - Neil Armstrong
Don't forget the most important part!
Open a port to your Mac mini
Self-sign a cert
Add NSAppTransportSecurity exceptions to your iOS app (sketch below)
DON'T RESTART YOUR DSL ROUTER
Submit mobile app
The world is your oyster
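For the ATS step, the Info.plist exception looks roughly like this (the host is a placeholder for wherever your Mac mini is reachable; adjust for your own setup):

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <!-- Placeholder host for the Mac mini -->
        <key>my-mac-mini.example.com</key>
        <dict>
            <key>NSIncludesSubdomains</key>
            <true/>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>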