Reply to The yolo11 object detection model I exported to coreml stopped working in macOS15.2 beta.
After reading a bunch of related posts, it looks like this might be related to compute units and flexible shapes, which the ANE doesn't support. I bet disabling the MLE5Engine flag is coincidentally equivalent to config.computeUnits = .cpuAndGPU: the path running on the Neural Engine is producing the nonsensical outputs, while constraining to CPU/GPU returns normal behavior. Going to test and see if I can confirm.
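If anyone else wants to test the same constraint, this is roughly the load path I'm trying, where modelURL is just a placeholder for wherever your exported model lives:

import CoreML

// Keep inference off the Neural Engine while the ANE path is misbehaving.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndGPU

// modelURL is a placeholder for the compiled/exported model's location.
let model = try MLModel(contentsOf: modelURL, configuration: config)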
3d
Reply to Getting ValueError: Categorical Cross Entropy loss layer input (Identity) must be a softmax layer output.
I don't understand how it's being added during the TorchScript conversion. I can add the new softmax with a validating lossLayer, but I can't figure out how to swap out or pop the original SoftmaxND. I thought del builder.layer[-1] would be enough, but it's a no-op, so now I'm stuck with protobuf validation errors saying that multiple layers are consuming the same outputs.
Nov ’24
Reply to Build Export differences between Xcode 15.2 and 15.3
This is happening everywhere. This is the only useful information I could find; no clue how they came up with that temp workaround, though. Seems like they might have used the red phone. https://github.com/googleads/swift-package-manager-google-mobile-ads/issues/62#issuecomment-1981219533

Re-tooling commits branching on platform for the path differences:
https://github.com/firebase/firebase-ios-sdk/pull/12517/files
https://github.com/facebook/facebook-ios-sdk/issues/2353
https://github.com/google/GoogleAppMeasurement/issues/62
etc, etc.
Mar ’24
Reply to [Create ML Components] The transformer is not representable as a CoreML model (ImageReader).
Somehow got it to export by composing the transformer explicitly, which gave a ComposedTransformer<ImageFeaturePrint, FullyConnectedNetworkMultiLabelClassifierModel>, so I'm guessing exporting the chaining used in the documentation was never attempted, since that return type isn't representable.

let annotatedFeatures = detectionFiles.map {
    AnnotatedFeature(
        feature: directoryURL.appending(component: $0.filename),
        annotation: $0.labels
    )
}

let reader = ImageReader()
let (training, validation) = annotatedFeatures.randomSplit(by: 0.8)
let featurePrint = ImageFeaturePrint(revision: 2)
let classifier = FullyConnectedNetworkMultiLabelClassifier<Float, String>(labels: labels)
let task = featurePrint.appending(classifier)

Task {
    let trainingImages = try await reader.applied(to: training)
    let validationImages = try await reader.applied(to: validation)
    let model = try await task.fitted(
        to: trainingImages,
        validateOn: validationImages,
        eventHandler: { event in debugPrint(event) }
    )
    try! model.export(to: modelFile)
}
Sep ’23
Reply to Create ML Multi-Label Image Classifier will not train, throws error: "Unexpected Error"
I'm getting a failure on export using the exact transformer from this guide: https://developer.apple.com/documentation/createml/creating-a-multi-label-image-classifier

ComposedTransformer<ComposedTransformer<ImageReader, ImageFeaturePrint>, FullyConnectedNetworkMultiLabelClassifierModel>

The transformer is not representable as a CoreML model (ImageReader).

I'm guessing this is just a misleading error? In this intro, which has no runnable examples, https://developer.apple.com/videos/play/wwdc2022/10019/ I also noticed this line: try estimator.write(transformer, to: parametersURL), which writes a .pkg file that I'm guessing is for instantiating the estimator.
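For context, this is roughly the composition from the guide that produces that type; labels, training, validation, and modelFile are placeholders from my own project:

import CreateMLComponents

let reader = ImageReader()
let featurePrint = ImageFeaturePrint(revision: 2)
let classifier = FullyConnectedNetworkMultiLabelClassifier<Float, String>(labels: labels)

// Chaining the reader in front bakes ImageReader into the fitted transformer:
// ComposedTransformer<ComposedTransformer<ImageReader, ImageFeaturePrint>,
//                     FullyConnectedNetworkMultiLabelClassifierModel>
let task = reader.appending(featurePrint).appending(classifier)

let model = try await task.fitted(to: training, validateOn: validation) { event in
    debugPrint(event)
}

// This is the call that fails with "not representable as a CoreML model (ImageReader)".
try model.export(to: modelFile)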
Sep ’23
Reply to USDZ Tools on Apple Silicon
So the official toolchain showcased at https://developer.apple.com/augmented-reality/tools/ and all over the RealityKit docs requires prebuilt x86_64 binaries and Python 3.7.9, which is even further behind the system-shipped /usr/bin/python3? Apple should donate some hardware to whoever maintains the dependency that is preventing dogfooding on flagship hardware.
May ’23
Reply to iOS 13.7 iPhone 7 not recognised by Xcode 11.7
Any internal employee who acts as a somewhat-advocate for developers, please pass this feedback along: please make sure developers are able to build, submit, and develop against software being released to the public before I get an OTA delivery to my own device. That should include confirmation on the latest toolchain software, including the beta release train as well. It is a reasonable request to not have to perform this traditional rain dance. Before I get lectured about 11E801a, I'll leave you with some inspirational quotes.

"You can't build for the future if you can't compile against the past." - Plato

"Some people want to go to Mars. I just want to unbrick my phone without sudo xcode-select -s when my phone auto-installs a publicly released COVID-19 hot patch." - Neil Armstrong
Sep ’20