Reply to Summation errors in average pooling of CoreML on M1 devices and Big Sur
I don't know what the answer is, but I would try running the model in CPU-only or GPU mode to see if it happens there too. Perhaps it's an issue with the ANE. In any case, these developer forums are pretty deserted. If this is an important issue for your business (and it sounds like it is), you should use one of your Tech Support Incidents or contact a developer evangelist from Apple.
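If it helps, here is a rough sketch of how to compare compute units from Python on the Mac. It assumes coremltools 5.0 or newer (which added the compute_units option; older versions only offered predict(..., useCPUOnly=True)), and the input name, shape, and data below are placeholders for your model's real inputs:

    import coremltools as ct
    import numpy as np

    # Placeholder input; use your model's real input name, shape, and data.
    input_data = np.random.rand(1, 3, 224, 224).astype(np.float32)

    model_cpu = ct.models.MLModel("YourModel.mlmodel",
                                  compute_units=ct.ComputeUnit.CPU_ONLY)
    model_gpu = ct.models.MLModel("YourModel.mlmodel",
                                  compute_units=ct.ComputeUnit.CPU_AND_GPU)

    print(model_cpu.predict({"input": input_data}))
    print(model_gpu.predict({"input": input_data}))

Inside the app itself, the equivalent switch is the computeUnits property on MLModelConfiguration.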
Jan ’21
Reply to Keras model using YCbCr input converted to ARGB model
Core ML models can only accept RGB or BGR images as inputs (the alpha channel is ignored). If your model wants a YCbCr image, you'll need to put the image into an MLMultiArray instead. You'll also need to change the input of the Core ML model to be a multi-array, not an image. Alternatively, you could add some layers at the start of the model that convert RGB back to YCbCr, but that seems like more trouble than it's worth.
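For what it's worth, here is a rough sketch of the conversion side, assuming a tf.keras model saved as an .h5 file; the file name and input shape are placeholders. Passing a TensorType instead of an ImageType makes the Core ML input a multi-array:

    import coremltools as ct
    import tensorflow as tf

    # Placeholder model; in reality this is your Keras model that expects YCbCr.
    keras_model = tf.keras.models.load_model("ycbcr_model.h5")

    # TensorType (not ImageType) turns the input into an MLMultiArray,
    # so the app can feed it YCbCr data directly. The shape is a placeholder.
    mlmodel = ct.convert(
        keras_model,
        inputs=[ct.TensorType(shape=(1, 224, 224, 3))],
    )
    mlmodel.save("YCbCrModel.mlmodel")

In the app you then fill an MLMultiArray with your YCbCr pixel data and pass that in as the model's input.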
Jan ’21
Reply to Change macOS "Availability" of CoreML models converted by CoreMLTools?
It depends on what features your model uses. If it has a layer type that is only supported by macOS 10.15+, then the model cannot be used on earlier versions of macOS. But if your model only uses the most basic features from Core ML version 1, it can run on macOS 10.13. The mlmodel file has a field named specificationVersion; set it to 1 to make the model available on 10.13. Core ML will still give an error if it turns out the model has layer types that 10.13 does not support.

    import coremltools as ct

    spec = ct.utils.load_spec("YourModel.mlmodel")
    spec.specificationVersion = 1
    ct.utils.save_spec(spec, "YourNewModel.mlmodel")
Oct ’20
Reply to coreML xcode 12
I have had all kinds of weird errors with custom layers in Xcode 12, including the app crashing when trying to load the model. My solution: use coremlc from Xcode 11 to compile the model by hand, add the resulting mlmodelc folder to your project, remove the mlmodel file from the project, and load the model by passing in the URL to the mlmodelc folder in your app bundle. You will also need to copy and paste the auto-generated source file from Xcode 11 into your Xcode 12 project (since the project no longer contains an mlmodel file, Xcode won't generate that source file for you anymore).
Oct ’20
Reply to Xcode 12 ML Model Compatibility
Here is a workaround:
1. Open the project in Xcode 11.
2. Go to the auto-generated source file for your model.
3. Copy and paste that code into a new Swift source file and add it to your project.
4. Disable source code auto-generation for Core ML models in the project settings.
5. Now open the project in Xcode 12 and it should no longer give that error.
Oct ’20
Reply to CoreML - Label presented as random string
The labels are stored in your mlmodel file. If you open the mlmodel in Xcode 12, it will display what those labels are. My guess is that instead of actual labels, your mlmodel contains strings such as "CICAgICAwPmveRIJQWdsYWlzX2lv". If that is the case, you can make a dictionary in the app that maps these strings to the real labels, or you can replace the labels inside the mlmodel file by editing it with coremltools. (My e-book Core ML Survival Guide has a chapter on how to replace the labels in a model.)
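As a rough sketch of the coremltools route, assuming the model is a neural network classifier with string labels (other model types keep their labels elsewhere in the spec, and the label names below are made up):

    import coremltools as ct

    spec = ct.utils.load_spec("YourModel.mlmodel")

    # For a neural network classifier the labels live here; printing them
    # should show the "CICAgICAwPmveRIJQWdsYWlzX2lv"-style strings.
    labels = spec.neuralNetworkClassifier.stringClassLabels.vector
    print(list(labels))

    # Hypothetical real labels, in the same order as the model's output classes.
    real_labels = ["Aglais_io", "Papilio_machaon"]
    del labels[:]
    labels.extend(real_labels)

    ct.utils.save_spec(spec, "YourModelFixed.mlmodel")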
Sep ’20
Reply to Style transfer models
If I were a visual artist, I'd skip Create ML and learn to train my own style transfer models using TensorFlow or PyTorch. Create ML is fun for simple stuff, but you'll quickly run into its limitations and there is no way to work around them. For example, the Create ML style transfer models look like they're limited to 512x512 images (perhaps that's just the one for video, I didn't look closely). If you don't already have a Linux computer with a decent GPU lying around, that means training in the cloud (which is often free for a limited number of hours).
Jun ’20
Reply to How to create a core ml model programmatically
Yes, there is a way, but you're not going to like it. ;-) You'll have to write your own training code from scratch so that it can run on the phone. Then, to turn the result into a Core ML model, you write protobuf messages to a new mlmodel file. Finally, you load this mlmodel using the Core ML API, compile it on the device, and then you can use it to make predictions. But if you already have to write your own training code, it also makes sense to write your own inference code and skip Core ML altogether.
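To give an idea of what that protobuf looks like, here is a rough sketch that builds a made-up single-layer model with coremltools' NeuralNetworkBuilder on a Mac; on the device you would have to emit the equivalent protobuf messages yourself, and then MLModel.compileModel(at:) turns the mlmodel file into something you can load:

    import numpy as np
    import coremltools as ct
    from coremltools.models import datatypes
    from coremltools.models.neural_network import NeuralNetworkBuilder

    # Made-up example: one fully-connected layer with 3 inputs and 2 outputs.
    input_features = [("input", datatypes.Array(3))]
    output_features = [("output", datatypes.Array(2))]
    builder = NeuralNetworkBuilder(input_features, output_features)

    # In reality the weights come from your own training code;
    # W is expected with shape (output_channels, input_channels).
    W = np.random.rand(2, 3)
    b = np.zeros(2)
    builder.add_inner_product(name="fc", input_name="input", output_name="output",
                              input_channels=3, output_channels=2,
                              W=W, b=b, has_bias=True)

    mlmodel = ct.models.MLModel(builder.spec)
    mlmodel.save("TinyModel.mlmodel")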
Jun ’20