Reply to Does Core ML 3 support Python 3?
Core ML doesn't support Python at all, as it's the API for doing inference on iOS and macOS devices. It's written in Obj-C and C++ and has an Obj-C and Swift API.

coremltools, which is a Python package for building Core ML models and for converting them from other tools such as TensorFlow, works on both Python 2.7 and Python 3.x. (It used to work only with Python 2, but that was fixed several years ago. Since Python 2 is deprecated, you should really use it with Python 3.)
May ’20
Reply to Record video and Classify using YOLO at the same time
It depends a little on the device you're using, but YOLO can be quite slow (especially the full version of YOLO). If YOLO runs at 15 FPS, for example, and you block the AVCapture thread, AVCapture will automatically drop frames because your code can't keep up. One solution is to use a queue of size one and have Core ML read from this queue on a separate thread. The AVCapture thread appends each new frame to this queue, saves the frame to the movie file using AVAssetWriter, and then waits for the next frame to come in. (Because the queue has size one, this effectively always overwrites the old frame.) Now the AVCapture thread is never blocked for long, and you won't drop any frames in the video. (Of course, YOLO will not see every frame.)
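A minimal sketch of that size-one "queue" in Swift. The type and method names (LatestFrameSlot, write, take) are made up for illustration; the point is only that writing overwrites any unconsumed frame, so the capture thread never waits on inference:

```swift
import Foundation
import CoreVideo

/// A single-slot "queue": writing overwrites the previous frame,
/// so the inference thread always picks up the most recent one.
final class LatestFrameSlot {
    private var pixelBuffer: CVPixelBuffer?
    private let lock = NSLock()

    /// Called from the AVCapture thread; only briefly holds the lock.
    func write(_ frame: CVPixelBuffer) {
        lock.lock()
        pixelBuffer = frame   // overwrites any frame Core ML hasn't read yet
        lock.unlock()
    }

    /// Called from the Core ML thread; returns nil if no new frame arrived.
    func take() -> CVPixelBuffer? {
        lock.lock()
        defer { lock.unlock() }
        let frame = pixelBuffer
        pixelBuffer = nil
        return frame
    }
}
```

The Core ML side would poll (or be signaled) on its own DispatchQueue, call take(), and run the model on whatever frame it gets back.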
Jun ’20
Reply to When is Preprocessing necessary
Vision does indeed resize your images, according to the VNImageCropAndScaleOption that you set on the VNCoreMLRequest object. If you're not using Vision, but you're using the Core ML API, you'll have to do the resizing yourself. Or you can give the model flexible inputs so that it can handle images of different sizes.
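For reference, this is roughly where that option lives. Assume YourModel stands in for the auto-generated class of your mlmodel; the rest is standard Vision request setup:

```swift
import Vision
import CoreML

// Wrap the Core ML model for use with Vision.
let coreMLModel = try VNCoreMLModel(
    for: YourModel(configuration: MLModelConfiguration()).model)

let request = VNCoreMLRequest(model: coreMLModel) { request, _ in
    if let results = request.results as? [VNClassificationObservation] {
        print(results.first?.identifier ?? "no result")
    }
}

// This is where Vision's resizing behavior is configured:
request.imageCropAndScaleOption = .centerCrop   // or .scaleFill, .scaleFit
```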
Jun ’20
Reply to Core ML Converters and custom libraries on top of Pytorch
With coremltools 4, you can directly convert PyTorch models to Core ML. To do this, you must first create a "trace" by running the PyTorch model on a dummy input; the trace captures all the operations the model performs. It's quite possible that the trace of a fastai model doesn't really contain any custom operations, since fastai is mostly a wrapper around PyTorch. So I'd simply try the conversion first -- it might just work. ;-)
Jun ’20
Reply to How to create a core ml model programmatically
Yes, there is a way but you're not going to like it. ;-) You'll have to write your own training code from scratch so that it can run on the phone. Then, to turn the result into a Core ML model, you can write protobuf messages to a new mlmodel file. Finally, you can load this mlmodel using the Core ML API, compile it on the device, and then use it to make predictions. But if you already have to write your own training code, it also makes sense to write your own inference code and skip Core ML altogether.
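The last step — compiling and loading the generated mlmodel on the device — might look roughly like this (the file name and location are placeholders for wherever your app wrote the file):

```swift
import CoreML

// Placeholder: wherever your app saved the generated mlmodel file.
let documents = FileManager.default.urls(for: .documentDirectory,
                                         in: .userDomainMask)[0]
let modelURL = documents.appendingPathComponent("MyModel.mlmodel")

do {
    // Compiles the .mlmodel into an .mlmodelc the device can actually run.
    let compiledURL = try MLModel.compileModel(at: modelURL)
    let model = try MLModel(contentsOf: compiledURL)
    // `model` is now ready for predictions via prediction(from:).
} catch {
    print("Failed to compile or load model: \(error)")
}
```

Note that compileModel(at:) writes the compiled model to a temporary location; if you want to reuse it across launches, move it somewhere permanent.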
Jun ’20
Reply to Style transfer models
If I were a visual artist, I'd skip Create ML and learn to train my own style transfer models using TensorFlow or PyTorch. Create ML is fun for simple stuff, but you'll quickly run into its limitations, and there is no way to work around them. For example, the Create ML style transfer models look like they're limited to 512x512 images (perhaps that's just the one for video; I didn't look closely). If you don't already have a Linux computer with a nice GPU lying around, that means doing the work in the cloud (which is often free for a number of hours).
Jun ’20
Reply to CoreML - Label presented as random string
The labels are stored in your mlmodel file. If you open the mlmodel in Xcode 12, it will display what those labels are. My guess is that instead of actual labels, your mlmodel contains "CICAgICAwPmveRIJQWdsYWlzX2lv" and so on. If that is the case, you can make a dictionary in the app that maps "CICAgICAwPmveRIJQWdsYWlzX2lv" and so on to the real labels, or you can replace these labels inside the mlmodel file by editing it using coremltools. (My e-book Core ML Survival Guide has a chapter on how to replace the labels in the model.)
Sep ’20
Reply to Xcode 12 ML Model Compatibility
Here is a workaround:

1. Open the project in Xcode 11.
2. Go to the auto-generated source file for your model.
3. Copy and paste that code into a new Swift source file and add it to your project.
4. Disable source code auto-generation for Core ML models in the project settings.
5. Open the project in Xcode 12; it should no longer give that error.
Oct ’20