We’ve been trying several publicly available style transfer models on iOS, using both Metal Performance Shaders CNN (MPSCNN) and Core ML models from mlmodelzoo, and we’re not happy with the performance yet.
For MPSCNN we use 512 x 512 output images, and for mlmodelzoo’s models we have to use 480 x 640. On an iPhone 6, the best MPSCNN model gave us roughly 0.8 secs / image, while mlmodelzoo’s gave us nearly 5 secs / image. Note that this doesn’t mean Core ML itself is slower; the specific model was the deciding factor.
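For reference, we time a single Core ML prediction roughly along these lines. This is just a sketch: the model URL, the 480 x 640 input size, and the `"image"` feature name are placeholders for whichever mlmodelzoo model is being tested.

```swift
import CoreML
import CoreVideo
import QuartzCore

// Create an empty BGRA buffer matching the model's expected input size
// (480 x 640 for the mlmodelzoo models we tried).
func makePixelBuffer(width: Int = 480, height: Int = 640) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                        kCVPixelFormatType_32BGRA, nil, &buffer)
    return buffer
}

// Rough per-image timing for a compiled Core ML style transfer model.
// "image" is a placeholder for the model's actual input feature name.
func benchmarkStyleTransfer(modelURL: URL,
                            pixelBuffer: CVPixelBuffer) throws -> CFTimeInterval {
    let model = try MLModel(contentsOf: modelURL)
    let input = try MLDictionaryFeatureProvider(
        dictionary: ["image": MLFeatureValue(pixelBuffer: pixelBuffer)])

    // Warm up once so first-run setup cost doesn't skew the measurement.
    _ = try model.prediction(from: input)

    let start = CACurrentMediaTime()
    _ = try model.prediction(from: input)
    return CACurrentMediaTime() - start
}
```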
Looking at some well-known apps out there, we saw that Apple Clips has two style transfer filters that genuinely run in real time.
Our best results seem to be close to the Facebook app’s (which also has a few style transfer filters).
Any idea what models Apple Clips uses?