I implemented a custom PyTorch layer on both CPU and GPU, following [Hollemans' amazing blog](https://machinethink.net/blog/coreml-custom-layers). The CPU version works fine, but the GPU implementation's `encode` function is never called; the layer always runs on the CPU. I have already checked the `coremltools.convert()` options and set `compute_units=coremltools.ComputeUnit.CPU_AND_GPU`, but it still does not work. The same problem is mentioned in https://stackoverflow.com/questions/51019600/why-i-enabled-metal-api-but-my-coreml-custom-layer-still-run-on-cpu and https://developer.apple.com/forums/thread/695640. Any help with this would be appreciated.
System Information
- macOS: 11.6.1 (Big Sur)
- Xcode: 12.5.1
- coremltools: 5.1.0
- test device: iPhone 11