Core ML Model Performance report shows Neural Engine support, but the model does not run on it

The WWDC session "Deploy machine learning and AI models on-device with Core ML" says the performance report shows which compute unit each operation runs on and why an operation cannot run on the Neural Engine.

I tested my model and the report shows a gray checkmark next to the Neural Engine, indicating the ops can run on it. However, the model is not executing on the Neural Engine but on the CPU. Why is this happening?
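
One way to sanity-check this (not from the original post, just a sketch) is to constrain the compute units when loading the model, so the GPU is out of the picture and any fallback to the CPU is unambiguous in the report. ImageModel below is a placeholder for the Xcode-generated model class:

```swift
import CoreML

// Sketch only: ImageModel stands in for the Xcode-generated model class.
// Restricting computeUnits to .cpuAndNeuralEngine rules out the GPU, so any
// work the Neural Engine cannot take has to fall back to the CPU, which makes
// the fallback easier to spot in a performance report or Instruments trace.
func loadForNeuralEngine() throws -> ImageModel {
    let configuration = MLModelConfiguration()
    configuration.computeUnits = .cpuAndNeuralEngine
    return try ImageModel(configuration: configuration)
}
```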

This is probably because the prior op (img1_1_cast_fp16) could not run on the Neural Engine. As I understand it, switching between compute units incurs overhead, and Core ML sometimes decides not to switch back to avoid paying that overhead.
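
If you want to confirm which device each op can use without relying on the Xcode UI, the same WWDC session also introduces the MLComputePlan API (macOS 14.4 / iOS 17.4 and later). A rough sketch, assuming an ML Program model compiled to an .mlmodelc:

```swift
import CoreML

// Rough sketch: prints, for every operation in an ML Program, which compute
// devices support it and which one Core ML prefers. modelURL should point at
// a compiled .mlmodelc bundle.
@available(macOS 14.4, iOS 17.4, *)
func dumpComputePlan(for modelURL: URL) async throws {
    let plan = try await MLComputePlan.load(contentsOf: modelURL,
                                            configuration: MLModelConfiguration())

    guard case .program(let program) = plan.modelStructure,
          let mainFunction = program.functions["main"] else {
        print("Not an ML Program model")
        return
    }

    for operation in mainFunction.block.operations {
        let usage = plan.deviceUsage(for: operation)
        print(operation.operatorName,
              "preferred:", usage?.preferred as Any,
              "supported:", usage?.supported ?? [])
    }
}
```

An op whose supported list does not include the Neural Engine would explain a partition like the one around img1_1_cast_fp16.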

Per the WWDC session video, hovering should tell you why it’s not supported. Maybe that is broken?

Thanks @smpanaro! Hovering should reveal the details. @timyao18, could you please try the model with Xcode 16 beta 2? If hovering doesn't work, please file a radar with the model included.
