Issues with loading ML Program into MLModelAsset

Good evening,

I am encountering an issue when trying to use this method:

modelAssetWithSpecificationData:blobMapping:error:

I provide the specification data and a blobMapping.

In the generated spec (model.mlmodel) I can see the weights are referenced this way:

@model_path/weights/weight.bin

I populate my blob mapping with the contents of weight.bin, keyed by a URL that matches the string above.
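
For reference, here is roughly how I build the blob mapping and call the API. This is a minimal sketch: the package paths are placeholders for my actual file locations, and only the @model_path/weights/weight.bin key string comes from the generated spec.

#import <CoreML/CoreML.h>

static MLModelAsset *MakeModelAsset(NSError **error)
{
    // Specification data read from the generated model.mlmodel (placeholder path).
    NSData *specData = [NSData dataWithContentsOfFile:
        @"/path/to/Model.mlpackage/Data/com.apple.CoreML/model.mlmodel"];

    // Weight blob read from the package's weight.bin (placeholder path).
    NSData *weightData = [NSData dataWithContentsOfFile:
        @"/path/to/Model.mlpackage/Data/com.apple.CoreML/weights/weight.bin"];

    // Key the mapping with the same string the spec uses to reference the weights.
    NSDictionary<NSURL *, NSData *> *blobMapping = @{
        [NSURL URLWithString:@"@model_path/weights/weight.bin"] : weightData
    };

    return [MLModelAsset modelAssetWithSpecificationData:specData
                                              blobMapping:blobMapping
                                                    error:error];
}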

However, I don't believe the blobMapping is ever being accessed because I get the following error:

Error: compiler error: Encountered an error while compiling a neural network model: in operation const_0_to_fp16: Cannot interpret @model_path when ModelPath is unset

I can confirm that this operation is the first use of the weights in model.mlmodel.

I have not yet found a way to set ModelPath or to change that string.

Any assistance would be greatly appreciated.

Thanks for the post and sorry for the delay.

Do you get the same results with just the relevant code in a small test project? If so, please share a link to your test project. That'll help us better understand what's going on. If you're not familiar with preparing a test project, take a look at Creating a test project.

Albert Pascual
  Worldwide Developer Relations.
