Could not create inference context
While trying to learn Core ML, I ran into an issue. Using a model from Apple's website (MobileNetV2) I had no problems, but when I tried to use a model I created myself, the request failed with the localized description "Could not create inference context" on the iPhone simulator. After a quick search I tested the same code on the arm64 simulator and it worked just fine. I believe this is an M1-related bug, because another forum thread said it worked without any issues on an Intel Mac, but not on their M1.

Here is the code:

if let data = self.animal.imageData {
    do {
        // Load the generated model class and wrap it for Vision
        let modelFile = try DogorCatmodel1(configuration: MLModelConfiguration())
        let model = try VNCoreMLModel(for: modelFile.model)

        let handler = VNImageRequestHandler(data: data)

        let request = VNCoreMLRequest(model: model) { (request, error) in
            guard let results = request.results as? [VNClassificationObservation] else {
                print("Could not classify")
                return
            }

            for classification in results {
                var identifier = classification.identifier
                identifier = identifier.prefix(1).capitalized + identifier.dropFirst()
                print(identifier)
                print(classification.confidence)
            }
        }

        do {
            try handler.perform([request])
        }
        catch {
            // Prints the localized description of the failure
            // (this is where "Could not create inference context" shows up)
            print(error.localizedDescription)
            print("Invalid")
        }
    }
    catch {
        print(error.localizedDescription)
    }
}
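In case it helps anyone narrow this down, the only workaround I can think of trying is forcing Core ML onto the CPU, since my guess is that the simulator can't create an inference context for the GPU/Neural Engine paths. This is just a sketch of that idea, reusing the DogorCatmodel1 class from above; makeCPUOnlyRequest is a hypothetical helper, and I haven't confirmed that .cpuOnly actually avoids the error:

import CoreML
import Vision

// Sketch: restrict the model to CPU-only execution so the GPU / Neural
// Engine code paths are never used. Whether this sidesteps the
// "Could not create inference context" error on the M1 simulator is a guess.
func makeCPUOnlyRequest() throws -> VNCoreMLRequest {
    let config = MLModelConfiguration()
    config.computeUnits = .cpuOnly   // default is .all

    let modelFile = try DogorCatmodel1(configuration: config)
    let model = try VNCoreMLModel(for: modelFile.model)

    return VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else {
            print("Could not classify")
            return
        }
        for classification in results {
            print(classification.identifier, classification.confidence)
        }
    }
}

If the failure really is tied to the GPU path on the M1 simulator, this would at least show whether CPU-only inference still works there.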