Classifying SwiftUI Paths using Core ML

I am working on an app that classifies drawings made by the user. Currently, these drawings are represented as a set of SwiftUI Paths, and I have a trained Core ML model that takes images and outputs class names.

I have written some code that can take images in the form of UIImages and feed them into my classifier, but I am unsure how to adapt it to take Paths.

Here is my current code:

Code Block
import UIKit
import CoreML
import Vision
import ImageIO
import SwiftUI

struct ImageClassifier {
    var classifier = SymbolClassifier()

    func classify(image: CGImage) -> String? {
        let pixelBuffer = image.pixelBuffer(width: 300, height: 300, orientation: CGImagePropertyOrientation.up)!
        let output = try? self.classifier.prediction(image: pixelBuffer)
        return output?.classLabel
    }

    func classifyUIImage(image: UIImage) -> String? {
        guard let imageAsCGImage: CGImage = convertUIImageToCGImage(image: image) else {
            return nil
        }
        return classify(image: imageAsCGImage)
    }

    func classifyPath(path: Path) -> String? {
        //???
        return nil
    }

    func convertUIImageToCGImage(image: UIImage) -> CGImage? {
        let inputImage = CIImage(image: image)!
        let context = CIContext(options: nil)
        return context.createCGImage(inputImage, from: inputImage.extent)
    }
}


Here is the image.pixelBuffer library, and here is the model.
Since you trained the model to use images as input, you will have to draw the SwiftUI paths into an image first.
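One way to do that, sketched below under a few assumptions, is to fill in classifyPath(path:) so it strokes the path's cgPath into a UIGraphicsImageRenderer canvas and hands the resulting UIImage to your existing classifyUIImage(image:). The 300x300 canvas, white background, black stroke, and 4pt line width are guesses about what your training images looked like, so adjust them to match.

Code Block
func classifyPath(path: Path) -> String? {
    // 300x300 matches the size classify(image:) resizes to; change it if your
    // training images used something else.
    let size = CGSize(width: 300, height: 300)
    let renderer = UIGraphicsImageRenderer(size: size)
    let image = renderer.image { context in
        // Opaque white background -- assuming the model was trained on dark
        // strokes over a light background rather than transparent images.
        UIColor.white.setFill()
        context.fill(CGRect(origin: .zero, size: size))

        // Translate and scale the path so its bounding box fits the canvas.
        let bounds = path.boundingRect
        let scale = min(size.width / max(bounds.width, 1),
                        size.height / max(bounds.height, 1))
        let fitted = path.applying(
            CGAffineTransform(translationX: -bounds.minX, y: -bounds.minY)
                .concatenating(CGAffineTransform(scaleX: scale, y: scale))
        )

        // Stroke the path in black; the 4pt line width is a guess and may
        // need tuning to resemble the training data.
        context.cgContext.addPath(fitted.cgPath)
        context.cgContext.setLineWidth(4)
        UIColor.black.setStroke()
        context.cgContext.strokePath()
    }
    return classifyUIImage(image: image)
}

If the path comes from a drawing canvas whose size you already know, you may prefer to render at that size and skip the bounding-box scaling, so the classifier sees the symbol at the same proportions the user drew it.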