pixelFormat (11) is not a valid MTLPixelFormat

I'm receiving this fatal exception on iOS 13 beta 1 when using my Core ML image classification model:


MTLDebugValidateMTLPixelFormat:1388: failed assertion `pixelFormat (11) is not a valid MTLPixelFormat.'


import CoreML
import Vision

// Rasterize the current drawing and load the Core ML model.
guard let drawingImage = canvasView.currentDrawing.rasterized()?.cgImage,
    let model = try? VNCoreMLModel(for: Symbols().model)
    else { return }

// Classification results are delivered asynchronously to the completion handler.
let classificationRequest = VNCoreMLRequest(model: model) { [weak self] (request, error) in
    self?.processClassifications(for: request, error: error)
}
classificationRequest.imageCropAndScaleOption = .centerCrop

let handler = VNImageRequestHandler(cgImage: drawingImage)
try? handler.perform([classificationRequest]) // crashes here
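
One thing I haven't tried yet is forcing Vision onto the CPU, in case it's the Metal path that trips the pixel-format validation. A minimal sketch (untested on my side):


// Untested idea: usesCPUOnly skips Vision's GPU path entirely, which should
// sidestep the MTLPixelFormat assertion if Metal is the culprit.
classificationRequest.usesCPUOnly = true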

Replies

I do receive the exact same error when I run the following code:


    let cgOutputImage = ciContext.createCGImage(
      outputImage,
      from: outputImage.extent,
      format: CIFormat.BGRA8,
      colorSpace: .p3OrSRGB
    )


on the iOS Simulator under macOS 10.15 beta 2 with Xcode 11.0 beta 2. The same happened with the first beta versions.

It is a Metal Core Image context:


(lldb) po ciContext
    priority: default
    workingSpace:  (kCGColorSpaceICCBased; kCGColorSpaceModelRGB; sRGB IEC61966-2.1; extended range)
    workingFormat: RGBAh
    downsampleQuality: Low


and the `outputImage` (input to the `createCGImage` call) is defined as follows:


(lldb) po outputImage
  kernel coreImageKernel(u_image,u_lookupTable,u_intensity=1,u_horizontalTileCount=5,u_verticalTileCount=5,u_resolution=128) extent=[0 0 144 144]
    affine [1 0 0 -1 0 144] extent=[0 0 144 144]
      colormatch sRGB_to_workingspace extent=[0 0 144 144]
        IOSurface 0x600001a14b20(176) seed:1 BGRA8 extent=[0 0 144 144]
    affine [1 0 0 -1 0 128] extent=[0 0 128 128] opaque
      colormatch GenericGrayGamma2_2_to_workingspace extent=[0 0 128 128] opaque
        IOSurface 0x600001a0c4c0(174) seed:1 L8 alpha_one extent=[0 0 128 128] opaque


The same code runs without any errors on a real device.
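
In the meantime I'm considering falling back to a software-rendered context when running in the simulator. A minimal sketch, assuming the failure is limited to the Metal-backed context (the `makeCIContext` helper is just my own name, not an API):


    import CoreImage
    import Metal

    // Metal-backed context on device; CPU renderer in the simulator,
    // where the MTLPixelFormat validation currently fails.
    func makeCIContext() -> CIContext {
        #if targetEnvironment(simulator)
        // The software renderer bypasses Metal entirely.
        return CIContext(options: [.useSoftwareRenderer: true])
        #else
        if let device = MTLCreateSystemDefaultDevice() {
            return CIContext(mtlDevice: device)
        }
        return CIContext()
        #endif
    }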

Had the same issue, but figured it out: it's the pixel format of the texture. I was loading a JPG file into a UIImage and handing that to the shader. It worked fine on a real device, but not in the simulator.

I took the JPG, opened it in Photoshop, changed the color space to 'Apple RGB', and saved it as a PNG instead of a JPG. It now works great both in the simulator and on a device.
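
If you'd rather not round-trip assets through Photoshop, the same normalization can be done at load time by redrawing the image into a fresh bitmap context, so the backing CGImage ends up in a plain standard-range 32-bit format. A sketch of that idea (the `normalized` helper is my own name, not an API, and I haven't tested it against every source format):


    import UIKit

    // Redraw the image so its backing CGImage uses a standard-range
    // 32-bit layout, regardless of how the source file was encoded.
    func normalized(_ image: UIImage) -> UIImage {
        let format = UIGraphicsImageRendererFormat.default()
        format.preferredRange = .standard // force standard-range sRGB
        let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
        return renderer.image { _ in
            image.draw(in: CGRect(origin: .zero, size: image.size))
        }
    }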

You can find more information in Apple's developer documentation, where they specifically say that certain pixel formats will not work in the simulator.

Hope this helps!