iOS 12 - CGImage initializer returning nil

I'm trying to create a CGImage from a data provider using this initializer, but it is now returning nil on iOS 12 devices. It works on older iOS versions. What could be the cause of this?

Accepted Reply

Hi all, I was able to fix my problem by using this initializer instead. Thank you all for your help!

Replies

Could you show your exact init call?

Sure!


if
    let cgImage = image.cgImage,
    let maskCGImage = maskImage?.cgImage,
    let provider = maskCGImage.dataProvider,
    let croppedImage = cgImage.cropping(to: CGRect(x: (image.size.width - maskSize.width) / 2,
                                                   y: (image.size.height - maskSize.height) / 2,
                                                   width: maskSize.width,
                                                   height: maskSize.height)),
    let cgMask = CGImage(width: maskCGImage.width,
                         height: maskCGImage.height,
                         bitsPerComponent: maskCGImage.bitsPerComponent,
                         bitsPerPixel: maskCGImage.bitsPerPixel,
                         bytesPerRow: maskCGImage.bytesPerRow,
                         space: CGColorSpaceCreateDeviceGray(),
                         bitmapInfo: CGBitmapInfo.byteOrderMask,
                         provider: provider,
                         decode: nil,
                         shouldInterpolate: true,
                         intent: CGColorRenderingIntent.defaultIntent),
    let maskedImage = croppedImage.masking(cgMask),
    let maskedImageAtTargetSize = UIImage(cgImage: maskedImage).scaled(
        by: targetHeightForFace / CGFloat(maskedImage.height)
    )
{


maskImage is an oval shape created by rendering a CAShapeLayer in a UIGraphicsContext. My goal is to apply this mask shape to an image using the masking(_:) function.
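For context, the mask image is produced roughly like this (a sketch, not the exact original code — the function name and the use of UIGraphicsImageRenderer are assumptions):

```swift
import UIKit

// Sketch: render an oval CAShapeLayer into a UIImage to use as a mask.
// `makeOvalMaskImage` is a hypothetical helper; the thread does not show the original.
func makeOvalMaskImage(size maskSize: CGSize) -> UIImage {
    let shapeLayer = CAShapeLayer()
    shapeLayer.frame = CGRect(origin: .zero, size: maskSize)
    shapeLayer.path = UIBezierPath(ovalIn: shapeLayer.frame).cgPath
    shapeLayer.fillColor = UIColor.white.cgColor

    // Render the layer into a bitmap graphics context.
    let renderer = UIGraphicsImageRenderer(size: maskSize)
    return renderer.image { context in
        shapeLayer.render(in: context.cgContext)
    }
}
```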


Also, all of the conditions before the let cgMask one succeed, so the dataProvider property is definitely non-nil.

Could you test all the values passed to the init?


if
    let cgImage = image.cgImage,
    let maskCGImage = maskImage?.cgImage,
    let provider = maskCGImage.dataProvider,
    let croppedImage = cgImage.cropping(to: CGRect(x: (image.size.width - maskSize.width) / 2,
                                                   y: (image.size.height - maskSize.height) / 2,
                                                   width: maskSize.width,
                                                   height: maskSize.height)) {

    print(maskCGImage.width)
    print(maskCGImage.height)
    print(maskCGImage.bitsPerComponent)
    print(maskCGImage.bitsPerPixel)
    print(maskCGImage.bytesPerRow)
    print(CGColorSpaceCreateDeviceGray())
    print(CGBitmapInfo.byteOrderMask)
    print(provider)
    print(true)
    print(CGColorRenderingIntent.defaultIntent)

    if let cgMask = CGImage(width: maskCGImage.width,
                            height: maskCGImage.height,
                            bitsPerComponent: maskCGImage.bitsPerComponent,
                            bitsPerPixel: maskCGImage.bitsPerPixel,
                            bytesPerRow: maskCGImage.bytesPerRow,
                            space: CGColorSpaceCreateDeviceGray(),
                            bitmapInfo: CGBitmapInfo.byteOrderMask,
                            provider: provider,
                            decode: nil,
                            shouldInterpolate: true,
                            intent: CGColorRenderingIntent.defaultIntent),
       let maskedImage = croppedImage.masking(cgMask),
       let maskedImageAtTargetSize = UIImage(cgImage: maskedImage).scaled(
           by: targetHeightForFace / CGFloat(maskedImage.height)
       ) {
        // ...
    }
}


The above code gave the following output:

224
335
8
32
896
<CGColorSpace 0x283f506c0> (kCGColorSpaceDeviceGray)
CGBitmapInfo(rawValue: 28672)
<CGDataProvider 0x10cd3d8f0 [0x2450f95e0]>
true
CGColorRenderingIntent


I also checked provider.data: it is non-nil, and it looks like a normal array of RGBA pixel values.

Did you also test which one is nil?

let maskedImage = croppedImage.masking(cgMask)

or

let maskedImageAtTargetSize = UIImage(cgImage: maskedImage).scaled(…)
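One way to find out is to split the chained optional bindings into nested ifs so the failing step prints itself (a sketch, reusing the variable names from the code above):

```swift
// Sketch: unchain the bindings to isolate which step returns nil.
if let cgMask = CGImage(width: maskCGImage.width,
                        height: maskCGImage.height,
                        bitsPerComponent: maskCGImage.bitsPerComponent,
                        bitsPerPixel: maskCGImage.bitsPerPixel,
                        bytesPerRow: maskCGImage.bytesPerRow,
                        space: CGColorSpaceCreateDeviceGray(),
                        bitmapInfo: CGBitmapInfo.byteOrderMask,
                        provider: provider,
                        decode: nil,
                        shouldInterpolate: true,
                        intent: .defaultIntent) {
    if let maskedImage = croppedImage.masking(cgMask) {
        print("masking succeeded:", maskedImage)
    } else {
        print("masking(_:) returned nil")
    }
} else {
    print("CGImage init returned nil")
}
```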

Why are you using `CGBitmapInfo.byteOrderMask`?

That value is intended for bit-masking, to retrieve only the byte-order bits of a CGBitmapInfo; it is not a valid value for an actual CGBitmapInfo.


You need to create a valid CGBitmapInfo by combining the proper alpha info, pixel format info, byte order info and float info.


I'm not sure whether it ever worked, or whether it really is the cause of your issue, but you should pass a valid value that describes your bitmap and is consistent with the other parameters.
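For example, a valid CGBitmapInfo could be built like this (a sketch; the right combination depends on the actual pixel layout of your mask bitmap):

```swift
import CoreGraphics

// Sketch: build a valid CGBitmapInfo instead of passing the .byteOrderMask constant.

// For an 8-bit-per-component grayscale bitmap with no alpha channel:
let grayBitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue)

// For a 32-bits-per-pixel RGBA bitmap (which is what the 32 bitsPerPixel
// printed above suggests), a premultiplied-alpha-last layout would be:
let rgbaBitmapInfo = CGBitmapInfo(
    rawValue: CGImageAlphaInfo.premultipliedLast.rawValue
        | CGBitmapInfo.byteOrder32Big.rawValue
)
```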

Having the same issue: CGImage.init returns nil on iOS 12, but worked fine on previous OS versions.

Maybe I found a solution. Try changing CGColorSpaceCreateDeviceGray() to CGColorSpaceCreateDeviceRGB().
