How To Resize An Image and Retain Wide Color Gamut

I'm trying to resize NSImages on macOS. I'm doing so with an extension like this.

extension NSImage {
    
    // MARK: Resizing
    
    /// Resize the image to the given size.
    ///
    /// - Parameter size: The size to resize the image to.
    /// - Returns: The resized image.
    func resized(toSize targetSize: NSSize) -> NSImage? {
        let frame = NSRect(x: 0, y: 0, width: targetSize.width, height: targetSize.height)
        guard let representation = self.bestRepresentation(for: frame, context: nil, hints: nil) else {
            return nil
        }
        let image = NSImage(size: targetSize, flipped: false, drawingHandler: { (_) -> Bool in
            return representation.draw(in: frame)
        })
        
        return image
    }

}

The problem is that, as far as I can tell, the image that comes out of the drawing handler has lost the color profile of the original image rep. I'm testing it with a wide-color-gamut image, attached.

It becomes pure red when examining the resulting image.

If this were UIKit, I'd use UIGraphicsImageRenderer and select the right UIGraphicsImageRendererFormat.Range. So I suspect I need to use NSGraphicsContext here to do the rendering, but I can't see what I would set on it to make it use wide color, or how I'd use it.

Replies

Documentation:

If the pixel density or color space of the destination graphics context changes, though, the image representation throws away any caches and executes the block again to obtain a new version of the image.

Although the NSCustomImageRep reports a Generic RGB color space, it doesn't hold a backing cgImage. The backing cgImage is only created on demand, based on the destination cgContext.

Xcode Quick Look (bug?):

<<CGColorSpace 0x600001288180> (kCGColorSpaceICCBased; kCGColorSpaceModelRGB; sRGB IEC61966-2.1)>

Directly asking for the cgImage:

<<CGColorSpace 0x600002935aa0> (kCGColorSpaceICCBased; kCGColorSpaceModelRGB; Color LCD)>
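The comparison above can be reproduced with something like the following sketch, which asks the resized NSImage for its CGImage and prints the color space it was rendered into (`resized(toSize:)` is the extension from the question; `original` stands in for the test image):

```swift
// Hypothetical check: obtain the backing CGImage of the resized image
// and inspect which color space it was actually rendered with.
var rect = NSRect(x: 0, y: 0, width: 250, height: 250)
if let resized = original.resized(toSize: rect.size),
   let cgImage = resized.cgImage(forProposedRect: &rect, context: nil, hints: nil) {
    // With a nil context, the color space depends on the current display.
    print(cgImage.colorSpace as Any)
}
```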

If you use this method to resize an image and assign it to an NSImageView, it will display correctly.

For this reason I wouldn't recommend using the NSImage drawing handler to resize images. The only way to make it work is to set a custom context inside the drawing handler.
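One alternative is to skip the drawing handler entirely and draw straight into a bitmap rep that is explicitly tagged with a wide-gamut color space. A minimal sketch, assuming Display P3 is the gamut to preserve (the method name and the 16-bit depth are my choices, not anything from the thread):

```swift
import AppKit

extension NSImage {

    /// Sketch: resize by drawing into an explicitly wide-gamut bitmap,
    /// so the result is a fixed bitmap rather than a cached drawing block.
    func resizedPreservingWideGamut(to targetSize: NSSize) -> NSImage? {
        guard let rep = NSBitmapImageRep(
            bitmapDataPlanes: nil,
            pixelsWide: Int(targetSize.width),
            pixelsHigh: Int(targetSize.height),
            bitsPerSample: 16,              // deep pixels to hold wide-gamut values
            samplesPerPixel: 4,
            hasAlpha: true,
            isPlanar: false,
            colorSpaceName: .calibratedRGB,
            bytesPerRow: 0,
            bitsPerPixel: 0
        )?.retagging(with: .displayP3) else { return nil }
        rep.size = targetSize

        // Make the bitmap the current drawing destination.
        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: rep)
        draw(in: NSRect(origin: .zero, size: targetSize),
             from: .zero,                   // draw the entire source image
             operation: .copy,
             fraction: 1.0)
        NSGraphicsContext.restoreGraphicsState()

        let result = NSImage(size: targetSize)
        result.addRepresentation(rep)
        return result
    }

}
```

Because the drawing happens once, into a context whose color space you control, there is no per-destination re-rendering and no retained 5000x5000 source hanging off a drawing block.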

The image representation copies the block and stores it for later use.

This is a huge memory waste if it's a bitmap image rep, e.g. a 5000x5000 image hanging around in memory.

Xcode 15's Quick Look renders the following as one solid red square on a P3 display, while an NSImageView displays it as two different shades of red.

let size = CGSize(width: 250, height: 250)
let image = NSImage(size: size, flipped: false) { drawRect -> Bool in
    let rects = drawRect.divided(atDistance: drawRect.size.width / 2, from: .minXEdge)
    // Left half: Display P3 red; right half: sRGB red.
    NSColor(displayP3Red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0).set()
    NSBezierPath(rect: rects.slice).fill()
    NSColor(red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0).set()
    NSBezierPath(rect: rects.remainder).fill()
    return true
}
imageView.image = image