We have some code that draws text into an NSImage and then converts it to an NSBitmapImageRep so we can grab the data and use it as a texture. This has worked fine for many years and on thousands of customers' machines:
image = [[NSImage alloc] initWithSize:size];
[image lockFocus];
// ... draw the text here ...
bitmap = [[NSBitmapImageRep alloc] initWithFocusedViewRect:rect];
[image unlockFocus];
// Generate texture using [bitmap bitmapData]
We now have a customer with a 2019 MacBook that is showing a corrupted texture. We tracked it down to the NSBitmapImageRep being created with floating-point samples. I see there's an initializer that lets us specify the exact format we need for the NSBitmapImageRep. Once we do that, how do we get the data from the NSImage into it? Or is there a way to convert the NSImage directly into the format we need?
I had the same problem when trying to draw a String into a texture. This code works for me on macOS 10.14 and macOS 10.15:
// let rect: NSRect
// let frameSize: NSSize
// let image: NSImage   // the image (or text) to render

let backingScaleFactor = NSScreen.main!.backingScaleFactor

// Create the rep with an explicit 8-bit RGBA format so we never
// get floating-point samples back from the system.
let bitmap = NSBitmapImageRep(bitmapDataPlanes: nil,
                              pixelsWide: Int(frameSize.width * backingScaleFactor),
                              pixelsHigh: Int(frameSize.height * backingScaleFactor),
                              bitsPerSample: 8,
                              samplesPerPixel: 4,
                              hasAlpha: true,
                              isPlanar: false,
                              colorSpaceName: .calibratedRGB,
                              bytesPerRow: Int(frameSize.width * backingScaleFactor) * 4,
                              bitsPerPixel: 32)!
bitmap.size = frameSize

// Make the bitmap the current drawing destination, draw, then restore.
let ctx = NSGraphicsContext(bitmapImageRep: bitmap)!
NSGraphicsContext.saveGraphicsState()
NSGraphicsContext.current = ctx
image.draw(in: rect)
ctx.flushGraphics()
NSGraphicsContext.restoreGraphicsState()
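Because the rep was created with a known 8-bit RGBA, non-planar layout, `bitmap.bitmapData` can then be handed straight to the texture upload. A minimal sketch of that step, assuming an OpenGL texture target (the `textureID` variable is hypothetical; a current GL context is assumed):

```swift
import OpenGL.GL

// Hypothetical upload step: push the bitmap's bytes into an existing texture.
glBindTexture(GLenum(GL_TEXTURE_2D), textureID)
glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA,
             GLsizei(bitmap.pixelsWide), GLsizei(bitmap.pixelsHigh),
             0, GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE),
             bitmap.bitmapData)
```

Note that `bytesPerRow` was set to exactly `pixelsWide * 4` above, so the data is tightly packed and no row-alignment adjustment should be needed.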