I used the following code to decode a PNG-format image into the allocated memory block imageData:
- (void)decodeImage:(UIImage*)image
{
    // Note: image.size is in points; this assumes a scale-1 image, so points == pixels.
    size_t width = (size_t)image.size.width;
    size_t height = (size_t)image.size.height;
    size_t bytesPerRow = width * 4;

    GLubyte* imageData = (GLubyte*)malloc(width * height * 4);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef imageContext = CGBitmapContextCreate(imageData, width, height, 8, bytesPerRow,
                                                      colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault);
    CGContextDrawImage(imageContext, CGRectMake(0.0, 0.0, width, height), image.CGImage);
    CGContextRelease(imageContext);
    CGColorSpaceRelease(colorSpace);

    // Logging the RGBA value at [row, col] = [330, 150] here gives the wrong alpha value.
    int targetRow = 330;
    int targetCol = 150;
    size_t pixelIndex = (size_t)targetRow * bytesPerRow + (size_t)targetCol * 4;
    u_int32_t r = (u_int32_t)imageData[pixelIndex + 0];
    u_int32_t g = (u_int32_t)imageData[pixelIndex + 1];
    u_int32_t b = (u_int32_t)imageData[pixelIndex + 2];
    u_int32_t a = (u_int32_t)imageData[pixelIndex + 3];
    NSLog(@"RGBA at [%d, %d] = %u, %u, %u, %u", targetRow, targetCol, r, g, b, a);

    free(imageData);
}
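As a sanity check (not part of the original code), the alpha info of the source CGImage can be inspected before drawing, to rule out the UIImage itself having lost its alpha channel; a minimal sketch with a hypothetical helper method:

// Diagnostic sketch (helper name is assumed, for illustration only):
// logs whether the source CGImage still reports an alpha channel.
- (void)logAlphaInfoOfImage:(UIImage*)image
{
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(image.CGImage);
    BOOL hasAlpha = (alphaInfo != kCGImageAlphaNone &&
                     alphaInfo != kCGImageAlphaNoneSkipFirst &&
                     alphaInfo != kCGImageAlphaNoneSkipLast);
    NSLog(@"source alphaInfo = %d, hasAlpha = %d", (int)alphaInfo, (int)hasAlpha);
}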
The CGBitmapInfo used is kCGImageAlphaPremultipliedLast | kCGBitmapByteOrderDefault.
The problem is that the alpha channel is lost in the decoded result:
For example, the RGBA values of the target png at [row, col] = [330, 150] are R = 240, G = 125, B = 106, A = 80.
If the decoding is correct, the expected result should be R = 75, G = 39, B = 33, A = 80 with AlphaPremultiplied.
However, after decoding the png on iOS17, the result is R = 75, G = 39, B = 33, A = 255, where the alpha values are all forced to 255.
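For reference, premultiplication scales each channel by alpha/255, which is where the expected values above come from; a minimal check (the exact integer rounding CoreGraphics uses may differ by one):

// Premultiplied-alpha arithmetic for the example pixel (straight RGBA = 240, 125, 106, 80):
u_int32_t alpha = 80;
u_int32_t premulR = (240 * alpha + 127) / 255;   // 75
u_int32_t premulG = (125 * alpha + 127) / 255;   // 39
u_int32_t premulB = (106 * alpha + 127) / 255;   // 33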
Xcode Version 15.0 beta (15A5160n). iPhone 14 Pro.
The png file: