Replies
Reply to for 420v, camera output CVPixelBuffer, Y channel value exceed the range [16, 235]
@galad87 "there are super black and super white, that were historically useful " -> more information is available ? "How are you converting from YpCbCr to RGB and back? " -> Yes, code as below. Because the metal texture is uint8(RGBA8), similarly I clamp the rgb result to uint8 after converting from YpCbCr to RGB while procssing by pure cpu . "Are you converting back to limited range or to full range." -> Both from YpCbCr to RGB and from RGB to YpCbCr , the formulas are using video-range as below. How should I handle this out-of-range situation? don't care ? int Y1 = 224 ; int Cb1 = 118 ; int Cr1 = 134 ; float R1 = 1.164 * (Y1 - 16) + 1.792 * (Cr1 - 128); float G1 = 1.164 * (Y1 - 16) - 0.213 * (Cb1 - 128) - 0.534 * (Cr1 - 128); float B1 = 1.164 * (Y1 - 16) + 2.114 * (Cb1 - 128); int Rint = (int)round(R1) ; int Gint = (int)round(G1) ; int Bint = (int)round(B1) ; Rint = fmin(Rint, 255); Rint = fmax(Rint, 0); Gint = fmin(Gint, 255); Gint = fmax(Gint, 0); Bint = fmin(Bint, 255); Bint = fmax(Bint, 0); float Y = 16 + 0.183 * Rint + 0.614 * Gint + 0.062 * Bint; float Cb =128 - 0.101 * Rint - 0.339 * Gint + 0.439 * Bint; float Cr =128 + 0.439 * Rint - 0.399 * Gint - 0.040 * Bint;
Feb ’24
Reply to AVCaptureVideoDataOutput video range value exceed the range
Yes, I found the same. iPhone XR, iOS 17.3.1.
For '420v' (video range), Cb and Cr stay inside [16, 240], but Y goes outside the range [16, 235], e.g. 240 or 255. As a result, after converting to RGB, some RGB values may be negative, and converting back to YUV then gives values different from the original YUV (the maximum difference can be 20).

.....

NSDictionary* options = @{
    (__bridge NSString*)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange),
    //(__bridge NSString*)kCVPixelBufferMetalCompatibilityKey : @(YES),
};
[_videoDataOutput setVideoSettings:options];

.....

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferRef pixelBuffer = imageBuffer;
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t* yBase  = (uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    uint8_t* uvBase = (uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);

    int imageWidth  = (int)CVPixelBufferGetWidth(pixelBuffer);  // 720
    int imageHeight = (int)CVPixelBufferGetHeight(pixelBuffer); // 1280
    int y_width   = (int)CVPixelBufferGetWidthOfPlane (pixelBuffer, 0); // 720
    int y_height  = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, 0); // 1280
    int uv_width  = (int)CVPixelBufferGetWidthOfPlane (pixelBuffer, 1); // 360
    int uv_height = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, 1); // 640
    int y_stride  = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0); // 768 -- 64-byte aligned
    int uv_stride = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1); // 768

    // Check the Y plane
    if (TRUE) {
        for (int i = 0; i < imageHeight; i++) {
            for (int j = 0; j < imageWidth; j++) {
                uint8_t nv12pixel = *(yBase + y_stride * i + j); // index with the row stride to account for alignment
                //if (nv12pixel < 16 || nv12pixel > 235) {
                if (nv12pixel < 10 || nv12pixel > 250) {
                    NSLog(@"%s: y plane out of range, coord (x:%d, y:%d), h-coord (x:%d, y:%d) ; nv12 %u "
                          ,__FUNCTION__
                          ,j ,i        // note: column (x) first, then row (y)
                          ,j/2 ,i/2
                          ,nv12pixel );
                }
            }
        }
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
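The reply states that Cb and Cr stay within [16, 240], but the code above only checks the Y plane. A minimal sketch of the corresponding check for the interleaved CbCr plane of a '420v' (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) buffer is below; the loop body and log text are illustrative, only the plane layout (bytes alternating Cb, Cr across each row of plane 1) is assumed from the pixel format.

// Sketch: check the interleaved CbCr plane against the nominal video range [16, 240].
// Each chroma sample occupies two bytes per row: Cb at the even byte, Cr at the odd byte.
for (int i = 0; i < uv_height; i++) {
    for (int j = 0; j < uv_width; j++) {
        uint8_t cb = *(uvBase + uv_stride * i + 2 * j);     // even byte: Cb
        uint8_t cr = *(uvBase + uv_stride * i + 2 * j + 1); // odd byte: Cr
        if (cb < 16 || cb > 240 || cr < 16 || cr > 240) {
            NSLog(@"%s: uv plane out of range, coord (x:%d, y:%d), Cb %u Cr %u",
                  __FUNCTION__, j, i, cb, cr);
        }
    }
}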
Feb ’24
Reply to MallocCheckHeap is stack address or symbol address ?
The formatted information is as follows ($ marks the beginning of a command; the line after it is the command's result):

---------
$ atos -o ./DerivedData/Build/Products/Debug-iphoneos/xxxx.app.dSYM/Contents/Resources/DWARF/xxxx -arch arm64 -l 0x10225c000 0x10225c000
0x0000000100000000 (in xxxx)
---------
$ atos -o ./DerivedData/Build/Products/Debug-iphoneos/xxxx.app.dSYM/Contents/Resources/DWARF/xxxx -arch arm64 -l 0x10225c000 0x1aefaed70
0x1aefaed70 (nothing else shown)
Dec ’21
Reply to Synchronization mechanism on Mixing Metal and OpenGL Rendering
Adding more information: the following patch is for the "Mixing Metal and OpenGL Rendering in a View" demo.

Changes: only the GL-rendered quad is displayed in the view; AAPLMetalRenderer gets an internal texture (MTLTexture), draws the quad into this internal texture first and then draws it into the 'Interop Texture'; the CADisplayLink is set up at 1 fps (so the phenomenon is easy to see).

Result: the first frame right after the app starts up is black (the quad drawn with Metal is black, while the area outside it is red).

0001-black-at-first-frame.patch
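For reference, a minimal sketch of the two-pass draw described above: render the quad into an internal MTLTexture first, then copy the result into the interop texture. This is illustrative only, not the actual patch; the function and parameter names are hypothetical, and the second pass is done here with a blit encoder for brevity (the patch draws into the interop texture instead).

#import <Metal/Metal.h>

// Illustrative two-pass draw: quad -> internal texture -> interop texture.
static void DrawQuadThroughInternalTexture(id<MTLCommandBuffer> commandBuffer,
                                           id<MTLRenderPipelineState> quadPipeline,
                                           id<MTLBuffer> quadVertices,
                                           id<MTLTexture> internalTexture,
                                           id<MTLTexture> interopTexture)
{
    // Pass 1: draw the quad into the internal texture.
    MTLRenderPassDescriptor *pass = [MTLRenderPassDescriptor renderPassDescriptor];
    pass.colorAttachments[0].texture = internalTexture;
    pass.colorAttachments[0].loadAction = MTLLoadActionClear;
    pass.colorAttachments[0].storeAction = MTLStoreActionStore;
    pass.colorAttachments[0].clearColor = MTLClearColorMake(0, 0, 0, 1);

    id<MTLRenderCommandEncoder> encoder =
        [commandBuffer renderCommandEncoderWithDescriptor:pass];
    [encoder setRenderPipelineState:quadPipeline];
    [encoder setVertexBuffer:quadVertices offset:0 atIndex:0];
    [encoder drawPrimitives:MTLPrimitiveTypeTriangleStrip vertexStart:0 vertexCount:4];
    [encoder endEncoding];

    // Pass 2: copy the internal texture into the GL/Metal interop texture.
    id<MTLBlitCommandEncoder> blit = [commandBuffer blitCommandEncoder];
    [blit copyFromTexture:internalTexture
              sourceSlice:0 sourceLevel:0
             sourceOrigin:MTLOriginMake(0, 0, 0)
               sourceSize:MTLSizeMake(internalTexture.width, internalTexture.height, 1)
                toTexture:interopTexture
         destinationSlice:0 destinationLevel:0
        destinationOrigin:MTLOriginMake(0, 0, 0)];
    [blit endEncoding];
}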
Nov ’21