Latest reply on Dec 4, 2019 1:44 PM by tova18



We need to do some very precise color analysis on raw data, and we are confused about the layout of the raw pixel buffer.


        We are using the only raw format available: kCVPixelFormatType_14Bayer_RGGB, described as "Bayer 14-bit Little-Endian, packed in 16-bits, ordered R G R G... alternating with G B G B..."


        A print-out of the photo.pixelBuffer metadata from the didFinishProcessingPhoto callback shows the following:


<CVPixelBuffer 0x281818140 width=4032 height=3024 bytesPerRow=8064 pixelFormat=rgg4 iosurface=0x282b140d0 attributes={
  PixelFormatDescription = {
    BitsPerBlock = 16;
    BitsPerComponent = 8;
    ContainsAlpha = 0;
    ContainsGrayscale = 0;
    ContainsRGB = 1;
    ContainsYCbCr = 0;
    FillExtendedPixelsCallback = {length = 24, bytes = 0x000000000000000090fe27a3010000000000000000000000};
    PixelFormat = 1919379252;
  };
} propagatedAttachments={
} nonPropagatedAttachments={
}>

This metadata suggests that each color channel (aka component) in RGGB is 8 bits long and that a pixel (aka block) is 16 bits long, so one pixel comprises two color channels (i.e. one pixel is one R and one G, or one G and one B). So far, this makes sense. The confusion begins, however, when we consider that each pixel is supposed to be 14 bits, packed into 16 bits, in little-endian order.


        As an example, if the first four bytes at the pixelBuffer's baseAddress are 52 10 2c 0b, what is the value of the first R and the first G? This can be broken up into two separate questions:

1) If a pixel block is 14 bits, packed into 16 bits, where does the padding happen? Is each component actually 7 bits, padded with a 0 at the most significant bit?

2) Given little-endian order, does this mean the first pixel value is 0x1052 and the second is 0x0B2C? If so, which is our red channel and which is our green channel?


Furthermore, we have read that the iPhone camera has a 10-bit sensor, so we would actually expect 10 bits of data per color channel, rather than 8 (much less 7). If this is supposed to be the raw photo output, what is happening to the other bits of data from the sensor?


        And finally, can we assume the pixelBuffer contains only the image data, or are there any headers at the start that we need to consider?