1 Reply
      Latest reply on Mar 15, 2019 10:24 AM by _mc
      Gareth-Trinity Digital Level 1 Level 1 (0 points)

        On the simulator this function works fine.

        It also works fine if I remove the kCVPixelBufferIOSurfacePropertiesKey from the BufferPool attributes.

         

        Here is the code that works okay on simulator.

        I tried an iPhone 5 and iPad Air 2 - result is the same, -6662 (Alloc failed) is returned immediately.
        As no other memory was being allocated, the device cannot be low on memory.

        I also tried various heights and widths, including very small ones like 16 x 16, but got the same result.

        It only works if I remove kCVPixelBufferIOSurfacePropertiesKey.

         

        But as I need to use the buffers with AVSampleBufferDisplayLayer to render to the screen, I have to use this IOSurface property.

         

        Any ideas?

        This feels like maybe a permissions thing?

         

        I put this code directly in viewDidLoad() so it's as soon as the app starts but it still fails.

        Normally this code is executed in a video rendering task dispatched as a concurrent async task.

         

        As I say, it works on the simulator, just never on a target device.

         

        Totally lost. Does anyone have any idea why this might be failing?

        Perhaps I am missing some key attributes?

         

          let pool_attrs = CFDictionaryCreate(nil, nil, nil, 0, nil, nil)

          let pix_buf_attrs = [
              kCVPixelBufferWidthKey as String : 128,
              kCVPixelBufferHeightKey as String : 128,
              kCVPixelBufferPixelFormatTypeKey as String : kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
              kCMSampleAttachmentKey_DisplayImmediately as String : true,
              kCVPixelBufferIOSurfacePropertiesKey as String : true,
          ] as CFDictionary

          var pixel_buffer_pool: CVPixelBufferPool?
          let ret = CVPixelBufferPoolCreate(kCFAllocatorDefault, pool_attrs, pix_buf_attrs, &pixel_buffer_pool)
          if ret != 0 {
              print("RET : \(ret)")
          }

          /*
           * Get a pixel buffer into which we write YUV data
           */
          var pixel_buffer: CVPixelBuffer?
          let y = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pixel_buffer_pool!, &pixel_buffer)
          if y != 0 {
              // always returns -6662 when kCVPixelBufferIOSurfacePropertiesKey is set
          }
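        [Editor's note] A likely cause, sketched below as an assumption rather than a confirmed fix: kCVPixelBufferIOSurfacePropertiesKey expects a CFDictionary of IOSurface properties (an empty dictionary requests default IOSurface backing), not a Bool, and the simulator may tolerate the Bool while a real device rejects it. Note also that kCMSampleAttachmentKey_DisplayImmediately is a CMSampleBuffer attachment key, not a pixel buffer attribute, so it does not belong in this dictionary. A minimal sketch under those assumptions:

        ```swift
        import CoreVideo

        // Pixel buffer attributes. The IOSurface key takes a (possibly empty)
        // dictionary of IOSurface properties; passing `true` here is assumed
        // to be what triggers kCVReturnAllocationFailed (-6662) on device.
        let pixelBufferAttrs = [
            kCVPixelBufferWidthKey as String : 128,
            kCVPixelBufferHeightKey as String : 128,
            kCVPixelBufferPixelFormatTypeKey as String : kCVPixelFormatType_420YpCbCr8BiPlanarFullRange,
            // Empty dictionary, not `true`:
            kCVPixelBufferIOSurfacePropertiesKey as String : [:] as CFDictionary,
        ] as CFDictionary

        // Pool attributes may simply be nil when no pool-level options are needed.
        var pool: CVPixelBufferPool?
        let poolStatus = CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, pixelBufferAttrs, &pool)

        // Draw a buffer from the pool into which YUV data can be written.
        var pixelBuffer: CVPixelBuffer?
        let bufferStatus = (pool == nil)
            ? kCVReturnAllocationFailed
            : CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool!, &pixelBuffer)
        ```

        Buffers created this way are IOSurface-backed, which is what AVSampleBufferDisplayLayer needs for on-screen rendering.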

        • Re: CVPixelBufferPoolCreatePixelBuffer returns -6662 on target device
          _mc Level 1 Level 1 (0 points)

          Although this post is a bit old, we may be experiencing a related issue in CoreML model execution.

           

          When we execute

           

               VNImageRequestHandler.perform

           

          it throws "Could not create buffer with format BGRA" with error code -6662, which Apple describes as kCVReturnAllocationFailed ("Memory allocation for a buffer or buffer pool failed"). Google search, our best friend for solving problems, is not helping; this post was the only relevant result.

           

          The same model is executed in two different projects, and we have verified the following.

           

          1 - It happens in Project 2 and never in Project 1

           

          2 - It happens only on devices where the iCloud option to optimize storage is enabled

           

           

          Gareth-Trinity Digital did you manage to understand this and solve it?