How do I render a dynamically changing IOSurface to a CALayer or NSView?

I have a background process that updates an IOSurface-backed CVPixelBuffer at 30fps. I want to render a preview of that pixel buffer in my window, scaled to the size of the NSView that's displaying it. I get a callback every time the pixel buffer/IOSurface is updated.

I've tried using a custom layer-backed NSView and setting the layer contents to the IOSurface -- which works when the view is created, but the layer is never updated unless the window is resized or another window passes in front of it.

I've tried calling setNeedsDisplay() on both my view and my layer, I've tried changing the layerContentsRedrawPolicy to .onSetNeedsDisplay, and I've made sure all my content and update code runs on the main thread, but I can't get the view to update dynamically.

Is there a way to bind my layer or view to the IOSurface once and then just have it reflect the updates as they happen, or, if not, at least mark the layer as dirty each frame when it changes?

I've pored over the docs, but I don't see much about the relationship between IOSurface and CALayer.contents, or about when in the lifecycle to mark things dirty (especially when updates are happening outside the view).

Here's example code:
class VideoPreviewThumbnail: NSView, VideoFeedConsumer {
  let testCard = TestCardHelper()
   
  override var wantsUpdateLayer: Bool {
    return true
  }
  required init?(coder decoder: NSCoder) {
    super.init(coder: decoder)
    self.wantsLayer = true
    self.layerContentsRedrawPolicy = .onSetNeedsDisplay
    
    /* Scale the incoming data to the size of the view */
    self.layer?.transform = CATransform3DMakeScale(
      (self.layer?.contentsScale)! * self.frame.width / CGFloat(VideoSettings.width),
      (self.layer?.contentsScale)! * self.frame.height / CGFloat(VideoSettings.height),
      CGFloat(1))

    /* Register us with the content provider */
    VideoFeedBrowser.instance.registerConsumer(self)
  }
   
  deinit {
    VideoFeedBrowser.instance.deregisterConsumer(self)
  }
   
  override func updateLayer() {
    /* Ideally we wouldn't need to do this */
    updateLayer(pixelBuffer: VideoFeedBrowser.instance.renderer.pixelBuffer)
  }
  
/* This gets called every time our pixelbuffer is updated (30fps) */
  @objc
  func updateFrame(pixelBuffer: CVPixelBuffer) {
    updateLayer(pixelBuffer: pixelBuffer)
  }
   
  func updateLayer(pixelBuffer: CVPixelBuffer) {
    guard let surface = CVPixelBufferGetIOSurface(pixelBuffer)?.takeUnretainedValue() else {
      print("pixel buffer isn't IOSurface-backed! noooooo!")
      return
    }
    /* These have no effect: */
    // self.layer?.setNeedsDisplay()
    // self.setNeedsDisplay(self.visibleRect)
    self.layer?.contents = surface
  }
}


I just posted a similar question, but what I discovered was that I can just set layer.contents to the CVPixelBuffer (no need to call CVPixelBufferGetIOSurface). I don't seem to need to call setNeedsDisplay or anything like that either... it just renders. However, it only works on some of my iOS devices (an iPhone XS and an iPhone 11) while failing on my iPad Air 2.

I'll also note that I'm setting the layer contents in a CADisplayLink callback function, if that's of any relevance. Did you manage to figure this out?
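For what it's worth, here's a minimal sketch of the CADisplayLink approach described above, for iOS. The view, its latestPixelBuffer property, and whatever feed fills that property in are assumptions for illustration, not APIs from this thread:

import UIKit
import CoreVideo

final class PixelBufferPreviewView: UIView {
  /// Assumed to be filled in by whatever produces frames.
  var latestPixelBuffer: CVPixelBuffer?
  private var displayLink: CADisplayLink?

  override func didMoveToWindow() {
    super.didMoveToWindow()
    displayLink?.invalidate()
    guard window != nil else { return }
    let link = CADisplayLink(target: self, selector: #selector(step))
    link.add(to: .main, forMode: .common)
    displayLink = link
  }

  @objc private func step(_ link: CADisplayLink) {
    guard let buffer = latestPixelBuffer else { return }
    // Assign the pixel buffer directly; as reported above, on some
    // devices this displays without any setNeedsDisplay call.
    layer.contents = buffer
  }
}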
The setNeedsDisplay docs claim it clears the contents. If you have anything updating on top of the view (I have an overlay that fades in and out), the underlying layer will update during the animation. It would be amazing to figure out how to have an IOSurface-backed pixel buffer update the screen.

I have a use case where I want to use the pixel buffer elsewhere but also display it on screen at 30fps. I can blit it or display it a number of other ways (wrap it in a CIImage and UIImage), but this seems to be the whole point of IOSurface: easily displayed shared memory. Why doesn't this work out of the box, and why is there no documentation?
re: contents
"If the layer object is tied to a view object, you should avoid setting the contents of this property directly. The interplay between views and layers usually results in the view replacing the contents of this property during a subsequent update"

I've created a separate layer that is not the view's layer, and it does not address this.

I have found that setting the contents property of a CALayer is an effective drawing technique, but when it's used with an IOSurfaceRef or CVPixelBufferRef you need to double-buffer the incoming surfaces: if you set CALayer.contents twice in a row with the same CVPixelBufferRef, it does not display the updated contents.
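Here's a rough sketch of that double-buffering arrangement in Swift, under the assumption that each frame is rendered into whichever buffer is not currently on screen (DoubleBufferedPreview and the render closure are illustrative names, not anything from a framework):

import AppKit
import CoreVideo

/// Two IOSurface-backed pixel buffers; each frame is rendered into the
/// off-screen buffer, which is then assigned to the layer, so consecutive
/// assignments to layer.contents never reuse the same object.
final class DoubleBufferedPreview {
  private var buffers: [CVPixelBuffer] = []
  private var index = 0

  init?(width: Int, height: Int) {
    let attrs: [CFString: Any] = [
      kCVPixelBufferIOSurfacePropertiesKey: [:] as [CFString: Any]
    ]
    for _ in 0..<2 {
      var buffer: CVPixelBuffer?
      guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                kCVPixelFormatType_32BGRA,
                                attrs as CFDictionary,
                                &buffer) == kCVReturnSuccess,
            let created = buffer else { return nil }
      buffers.append(created)
    }
  }

  /// `render` stands in for whatever fills a frame's pixels.
  func presentNextFrame(on layer: CALayer, render: (CVPixelBuffer) -> Void) {
    index = (index + 1) % buffers.count
    let back = buffers[index]
    render(back)
    layer.contents = back
  }
}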

On earlier versions of macOS the color matrix attachments of the CVPixelBuffer seem to be ignored, so colors may be off, but on recent versions the color matrix attachments are applied when rendering CALayer.contents, which makes this a very powerful technique and frankly eliminates huge amounts of drawing code in Metal or OpenGL.
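If you want to rely on that, something along these lines should tag a buffer's color information before it's handed to the layer. This is a sketch using BT.709 values as an example; whether the attachments are honored depends on the OS version, as noted above:

import CoreVideo

/// Attach color information to a pixel buffer so Core Animation can
/// color-match the surface when it's assigned to layer.contents.
func tagAsBT709(_ buffer: CVPixelBuffer) {
  CVBufferSetAttachment(buffer,
                        kCVImageBufferColorPrimariesKey,
                        kCVImageBufferColorPrimaries_ITU_R_709_2,
                        .shouldPropagate)
  CVBufferSetAttachment(buffer,
                        kCVImageBufferTransferFunctionKey,
                        kCVImageBufferTransferFunction_ITU_R_709_2,
                        .shouldPropagate)
  CVBufferSetAttachment(buffer,
                        kCVImageBufferYCbCrMatrixKey,
                        kCVImageBufferYCbCrMatrix_ITU_R_709_2,
                        .shouldPropagate)
}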

Did you ever get this to work? I'm in the same boat. I have a semi-transparent window over the whole screen which I'm trying to draw at 30fps, but I can't seem to force the CALayer to update when I blit to the CVPixelBuffer. I see the first couple of frames, but then it stops updating.

Observations:

  • Strangely, the layer will update if I interact with another part of my app, like while dragging a slider in the preferences window.
  • I can also force the layer to update by setting layer.contents = nil; layer.contents = buffer (see the sketch after this list), or I can use two buffers and continually set layer.contents to every other buffer, but this doesn't display smoothly.
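As promised above, a sketch of the clear-then-reassign workaround. The CATransaction wrapping is my own addition to suppress implicit animations; this only mirrors the behavior reported here, not anything documented:

import QuartzCore
import IOSurface

/// Clearing contents first makes the second assignment register as a
/// change even though the surface is the same object as before.
func forceContentsUpdate(of layer: CALayer, to surface: IOSurfaceRef) {
  CATransaction.begin()
  CATransaction.setDisableActions(true)  // avoid implicit fade animations
  layer.contents = nil
  layer.contents = surface
  CATransaction.commit()
}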

I just want something more efficient than setting an NSImageView's image 30 times a second. Using [NSColor colorWithPatternImage:] works, but for the life of me I can't understand why this CVPixelBuffer isn't working; it's pretty simple stuff.

Here's how I'm defining my CVPixelBuffer:

NSDictionary *d = [NSDictionary dictionaryWithObjectsAndKeys:
                           @{}, kCVPixelBufferIOSurfacePropertiesKey,
                           @YES, kCVPixelBufferCGBitmapContextCompatibilityKey,
                           @YES, kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey, nil];
CVPixelBufferRef _buffer;
CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)d, &_buffer);
_overlay.contentView.layer.contents = (__bridge id _Nullable)(_buffer);

I've also tried _overlay.contentView.layer.contents = (__bridge id _Nullable)CVPixelBufferGetIOSurface(_buffer), but I get the same behavior.

Here's the code that updates the CVPixelBuffer, called at 30fps by an NSTimer on the main thread:

CVPixelBufferLockBaseAddress(_buffer, 0);
void *rasterData = CVPixelBufferGetBaseAddress(_buffer);
memcpy(rasterData, bitmapData, bytes);
CVPixelBufferUnlockBaseAddress(_buffer, 0);
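Probably unrelated to the update problem, but worth checking: that memcpy assumes the source stride matches CVPixelBufferGetBytesPerRow(), and Core Video may pad rows. A stride-aware copy looks something like this Swift sketch, where bitmapData and srcBytesPerRow are placeholders for your source bitmap:

import Foundation
import CoreVideo

/// Copy row by row, respecting the pixel buffer's stride; a single
/// memcpy of width * height * 4 bytes writes into the padding and
/// shifts every row after the first if the strides differ.
func copyFrame(into buffer: CVPixelBuffer,
               from bitmapData: UnsafeRawPointer,
               srcBytesPerRow: Int) {
  CVPixelBufferLockBaseAddress(buffer, [])
  defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
  guard let dst = CVPixelBufferGetBaseAddress(buffer) else { return }
  let dstBytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
  let rowBytes = min(srcBytesPerRow, dstBytesPerRow)
  for row in 0..<CVPixelBufferGetHeight(buffer) {
    memcpy(dst + row * dstBytesPerRow,
           bitmapData + row * srcBytesPerRow,
           rowBytes)
  }
}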

None of these worked:

[_overlay.contentView setLayerContentsRedrawPolicy:NSViewLayerContentsRedrawOnSetNeedsDisplay];
[_overlay.contentView.layer setNeedsDisplay];
[_overlay.contentView.layer display];
[_overlay.contentView display];
[_overlay.contentView setNeedsDisplay:YES];