I'm exploring the CARemoteLayerClient/CARemoteLayerServer API to render a layer from another process, as described in the docs, but I can't get even a minimal example to work. Here's what I'd expect to work:
// Run with `swift file.swift`
import AppKit

let app = NSApplication.shared

class AppDelegate: NSObject, NSApplicationDelegate {
    let window = NSWindow(
        contentRect: NSMakeRect(200, 200, 400, 200),
        styleMask: [.titled, .closable, .miniaturizable, .resizable],
        backing: .buffered,
        defer: false,
        screen: nil
    )

    func applicationDidFinishLaunching(_ notification: Notification) {
        window.makeKeyAndOrderFront(nil)

        let view = NSView()
        view.frame = NSRect(x: 0, y: 0, width: 150, height: 150)
        view.layerUsesCoreImageFilters = true
        view.wantsLayer = true

        // "Client" side: register with the in-process server and set a layer.
        let server = CARemoteLayerServer.shared()
        let client = CARemoteLayerClient(serverPort: server.serverPort)
        print(client.clientId)
        client.layer = CALayer()
        client.layer?.backgroundColor = NSColor.red.cgColor // Expect red rectangle
        client.layer?.bounds = CGRect(x: 0, y: 0, width: 100, height: 100)

        // "Server" side: host the client's layer via its client ID.
        let serverLayer = CALayer(remoteClientId: client.clientId)
        serverLayer.bounds = CGRect(x: 0, y: 0, width: 100, height: 100)
        view.layer?.addSublayer(serverLayer)
        view.layer?.backgroundColor = NSColor.blue.cgColor // Background blue to confirm parent layer exists

        window.contentView?.addSubview(view)
    }
}

let delegate = AppDelegate()
app.delegate = delegate
app.run()
In this example I'd expect a red rectangle to appear where the remote layer is hosted. If I inspect the server's layer hierarchy I can see a CALayerHost being created with the correct client ID, but it never displays the contents set on the client side.
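A quick way to confirm that from code is to walk the layer tree with a small recursive print helper along these lines (dumpLayerTree is just an illustrative throwaway debug function, not part of the API):

func dumpLayerTree(_ layer: CALayer, indent: String = "") {
    // Print each layer's dynamic class (the hosted remote layer shows up as CALayerHost) and its geometry.
    print("\(indent)\(type(of: layer)) bounds=\(layer.bounds) position=\(layer.position)")
    for sublayer in layer.sublayers ?? [] {
        dumpLayerTree(sublayer, indent: indent + "  ")
    }
}
// e.g. dumpLayerTree(view.layer!) at the end of applicationDidFinishLaunching

The CALayerHost is there with the expected client ID; it just renders nothing.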
After digging through this thread: https://bugs.chromium.org/p/chromium/issues/detail?id=312462 and a few demo projects, I've found that the workarounds that previously made this API work no longer seem to do so on my machine (M1 Pro, Ventura). Am I missing something glaringly obvious in my simple implementation, or is this a bug?