Creating an AVCapture device on macOS?

I'd like to make a macOS app that composites one or more video camera streams into an output stream that another app can then use as a capture device for an AVCaptureSession. Is this possible? Looking through the AVFoundation and Core Media docs, I don't see anything obvious. I realize I may need to write a driver; that's fine, but I don't really know where to start.
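For context, here is a sketch of the consuming side I have in mind: how another app would discover the virtual device and feed it into an AVCaptureSession. These are standard AVFoundation calls; the device name `"My Virtual Camera"` is a placeholder, and the virtual device itself is the part I don't know how to build.

```swift
import AVFoundation

// Virtual/DAL-plugin devices typically surface as "external" devices
// in a discovery session alongside built-in cameras.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.externalUnknown],
    mediaType: .video,
    position: .unspecified)

let session = AVCaptureSession()

// "My Virtual Camera" is a hypothetical name for the device I want to create.
if let device = discovery.devices.first(where: { $0.localizedName == "My Virtual Camera" }),
   let input = try? AVCaptureDeviceInput(device: device),
   session.canAddInput(input) {
    session.addInput(input)
    session.startRunning()
}
```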
