Still looking for camera support.
I've seen the new multi-camera capture APIs (AVCaptureMultiCamSession) in iOS 13, but I'm still confused about how to get calibration data from the cameras, and about what the limitations are when accessing them (e.g. frame rate, latency, resolution, color, etc.). That information doesn't seem to be documented anywhere.
Are you interested in streaming video from two cameras simultaneously, or in capturing stills? Capturing multiple photos from the DualCamera has been supported for several releases now and does not require an AVCaptureMultiCamSession. See dualCameraDualPhotoDelivery in AVCapturePhotoOutput (the inline documentation is excellent).
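If stills are what you're after, the shape of it looks roughly like this. This is a sketch, not production code: it assumes a device with a built-in DualCamera, and error handling and session lifecycle are omitted. It also requests calibration data delivery, since that was part of your question:

```swift
import AVFoundation

final class DualPhotoCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo

        // The virtual DualCamera device (wide + telephoto).
        guard let dualCamera = AVCaptureDevice.default(.builtInDualCamera,
                                                       for: .video,
                                                       position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: dualCamera)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

        // Dual photo delivery must be enabled on the output before
        // it can be requested per-capture.
        photoOutput.isDualCameraDualPhotoDeliveryEnabled =
            photoOutput.isDualCameraDualPhotoDeliverySupported

        session.commitConfiguration()
        session.startRunning()
    }

    func capture() {
        let settings = AVCapturePhotoSettings()
        // Ask for both the wide and telephoto images in one capture.
        settings.isDualCameraDualPhotoDeliveryEnabled =
            photoOutput.isDualCameraDualPhotoDeliveryEnabled
        // Calibration data rides along with the photos when requested.
        settings.isCameraCalibrationDataDeliveryEnabled =
            photoOutput.isCameraCalibrationDataDeliverySupported
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // Called once per delivered photo (so twice for a dual-photo capture).
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let calibration = photo.cameraCalibrationData {
            print("intrinsics:", calibration.intrinsicMatrix)
        }
    }
}
```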
If you want to stream from two cameras simultaneously, then yes, you need to use AVCaptureMultiCamSession, and the 2019 WWDC session gchiste cited is a good source.
Assuming you have configured your capture session correctly with two video data outputs, the captureOutput:didOutputSampleBuffer:fromConnection: delegate method will be called once per frame for each output.
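For the streaming case, here is a rough sketch of how the pieces fit together, using the explicit-connection pattern the WWDC sample app uses. The device choices (back and front wide-angle cameras) and the queue setup are illustrative assumptions, and real code should also watch the session's hardware cost:

```swift
import AVFoundation

final class MultiCamStreamer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureMultiCamSession()
    private let backOutput = AVCaptureVideoDataOutput()
    private let frontOutput = AVCaptureVideoDataOutput()
    private let dataQueue = DispatchQueue(label: "multicam.video.data")

    func configure() throws {
        guard AVCaptureMultiCamSession.isMultiCamSupported else { return }
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        // Wire each camera to its own output with an explicit connection.
        try addCamera(position: .back, output: backOutput)
        try addCamera(position: .front, output: frontOutput)
    }

    private func addCamera(position: AVCaptureDevice.Position,
                           output: AVCaptureVideoDataOutput) throws {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: position) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        guard session.canAddInput(input) else { return }
        session.addInputWithNoConnections(input)
        guard session.canAddOutput(output) else { return }
        session.addOutputWithNoConnections(output)

        guard let videoPort = input.ports(for: .video,
                                          sourceDeviceType: device.deviceType,
                                          sourceDevicePosition: position).first else { return }
        let connection = AVCaptureConnection(inputPorts: [videoPort], output: output)
        guard session.canAddConnection(connection) else { return }
        session.addConnection(connection)
        output.setSampleBufferDelegate(self, queue: dataQueue)
    }

    // Called once per frame per output; compare `output` to tell
    // the cameras apart.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        if output === backOutput {
            // Handle a back-camera frame.
        } else if output === frontOutput {
            // Handle a front-camera frame.
        }
    }
}
```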
You should check out this WWDC video: https://developer.apple.com/videos/play/wwdc2019/249/?time=691
The video also references and links to a sample project which demonstrates how to configure an AVCaptureMultiCamSession.