Controlling iPhones from Mac Studio

The need is to control the cameras and LiDAR sensors on up to four iPhone 14 Pros from the Mac Studio they are connected to. The captured image and LiDAR data are to be uploaded from the iPhones to the Mac Studio for processing. What I need help with is:

  1. How can the iPhones connected to the Mac Studio be enumerated on the Mac Studio, to know how many there are and to obtain their communication handles? (A sketch of what I was imagining follows this list.)

  2. How can the camera application that takes the photos, and the one that takes the LiDAR data, be communicated with over the Lightning cables, both to send commands from the Mac Studio and to transfer the image and LiDAR data files? (See the command-framing sketch after this list.)

  3. The iPhones will have to route commands sent over the Lightning cables from the Mac Studio to the appropriate application for camera and LiDAR. What is the best way to do this routing? (The routing example after this list shows the kind of thing I mean.)
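
For question 1, this is roughly what I was imagining on the Mac side, assuming the iPhone apps publish a Bonjour service with NWListener, and assuming that peer-to-peer Bonjour discovery actually works across the Lightning/USB link (which is part of what I am asking). The service type `_camctl._tcp` is a placeholder name I made up:

```swift
import Network

// macOS side: browse for a Bonjour service the iPhone apps would publish.
// "_camctl._tcp" is a placeholder service type; whether discovery
// traverses the USB/Lightning interface is part of my question.
let parameters = NWParameters.tcp
parameters.includePeerToPeer = true

let browser = NWBrowser(for: .bonjour(type: "_camctl._tcp", domain: nil),
                        using: parameters)

browser.browseResultsChangedHandler = { results, _ in
    // One result per discovered iPhone; each endpoint would be the
    // "communication handle" I keep for opening connections later.
    print("found \(results.count) phone(s)")
    for result in results {
        print("endpoint:", result.endpoint)
    }
}

browser.start(queue: .main)
```

On each iPhone, the matching side would be an NWListener whose `service` is set to the same placeholder type, so every phone announces itself when its app launches.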
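For questions 2 and 3, this is the kind of command protocol I had in mind once a connection to a phone exists: a 4-byte length prefix plus a JSON-encoded command, with a single switch on the phone doing the routing. The `Command` cases and the framing are placeholders I made up for this project:

```swift
import Foundation
import Network

// Commands the Mac sends to a phone; these cases are placeholders
// and would grow as the project needs.
enum Command: Codable {
    case capturePhoto(exposureSeconds: Double, iso: Float, useFlash: Bool)
    case captureLidar
}

// Mac side: frame a command as a 4-byte big-endian length followed by
// JSON, so the receiver can find message boundaries in the TCP stream.
func send(_ command: Command, over connection: NWConnection) throws {
    let payload = try JSONEncoder().encode(command)
    var length = UInt32(payload.count).bigEndian
    var frame = Data(bytes: &length, count: 4)
    frame.append(payload)
    connection.send(content: frame, completion: .contentProcessed { error in
        if let error { print("send failed:", error) }
    })
}

// iPhone side: read one frame, decode it, hand it to a router closure,
// then wait for the next frame.
func receiveCommands(on connection: NWConnection,
                     router: @escaping (Command) -> Void) {
    connection.receive(minimumIncompleteLength: 4, maximumLength: 4) { header, _, _, _ in
        guard let header, header.count == 4 else { return }
        // Recombine the big-endian length prefix byte by byte.
        let length = header.reduce(0) { ($0 << 8) | Int($1) }
        connection.receive(minimumIncompleteLength: length, maximumLength: length) { body, _, _, _ in
            guard let body,
                  let command = try? JSONDecoder().decode(Command.self, from: body) else { return }
            router(command)
            receiveCommands(on: connection, router: router)
        }
    }
}
```

The routing itself would then just be a switch in one app that hosts both the AVCam-derived code and the LiDAR capture code:

```swift
receiveCommands(on: connection) { command in
    switch command {
    case .capturePhoto(let seconds, let iso, let flash):
        cameraController.capture(seconds: seconds, iso: iso, flash: flash) // AVCam-based
    case .captureLidar:
        lidarController.capture() // LiDAR-sample-based
    }
}
```

where `cameraController` and `lidarController` are hypothetical objects wrapping the two sample projects below.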

To save time, I will base the iPhone software on these sample projects, which I will modify to meet the project's needs:

https://developer.apple.com/documentation/avfoundation/capture_setup/avcam_building_a_camera_app

https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_depth_using_the_lidar_camera

Parameters I need to control directly from the Mac Studio are exposure times, f-stops, and which of the iPhones will flash. Each connected iPhone's camera must be triggered as simultaneously as possible.
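
As far as I can tell from the AVFoundation documentation, exposure duration and ISO can be set per device and flash is chosen per capture, but the f-stop on iPhone cameras is fixed hardware (`lensAperture` is read-only), so I may have to settle for reading it back rather than setting it. A minimal sketch of what each phone would run when a command arrives (the function names are my own):

```swift
import AVFoundation

// Apply an exposure duration and ISO received from the Mac.
// Note: iPhone apertures are fixed; `device.lensAperture` can be read
// and reported back, but not set.
func applyExposure(seconds: Double, iso: Float, to device: AVCaptureDevice) throws {
    guard device.isExposureModeSupported(.custom) else { return }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    // Clamp both values to the active format's range; out-of-range
    // values raise an exception at runtime.
    let requested = CMTime(seconds: seconds, preferredTimescale: 1_000_000)
    let duration = CMTimeClampToRange(requested,
                                      range: CMTimeRange(start: device.activeFormat.minExposureDuration,
                                                         end: device.activeFormat.maxExposureDuration))
    let clampedISO = min(max(iso, device.activeFormat.minISO),
                         device.activeFormat.maxISO)
    device.setExposureModeCustom(duration: duration,
                                 iso: clampedISO,
                                 completionHandler: nil)
}

// Per-phone flash selection, applied when the capture is triggered.
func makePhotoSettings(useFlash: Bool) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.flashMode = useFlash ? .on : .off
    return settings
}
```

For the simultaneous trigger, I was planning simply to broadcast the capture command to every phone in a loop, but I do not know whether that is tight enough.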

I am very new to Swift and macOS. I have read the Swift manual and done the iOS App Dev Tutorials.

Does the Multipeer Connectivity framework support peer-to-peer connectivity over a Lightning cable between a Mac Studio and an iPhone 14?
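
If it does, this is roughly how I would expect to browse for the phones and invite them from the Mac ("camctl" is again a placeholder service type):

```swift
import MultipeerConnectivity

// Invites every discovered phone into one session. "camctl" is a
// placeholder service type (MPC limits these to 15 lowercase letters,
// digits, and hyphens).
final class PhoneBrowser: NSObject, MCNearbyServiceBrowserDelegate {
    let peerID = MCPeerID(displayName: Host.current().localizedName ?? "Mac Studio")
    lazy var session = MCSession(peer: peerID,
                                 securityIdentity: nil,
                                 encryptionPreference: .required)
    lazy var browser = MCNearbyServiceBrowser(peer: peerID, serviceType: "camctl")

    func start() {
        browser.delegate = self
        browser.startBrowsingForPeers()
    }

    func browser(_ browser: MCNearbyServiceBrowser,
                 foundPeer peerID: MCPeerID,
                 withDiscoveryInfo info: [String: String]?) {
        browser.invitePeer(peerID, to: session, withContext: nil, timeout: 30)
    }

    func browser(_ browser: MCNearbyServiceBrowser, lostPeer peerID: MCPeerID) {}
}
```

A session delegate (not shown) would then receive connection state changes and data from each phone.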
