Using ARKit with Swift to communicate between devices

I have been studying ARKit for both the back camera and also Face AR with the front camera on iPhone X. I kind of wish there were one device that could do both. Obviously the iPhone X has both the dual camera on the back and the TrueDepth camera on the front. I am wondering if Apple will release a 2018 iPad with some kind of TrueDepth back camera...


Anyway, from what I have learned, a single device cannot run both the front-camera and the back-camera tracking at the same time, so it would need more than one device.


I am wondering if we can connect the two together: say I am using an iPhone X looking at my face, but at the same time I want to stream that face data to an iPad, which uses it to drive AR facial animation on a 3D character head projected in AR.
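To be concrete, this is roughly the data I imagine sending every frame: the blend-shape coefficients that ARKit's `ARFaceAnchor` exposes on the iPhone side, flattened into JSON so the iPad can decode them. This is an untested sketch; `facePayload(from:)` is just a helper name I made up.

```swift
import ARKit
import Foundation

// Hypothetical helper: flatten an ARFaceAnchor's blend shapes into a
// small JSON payload that can be streamed to another device each frame.
func facePayload(from anchor: ARFaceAnchor) throws -> Data {
    // blendShapes is [ARFaceAnchor.BlendShapeLocation: NSNumber];
    // each key's rawValue is a plain String, so it encodes cleanly.
    var coefficients: [String: Float] = [:]
    for (location, value) in anchor.blendShapes {
        coefficients[location.rawValue] = value.floatValue
    }
    return try JSONEncoder().encode(coefficients)
}
```

On the iPad side the reverse (`JSONDecoder` into `[String: Float]`) would feed the morph targets of the character head.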


Just wondering what kind of toolkit Apple provides for this sort of multi-device communication. Normally, if I wanted to make devices talk to each other, I would use OSC over the same Wi-Fi network. But I wonder what Swift can do in this case.
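For what it's worth, I've seen MultipeerConnectivity mentioned as Apple's framework for ad-hoc sessions between nearby iOS devices (over Wi-Fi or Bluetooth). Here is a minimal, untested sketch of what I imagine the sending side would look like; the `"face-tracking"` service type is a placeholder I made up, and a real app would also need an advertiser/browser delegate to accept the connection.

```swift
import MultipeerConnectivity
import UIKit

// Minimal sketch, assuming MultipeerConnectivity: advertise a session
// and push per-frame Data blobs to any connected peers.
final class FaceStreamer: NSObject, MCSessionDelegate {
    private let peerID = MCPeerID(displayName: UIDevice.current.name)
    private lazy var session: MCSession = {
        let s = MCSession(peer: peerID, securityIdentity: nil,
                          encryptionPreference: .required)
        s.delegate = self
        return s
    }()
    // "face-tracking" is a placeholder service type (max 15 chars,
    // lowercase letters, digits, and hyphens).
    private lazy var advertiser = MCNearbyServiceAdvertiser(
        peer: peerID, discoveryInfo: nil, serviceType: "face-tracking")

    func start() {
        advertiser.startAdvertisingPeer()
    }

    // Send one frame of face data to every connected peer.
    // .unreliable suits per-frame data where an occasional drop is fine.
    func send(_ data: Data) {
        guard !session.connectedPeers.isEmpty else { return }
        try? session.send(data, toPeers: session.connectedPeers,
                          with: .unreliable)
    }

    // MARK: - MCSessionDelegate (the iPad side would decode here)
    func session(_ session: MCSession, didReceive data: Data,
                 fromPeer peerID: MCPeerID) { /* decode and animate */ }
    func session(_ session: MCSession, peer peerID: MCPeerID,
                 didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream,
                 withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession,
                 didStartReceivingResourceWithName resourceName: String,
                 fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession,
                 didFinishReceivingResourceWithName resourceName: String,
                 fromPeer peerID: MCPeerID, at localURL: URL?,
                 withError error: Error?) {}
}
```

Is something like this the intended approach, or is there a better-suited API for streaming per-frame data?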


I know of a few apps that connect an iPhone and an iPad (or other devices) via Bluetooth or something similar, basically to take photos at the same time. Is this something that can be done easily?