iPad Pro 2020, Can I use rear ARKit tracking with front camera video?

The ARKit API supports simultaneous world and face tracking via the back and front cameras, but unfortunately, due to hardware limitations, the new 2020 iPad Pro cannot use this feature (probably because the LiDAR scanner draws a lot more power). This is a bit of a step back.
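
For context, on devices that do support it, combined tracking is just an opt-in flag on the world-tracking configuration. A minimal sketch (the function name is mine, the properties are the real ARKit API):

swift:

import ARKit

func runCombinedTracking(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        // Adds ARFaceAnchor updates from the front TrueDepth camera
        // while the rear camera drives world tracking.
        configuration.userFaceTrackingEnabled = true
    }
    session.run(configuration)
}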


Here is the updated check from Apple's example project:


swift:

guard ARWorldTrackingConfiguration.supportsUserFaceTracking else {
    fatalError("""
        This sample code requires iOS 13 / iPadOS 13, and an iOS device with \
        a front TrueDepth camera. Note: 2020 iPads do not support \
        user face-tracking while world tracking.
        """)
}



There is also a forum conversation indicating that this is a hardware limitation rather than an intentional software restriction.


It looks like mobile hardware is not "there yet" for doing both. However, has anyone confirmed whether this limitation extends to simply receiving multiple video feeds, as opposed to running tracking on both cameras?


What I want to achieve: run a continuous world-tracking session in ARKit and render the rear camera feed. At the same time, capture front camera data using the regular video APIs without doing any tracking, and just process the front camera frames with Core ML or Vision for other purposes.
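
To make that concrete, here is a minimal sketch of the setup I have in mind. Whether the front AVCaptureSession can actually start while ARKit holds the rear camera is exactly what I am unsure about; the class and queue names are just placeholders, and the Vision request is a stand-in for whatever processing I end up doing:

swift:

import ARKit
import AVFoundation
import Vision

final class DualFeedController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let arSession = ARSession()           // rear camera, world tracking (rendering via ARSCNView/ARView omitted)
    let frontSession = AVCaptureSession() // front camera, plain video capture
    private let videoQueue = DispatchQueue(label: "front-camera-frames")

    func start() {
        // Rear camera: continuous world tracking through ARKit.
        arSession.run(ARWorldTrackingConfiguration())

        // Front camera: regular capture pipeline, no tracking at all.
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .front),
              let input = try? AVCaptureDeviceInput(device: device),
              frontSession.canAddInput(input) else { return }
        frontSession.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: videoQueue)
        guard frontSession.canAddOutput(output) else { return }
        frontSession.addOutput(output)

        frontSession.startRunning() // Does this succeed while ARKit owns the rear camera?
    }

    // Hand each front-camera frame to Vision / Core ML.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let request = VNDetectFaceRectanglesRequest() // stand-in for any Vision / Core ML request
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:]).perform([request])
    }
}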


The comment says "Note: 2020 iPads do not support user face-tracking while world tracking." This almost suggests the issue is related exclusively to *tracking*. Simultaneous front/back camera capture was only made available in 2019 (iOS 13), I believe, with a new API (AVCaptureMultiCamSession) for running a session with both cameras. Since ARKit implicitly initializes one of the cameras, does this make it impossible to do what I want?
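
To be precise about which capability is which, these are the separate runtime checks I mean (all real API on iOS 13+):

swift:

import ARKit
import AVFoundation

// Rear-camera world tracking (what ARKit will claim for itself).
let worldTrackingOK = ARWorldTrackingConfiguration.isSupported

// The iOS 13 multi-camera capture API for running front and back feeds together.
let multiCamOK = AVCaptureMultiCamSession.isMultiCamSupported

// Combined face + world tracking inside a single ARKit session --
// the part the 2020 iPad Pro reportedly lacks.
let combinedTrackingOK = ARWorldTrackingConfiguration.supportsUserFaceTracking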


In short, can I use ARKit to do rear-camera world tracking and simultaneously receive and process front-camera data? If so, how? May I have a code example? I hope there is a solution to this.

As far as I know, this is doable. If I had to guess, the limitation is a power-related one when using the TrueDepth sensor (on the front camera) and the LiDAR scanner (on the back camera) at the same time.


Here are code samples for multicam:

https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avmulticampip_capturing_from_multiple_cameras
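
If it helps, here is a stripped-down sketch of how that sample wires up the front and back cameras with AVCaptureMultiCamSession (explicit inputs, outputs, and connections; error handling, output delegates, and preview layers omitted). This is AVFoundation only, with no ARKit in the mix:

swift:

import AVFoundation

func makeMultiCamSession() -> AVCaptureMultiCamSession? {
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }
    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    for position in [AVCaptureDevice.Position.back, .front] {
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: position),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return nil }
        // Multicam sessions are wired up with explicit connections.
        session.addInputWithNoConnections(input)

        let output = AVCaptureVideoDataOutput()
        guard session.canAddOutput(output) else { return nil }
        session.addOutputWithNoConnections(output)

        let ports = input.ports(for: .video,
                                sourceDeviceType: .builtInWideAngleCamera,
                                sourceDevicePosition: position)
        let connection = AVCaptureConnection(inputPorts: ports, output: output)
        guard session.canAddConnection(connection) else { return nil }
        session.addConnection(connection)
    }
    return session
}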

Did you manage to run ARKit with the rear camera and simultaneously receive front camera data?