Is front face tracking really NOT supported in a world tracking AR session (ARWorldTrackingConfiguration) on iPad Pro 2020?

I'm developing an app that requires use of the front-facing (TrueDepth) face tracking capabilities in an ARKit world tracking session configuration. This functionality (using the front and back cameras simultaneously in ARKit) has worked wonderfully since it was introduced with ARKit 3 last summer...on my iPhones with TrueDepth cameras. I received my new iPad Pro 2020 today, fired up my app, and was shocked to discover that

ARWorldTrackingConfiguration.supportsUserFaceTracking
returns false. This almost cannot be the case, as it dramatically defeats the purpose of the device, which is to enable next-generation AR experiences. I've verified this with Apple's own demo app "Combining User Face-Tracking and World Tracking", which doesn't run on the iPad Pro. I have also verified that face tracking does work in ARFaceTrackingConfiguration.


Can anyone explain this or offer a workaround or solution? It seems impossible that this crucial functionality isn't supported on this particular device.
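For reference, here's roughly the check and session setup I'm doing (a minimal sketch; the function name and the ARSCNView parameter are just placeholders from my app, and error handling is omitted):

import ARKit

func startCombinedTracking(on arView: ARSCNView) {
    // Simultaneous world tracking and user face tracking needs a TrueDepth
    // camera and an A12-or-later chip, both of which the 2020 iPad Pro has.
    guard ARWorldTrackingConfiguration.supportsUserFaceTracking else {
        // This branch is hit on the 2020 iPad Pro, even though
        // ARFaceTrackingConfiguration.isSupported returns true on it.
        print("World tracking with user face tracking is not supported on this device")
        return
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.userFaceTrackingEnabled = true
    arView.session.run(configuration)
}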

Hello,


I'm experiencing the same issue.

I don't understand why it does not work, since the 2020 iPad has both Face ID and an A12-class processor.

Maybe it's an issue with orientation?


Does anybody know if this works on the 2018 iPad Pro?

With the 2018 11" iPad Pro I can confirm you're able to run world tracking and face tracking at the same time, i.e. I successfully ran the "Combining User Face Tracking and World Tracking" demo on my 2018 iPad Pro.


My 2020 iPad Pro arrives later this week so I'm unable to test that – but based on the feedback from Gil and jseymour, this seems to be an issue with the 2020 iPad Pro.

For what it's worth, I'm seeing this too (iPad Pro 2020, 11", iOS 13.4). I believe this is an unfortunate bug in iOS 13.4 as the previous generations of iPad Pro supported this fine, and these new generation devices have the requisite hardware (A12, TrueDepth camera).


Currently updating to the iOS 13.4.5 beta to see if it's fixed, but it'd be a shame to have worse ARKit support on this new device than on the old one.


EDIT — Yeah, still an issue on iOS 13.4.5 😔

I have a bad-news update on this. I submitted the problem to Apple using Feedback Assistant and they have marked the case with "Resolution: Investigation complete - Works as currently designed". I don't get it.

What's your bug number (feedback ID), please?

Hi Bobjt, I believe it is FB7651009. Thank you so much for any help you can offer.

Ah, that's a real shame, but thanks for the heads up! The new iPads are very prominently marketed as having a TrueDepth camera and an A12Z processor (alongside dedicated AR hardware in the LiDAR sensor). I had assumed the internal capability check simply hadn't been updated to include the A12Z, so it's super strange to hear that it was intentionally excluded.


I've also filed this as a bug, as it goes against all previously documented API capabilities. I'll let you know if I'm able to get some more clarification 👍

I can confirm that user-face tracking plus world tracking is not supported on the 2020 iPads. It's not intentional, but a current hardware limitation.

This is really disappointing.

I don't understand how the 2018 iPad Pros have this feature and the 2020 iPad Pro does not.

What kind of hardware limitation? Isn't the A12Z supposed to be an upgraded version of the A12X?


Please tell me this can be fixed with a software update.

Thanks for the insight Bobjt, much appreciated!


I understand that you probably can't say too much more, but is the intention to deprecate this API in the near future (or at least to publicly document the lack of support on new devices)? Can we rely on the 2020 iPad Pros to support future ARKit APIs? It's a shame to hear that this is a hardware limitation, as it puts a bit of a dampener on an otherwise perfect AR development device.

I'll underscore that the lack of support on the 2020 iPads is unintentional; Apple is not deprecating the user face-tracking while world-tracking feature. It's a hardware limitation that results from the current LiDAR configuration, and being a hardware limitation, a software update will not fix it.

Said above but reiterating here. Without talking about the future, this limitation is unintentional and there is no deprecation of the user face-tracking while world-tracking feature.
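Given that, it sounds like the only practical option is to gate the feature at runtime rather than by device model, and degrade gracefully where the combination isn't available. Something along these lines (the function name is illustrative, and the face-only fallback obviously loses the world-tracking part):

import ARKit

// Prefer world tracking with user face tracking; otherwise fall back to
// whatever the device can actually do.
func makeBestConfiguration() -> ARConfiguration {
    if ARWorldTrackingConfiguration.supportsUserFaceTracking {
        // 2018 iPad Pros and recent TrueDepth iPhones take this path.
        let configuration = ARWorldTrackingConfiguration()
        configuration.userFaceTrackingEnabled = true
        return configuration
    } else if ARFaceTrackingConfiguration.isSupported {
        // 2020 iPad Pro: face tracking works on its own, just not
        // combined with world tracking.
        return ARFaceTrackingConfiguration()
    } else {
        // No TrueDepth camera at all: plain world tracking.
        return ARWorldTrackingConfiguration()
    }
}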

Thanks Bobjt but this is really an epic fail considering (in my opinion) the whole reason for the release of this device was so AR devs could run with it. This device's appeal for AR is stunted without simultaneous camera support. If they can't make a device that has both LiDAR and TrueDepth working together, then a lot of us with some really good ideas for next-gen apps will be...looking for other platforms??? Even a mode where the LiDAR is turned off to allow the preexisting functionality to work would have been appreciated. This has really set me back.

Has anyone tried with the new M1 iPad Pro 2021s?
