Short description
See https://youtu.be/fD6af6MaFRo for a direct demonstration of the issue.
Description
The way a device (iPhone or iPad) is held affects the ARKit result (camera tracking and scene reconstruction, since the two are linked). A device held in portrait orientation does not behave the same way as a device held in landscape orientation. One orientation gives a correct result while the other does not: the camera tracking or the reconstructed scene shows a visible error.
Which orientation gives a correct result?
It depends on the movement the device performs:
Tracking behaves correctly when you pan with the device held in portrait, or tilt with the device held in landscape.
The issue becomes visible when you pan with the device held in landscape, or tilt with the device held in portrait.
In practice the device is not limited to purely horizontal or vertical movements, so both portrait and landscape orientations can give an incorrect result, depending on how much of the problematic movement is performed relative to how the device is held.
Why some people might not have noticed it
This issue may fly under the radar for many AR applications, since they tend to:
Focus on one area in space
Turn around in a room with the device held in portrait
Involve a lot of movement, but without really needing a precise ARKit result
What could be the root cause of this
Since we have no access to ARKit's internals, we can only guess.
What we deduced is that the device must not move along its vertical axis, i.e. in the direction of its front camera or its Lightning/USB-C connector. What could differentiate the vertical axis of the device from its horizontal one? The only element we noticed is the orientation of the camera sensor, since it is rectangular.
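As a small illustration of what we mean by the sensor orientation (only an illustration, not proof of the root cause), the sketch below logs the camera image resolution from an ARSession delegate; the delegate class name is ours, not part of the original report. The resolution reported by ARKit is always in the sensor's native landscape orientation (e.g. 1920 × 1440), regardless of how the device is held.

```swift
import ARKit

// Hypothetical helper (not from the original report): logs the camera geometry
// on every frame to show that the captured image is always landscape-oriented.
final class SensorOrientationLogger: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Width is larger than height even when the app runs in portrait,
        // e.g. 1920.0 x 1440.0 on an iPhone 12 Pro.
        print("Captured image resolution:", frame.camera.imageResolution)
    }
}
```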
How to reproduce the issue
Since the issue is inside ARKit, every App Store application we tested exhibits it. So you have two options:
Create a test project
Create a minimal ARKit project (a code sketch covering these steps follows the list).
Use ARView with a session configured with ARWorldTrackingConfiguration (the issue may also exist in other modes; we did not test them).
Enable sceneReconstruction.
Enable the ARView debug option .showSceneUnderstanding.
Move the device around to aggregate environment data and generate a mesh.
Disable sceneReconstruction.
Hold the device in landscape orientation.
Perform some 360° horizontal pans in the previously scanned environment.
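Below is a minimal sketch of how these steps could be implemented with RealityKit; it is only an outline under our assumptions, not the exact project we used. The class and method names (ReproViewController, stopSceneReconstruction) are ours, and a LiDAR-equipped device is required since sceneReconstruction and .showSceneUnderstanding depend on the LiDAR scanner.

```swift
import UIKit
import RealityKit
import ARKit

final class ReproViewController: UIViewController {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)

        // World tracking with scene reconstruction enabled and the reconstructed
        // mesh drawn on screen (.showSceneUnderstanding).
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.sceneReconstruction = .mesh
        arView.debugOptions.insert(.showSceneUnderstanding)
        arView.session.run(configuration)
    }

    // Call this once the environment has been scanned: reconstruction is turned
    // off so the mesh stops updating, then hold the device in landscape and pan
    // 360° in the scanned area to observe the misalignment.
    func stopSceneReconstruction() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.sceneReconstruction = []
        // Re-run without reset options so tracking and existing anchors are kept.
        arView.session.run(configuration)
    }
}
```

With this kind of setup, the mesh overlay makes any offset between the reconstructed scene and the real environment directly visible on screen.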
Expected result:
When moving around the environment, the mesh displayed by the .showSceneUnderstanding option remains correctly aligned with the environment.
Current result:
When moving around the environment, the displayed mesh is positioned incorrectly and no longer aligns with the environment.
Download an AR app from the App Store:
Examples of AR applications:
- SiteScape
An example where you can see the result (SiteScape v1.3 on an iPhone 12 Pro running iOS 15.5).
- MetaScan
- CamTrackAr
The issue is a little harder to see with this one, since only an anchor can be used as a reference to observe the offset.
Important note
This issue is critical for the application we are developing; we cannot release it with this bug. We understand that fixing it can take time, so we are doing our best to be patient. However, the issue was reported on 16 Jun 2021 in Feedback Assistant (FB9184883), so we have now been waiting for almost a year.
We have absolutely no visibility into how this issue is being handled. We tried to get help through Developer Technical Support (DTS), we sent a message to Apple Developer Program Support, and we asked on Feedback Assistant. We learned nothing. What is the priority of this issue? Is anyone even assigned to it right now? Is one year not enough time to fix it?
So can we at least have some information, and hopefully have this bug finally fixed?