Which method should be used to achieve the most accurate world tracking experience with ARKit?

My company is developing an Augmented Reality app for a client using ARKit, and they want the best world tracking experience on iOS. We have told them that this is not an exact science and that small variations are perfectly normal, but they want us to do everything possible to minimize errors, drift, and movement of holograms.


We have tested some ARHitTestResult examples in Unity and others that use ARImageAnchor, but we can't decide which is best. Using an Image Anchor seems to improve tracking, but I am not sure whether that is really the case or just an optical illusion.


Please advise or share any Unity/Apple documentation on this matter.


Thanks

Replies

Hi there,


Is image tracking even an option for your client's application? The image needs to be present and visible in the scene to actually improve tracking.
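If it is an option, here is a minimal sketch of enabling image detection on top of world tracking on the native side; the resource group name "AR Resources" is a placeholder for whatever your asset catalog uses:

```swift
import ARKit

func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    // "AR Resources" is a placeholder asset catalog group name.
    if let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) {
        // Detected images surface as ARImageAnchor instances, which
        // act as stable reference points for nearby content.
        configuration.detectionImages = referenceImages
    }
    return configuration
}
```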


In any case, if your tracking needs to be really good, I would advise making the app as lightweight and optimized as possible so that it holds a 60 Hz frame rate, and interrupting the experience or warning the user when tracking becomes bad (see: https://developer.apple.com/documentation/arkit/artrackingstate?language=objc). In addition to the tracking state that ARKit provides, you could write a custom script that checks the number of feature points in view and warns the user if it falls below a certain threshold, as in the sketch below.
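Here is a minimal sketch of that idea on the native side, assuming the class is set as the ARSession's delegate; the threshold of 50 points is an arbitrary example value, not an Apple recommendation:

```swift
import ARKit

class TrackingQualityMonitor: NSObject, ARSessionDelegate {
    // Arbitrary example threshold; tune it for your scenes.
    let minimumFeaturePoints = 50

    // Called whenever ARKit's own tracking quality changes.
    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            break
        case .limited(let reason):
            print("Warn user: tracking limited (\(reason))")
        case .notAvailable:
            print("Warn user: tracking not available")
        }
    }

    // Called once per frame; check how many raw feature points are in view.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let count = frame.rawFeaturePoints?.points.count ?? 0
        if count < minimumFeaturePoints {
            print("Warn user: only \(count) feature points in view")
        }
    }
}
```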


Having said that, AR is still a relatively new technology on consumer devices, and I would try to explain to the client that "perfect" is simply not achievable today. Even with image tracking, what happens if the image is occluded? What happens if the user moves the device rapidly, or when the light in the scene changes? ARKit gets better with every update, but it is still not perfect.


Best,

Flo

Hello,


To get the best "world tracking experience" with ARKit, you want to be using ARWorldTrackingConfiguration. Additionally, to tell ARKit to optimize performance for specific areas of your session (i.e., the locations where you want your content to stay in place relative to the real world), you will want to add ARAnchors at these locations!
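A minimal sketch, assuming an existing ARSession (e.g. from an ARSCNView); the anchor transform here is a hypothetical placement, not a required value:

```swift
import ARKit

func startSession(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    configuration.planeDetection = [.horizontal]
    session.run(configuration)

    // Hypothetical placement: one meter in front of the initial
    // camera position. Attach your content to this anchor in the
    // renderer so ARKit keeps it stable relative to the real world.
    var transform = matrix_identity_float4x4
    transform.columns.3.z = -1.0
    session.add(anchor: ARAnchor(transform: transform))
}
```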


Beyond what I've already mentioned (and as Flo already mentioned), you want to make sure that your app is highly performant/lightweight to minimize the likelihood that you will drop frames. If you drop frames, there is a good chance that your ARSession will be interrupted, which generally causes objects to start drifting until the session is able to recover.
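You can detect those interruptions through the session delegate. A minimal sketch, assuming the class is set as the session's delegate:

```swift
import ARKit

class SessionInterruptionHandler: NSObject, ARSessionDelegate {
    func sessionWasInterrupted(_ session: ARSession) {
        // Hide or freeze virtual content here; it may drift
        // until the session recovers.
        print("Session interrupted, pausing the experience")
    }

    func sessionInterruptionEnded(_ session: ARSession) {
        print("Interruption ended, content may need repositioning")
    }

    func sessionShouldAttemptRelocalization(_ session: ARSession) -> Bool {
        // Ask ARKit to restore the previous world map after the
        // interruption instead of starting over (iOS 11.3+).
        return true
    }
}
```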


The last thing you can do is try to control the environment the app runs in, if possible. Make sure there is good, constant overhead lighting, make sure there are lots of detailed surfaces (not plain walls), and make sure that surfaces are not too reflective. Also, advise your users not to make extremely quick movements with the device.


As a failsafe, you should also have an always-visible button that allows the user to completely reset the AR experience should things go awry!
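A minimal sketch of that reset, e.g. called from the button's action (the session would come from your ARSCNView or ARView):

```swift
import ARKit

func resetExperience(_ session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // Discard accumulated tracking data and all existing anchors,
    // then start tracking from scratch.
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```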


If you haven't already, check out this WWDC video: https://developer.apple.com/videos/play/wwdc2018/610/

It explains a lot of the things I mentioned above.