Hi,
I'm trying to create an AR app based on an image anchor using RealityKit and Reality Composer.
I created the application as follows:
- Create a scene with an image anchor in Reality Composer
- Create the Swift source code (the same as the RealityKit template code) in Xcode
With this setup, the app behaves as follows:
- 3D objects are displayed while the camera is capturing the image
- 3D objects disappear when the camera moves away from the image
I know that ARKit can keep tracking and displaying 3D objects even after the camera moves away from the image.
How can I achieve this with RealityKit?
ARView.automaticallyConfigureSession is true by default, and I believe it runs an ARWorldTrackingConfiguration under the hood.
With ARWorldTrackingConfiguration, my understanding is that tracking continues and 3D objects stay displayed even after the camera leaves the image
(I remember it working that way with SceneKit).
I think ARView.automaticallyConfigureSession needs to be set to false and the ARSession configuration changed manually, but I don't understand what the best practice for doing this is.
Please let me know how to do it.
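Here is a sketch of what I have in mind, assuming manual session configuration is the right approach (the asset catalog group name "AR Resources" and the outlet name are just placeholders from my project; I'm not sure this is correct or complete):

```swift
import ARKit
import RealityKit
import UIKit

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Disable RealityKit's automatic session configuration so we
        // can run our own ARWorldTrackingConfiguration.
        arView.automaticallyConfigureSession = false

        // Load the reference images (assumed to be in an asset
        // catalog group named "AR Resources").
        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources",
            bundle: nil
        ) else {
            fatalError("Missing expected asset catalog resources.")
        }

        // World tracking with image detection: the idea is that world
        // tracking keeps anchors alive after the image leaves view,
        // while detectionImages enables the initial image detection.
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = referenceImages
        configuration.maximumNumberOfTrackedImages = 1

        arView.session.run(configuration)
    }
}
```

Is something like this the recommended way, and will the scene loaded from Reality Composer still attach to the detected image anchor in this case?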