I am still working on an app that places simple 3D models at different places outdoors. I save the location (World Data) of the nearby environment and load and reconstruct the scene later on. I'm using the latest Apple device (iPhone 12 Pro) with the LiDAR scanner. The strange thing is that often the experience can't be reconstructed. Is the stored (LiDAR) data so accurate that the scene has to be exactly the same? For example, could it be a problem if a flower leaf was broken, making it impossible to reconstruct?
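For context, this is roughly how I save and restore the map (a simplified sketch; the file URL handling and error reporting stand in for my real code):

```swift
import ARKit

// Sketch: grab the current ARWorldMap from the session and archive it.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("No world map: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: url, options: [.atomic])
        } catch {
            print("Saving failed: \(error)")
        }
    }
}

// ...and later: unarchive the map, feed it back in as the initial
// world map, and rerun the session so ARKit tries to relocalize.
func loadWorldMap(into session: ARSession, from url: URL) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else {
        throw ARError(.invalidWorldMap)
    }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```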
In my case (an example), I created two separate scenes. I placed one arrow model (.usdz) on a flowerpot and one on a statue. I saved both, checked by reloading (the model was still there), and came back the next day. It was rainy that day. I couldn't restore the AR scene around the flowerpot, but the statue was no problem. Is there a way to make the scene easier to recognize? For example, is it better to add horizontal and vertical plane detection besides the meshes, as in the sketch below? Or to change the way I use the world mapping status? Another solution could be to place more models (arrows), so that at least one of the anchors should match.
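Concretely, the change I have in mind would look something like this (a sketch using the standard ARWorldTrackingConfiguration API; `MappingController` is just an illustrative name, not my actual class):

```swift
import ARKit

final class MappingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // LiDAR mesh reconstruction, guarded by a capability check.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            configuration.sceneReconstruction = .mesh
        }
        // Plane detection in addition to the mesh, so relocalization
        // has plane anchors as well as mesh geometry to match against.
        configuration.planeDetection = [.horizontal, .vertical]
        session.delegate = self
        session.run(configuration)
    }

    // Only save once ARKit reports the area as fully mapped.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if frame.worldMappingStatus == .mapped {
            // Safe point to call session.getCurrentWorldMap(...)
        }
    }
}
```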
Thanks in advance,
Marc