
Reconstructing saved outdoor ARKit scenes
Hi, I am still working on an app that places simple 3D models at different outdoor locations. I save the world data of the nearby environment and load and reconstruct the scene later on. I use the latest Apple device (iPhone 12 Pro) with the LiDAR scanner.

The strange thing is that I often can't reconstruct the experience. Is the stored (LiDAR) data so accurate that the scene has to be exactly the same? For example, could it be a problem if a flower leaf was broken, making it impossible to relocalize? As a test I created two separate scenes: I placed one arrow model (.usdz) on a flowerpot and one on a statue. I saved both, checked them by reloading (the models were still there), and came back the next day, which was rainy. I couldn't reproduce the AR scene around the flowerpot, but the statue was no problem.

Is there a way to make the scene easier to recognize? For example, is it better to add horizontal and vertical plane detection besides the meshes? Should I change the way I use the world mapping status? Another option could be to place more models (arrows), so that at least one of the anchors matches. Thanks in advance, Marc
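To make the question more concrete, here is a simplified sketch of what I mean by adding plane detection besides the meshes and gating the save on the world mapping status. The function names and the url parameter are just placeholders, not my actual code:

```swift
import ARKit
import RealityKit

// Run the session with plane detection in addition to the LiDAR mesh,
// so the saved world map also contains plane anchors.
func configureSession(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    arView.session.run(config)
}

// Only archive the map once ARKit reports the surroundings are fully mapped.
func saveWorldMapIfReady(from session: ARSession, to url: URL) {
    guard session.currentFrame?.worldMappingStatus == .mapped else { return }
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url, options: .atomic)
    }
}
```

Would waiting for the .mapped status and including the plane anchors like this make relocalization outdoors more reliable?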
Replies: 1 · Boosts: 0 · Views: 1.1k · May ’21
Loading an AR experience, LiDAR, and environment lighting
Hello, I am working on an AR app that uses Apple's new LiDAR hardware. I use ARKit and RealityKit and want to place simple (.usdz) models. Most important is that I can save the data and recreate the AR experience by loading it later on. There is not much demo material or documentation available on this subject. I've studied the examples Visualizing and Interacting with a Reconstructed Scene, Saving and Loading World Data, and many more.

At the moment I can save and load the data, but there is something weird: it seems that I need the same lighting environment/sources to recreate the scene, otherwise it will not work. When using LiDAR I thought that environment lighting would not matter at all. I know that in the past you needed light for plane detection. Am I doing something wrong, or are there special settings I need to use? Apple support has no answer on this subject, so I hope that someone can help me out. Thanks, Marc
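For context, this is roughly how I reload the saved experience (a simplified sketch; the url and the function name are placeholders). It works, but apparently only when the lighting is similar to when I saved the map:

```swift
import ARKit

// Restore a previously archived ARWorldMap and relocalize against it.
func loadExperience(from url: URL, into session: ARSession) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }

    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map   // ask ARKit to relocalize against the stored map
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    // Relocalization matches what the camera currently sees against the saved map,
    // which may be why the lighting conditions still seem to matter.
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

Is there anything in this setup I should change so the LiDAR mesh is used more and the camera image (and therefore the lighting) matters less?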
Replies: 2 · Boosts: 0 · Views: 1k · Apr ’21
Saving and loading world data and LiDAR data
Hello, I searched the web but couldn't find many tutorials, examples, or demos on this topic. I am new to the AR world and it looks quite complicated. There are so many frameworks to choose from (I chose ARKit 4 and RealityKit). I want to create an app for tagging objects outdoors. I am using the LiDAR scanner for accuracy when I place models (.usdz); I don't want to use snapshot images. I can place the models, take a walk, and they still stay in the same place.

Now I want to save the AR experience, but I don't know how to test this by modifying Apple's Saving and Loading World Data demo app. My question: is there a recent tutorial or demo that explains how to do this (using meshes, LiDAR, and .usdz models)? I think some world data will be saved, but the (entity) anchors won't. Thanks in advance, Marc
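Since I haven't found a recent tutorial, this is the pattern I am planning to try, based on the Saving and Loading World Data sample: the world map only persists plain ARAnchors, not RealityKit entities, so I tag the anchors by name and re-attach the .usdz model when the saved anchors reappear after relocalization. A simplified sketch; the class name, the "arrow" anchor name, and the arrow model are placeholders:

```swift
import ARKit
import RealityKit

final class TagCoordinator: NSObject, ARSessionDelegate {
    weak var arView: ARView?

    // Place a named anchor at a raycast hit, e.g. from a tap on the screen.
    func placeTag(at point: CGPoint) {
        guard let arView = arView,
              let hit = arView.raycast(from: point,
                                       allowing: .estimatedPlane,
                                       alignment: .any).first else { return }
        arView.session.add(anchor: ARAnchor(name: "arrow", transform: hit.worldTransform))
    }

    // Called again for the saved anchors once the reloaded world map relocalizes.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard let arView = arView else { return }
        for anchor in anchors where anchor.name == "arrow" {
            // Loads arrow.usdz from the app bundle (placeholder asset name).
            guard let model = try? Entity.loadModel(named: "arrow") else { continue }
            let anchorEntity = AnchorEntity(anchor: anchor)
            anchorEntity.addChild(model)
            arView.scene.addAnchor(anchorEntity)
        }
    }
}
```

The coordinator would be set as arView.session.delegate before running the session with the reloaded world map. Is this the intended way to persist entity placements, or is there a better approach?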
Replies: 0 · Boosts: 0 · Views: 669 · Mar ’21