Is it possible to save and reload the depthData captured between ARSessions with ARKit 4?

Say I want to have a setup phase in my app where I capture detailed depthData of my environment/indoor room.

Once this setup is complete, I assume the geometry of the room and the larger objects in it would not change much, and I could generate a 3D mesh of the entire room.

After that, I would place a virtual object inside that mesh, let it move randomly through the space, and observe collisions happening accurately.

Can I somehow save and reload this depth data so that in a separate ARSession I only add to or update the existing depth data, instead of having to map the room again?

Accepted Reply

Saving the depth data does not seem like the right approach. What makes more sense to me is saving the mesh which ARKit generates from the depth data. We introduced this in iOS 13.4: you can set the ARWorldTrackingConfiguration.sceneReconstruction property to .mesh. So you could turn this on in your setup phase. Then, when setup is complete, you would have to do two things:
  1. Save the mesh from all ARMeshAnchors in a 3D file format, e.g. the Wavefront OBJ format, a simple, human-readable format for saving 3D geometries.

  2. Save the current ARWorldMap via ARSession.getCurrentWorldMap.
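Step 1 could be sketched as a minimal OBJ writer. The SimpleMesh type and objString function below are hypothetical stand-ins for illustration; in a real app the vertex positions and triangle indices would come from each ARMeshAnchor's ARMeshGeometry, transformed into world space with the anchor's transform:

```swift
// Minimal sketch of step 1: serializing mesh geometry to Wavefront OBJ text.
// SimpleMesh is a stand-in for data extracted from ARMeshAnchor geometry.
struct SimpleMesh {
    var vertices: [(Float, Float, Float)]   // x, y, z positions in world space
    var faces: [(Int, Int, Int)]            // zero-based triangle indices
}

func objString(for mesh: SimpleMesh) -> String {
    var lines: [String] = ["# exported room mesh"]
    for v in mesh.vertices {
        lines.append("v \(v.0) \(v.1) \(v.2)")
    }
    // OBJ face indices are 1-based, so shift each index by one.
    for f in mesh.faces {
        lines.append("f \(f.0 + 1) \(f.1 + 1) \(f.2 + 1)")
    }
    return lines.joined(separator: "\n")
}

// Example: a single triangle.
let mesh = SimpleMesh(vertices: [(0, 0, 0), (1, 0, 0), (0, 1, 0)],
                      faces: [(0, 1, 2)])
print(objString(for: mesh))
// # exported room mesh
// v 0.0 0.0 0.0
// v 1.0 0.0 0.0
// v 0.0 1.0 0.0
// f 1 2 3
```

The resulting string can be written to a file in the app's documents directory and reloaded in a later session.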

When launching the second ARSession, you would first load the ARWorldMap by setting the configuration's .initialWorldMap property. Once ARKit has successfully relocalized to the world map of the first session, you can load the mesh from the 3D file into your scene and, for example, generate a collision shape for it.
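A rough sketch of step 2 and the restore at launch might look like the following; error handling is simplified and the file name is an assumption (this only runs on a LiDAR-equipped device):

```swift
import ARKit

// Hypothetical location for the archived map in the documents directory.
let mapURL = FileManager.default.urls(for: .documentDirectory,
                                      in: .userDomainMask)[0]
    .appendingPathComponent("room.worldmap")

// Setup phase: capture and archive the current world map.
func saveWorldMap(from session: ARSession) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true) {
            try? data.write(to: mapURL, options: [.atomic])
        }
    }
}

// Second session: unarchive the map and relocalize against it.
func runRestoredSession(on session: ARSession) throws {
    let data = try Data(contentsOf: mapURL)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    // Meshing is only available on devices that support scene reconstruction.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    config.initialWorldMap = map
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

Once the session reports that relocalization has succeeded (the mapping status returns to normal tracking), the saved OBJ mesh can be loaded into the scene at the same world coordinates.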


Here is a developer sample which shows how to use the meshing feature: Visualizing and Interacting with a Reconstructed Scene

And here is a developer sample which shows how to work with ARWorldMaps: Saving and Loading World Data.

Replies

Thank you for your reply. I will try that.
Hello,

We are currently working on trying to persist ARWorldMap with mesh information (from Lidar).

From what I see, ARMeshAnchors are not persisted in the ARWorldMap when saving the world map (but ARPlaneAnchors are).

Is this a bug or is it working as intended?


Thanks
Gil