6 Replies
      Latest reply on Apr 24, 2020 2:27 PM by gchiste
      koushan Level 1 (0 points)

        The new demo app for the lidar scanner uses RealityKit and the user is able to visualize meshes detected by the scanner:




        The meshes are visualized using 'showSceneUnderstanding' as a debug option:
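For reference, enabling that in RealityKit looks roughly like this (a sketch, assuming an `ARView` named `arView` and a device that supports scene reconstruction):

```swift
import ARKit
import RealityKit

func enableMeshVisualization(on arView: ARView) {
    // Run a world-tracking configuration with scene reconstruction enabled.
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    arView.session.run(config)

    // Draw the reconstructed scene mesh as a debug overlay.
    arView.debugOptions.insert(.showSceneUnderstanding)
}
```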




        You can also occlude virtual content by accessing the sceneUnderstanding:
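In RealityKit that is a one-liner on the view's environment, roughly (again assuming an `arView`):

```swift
import RealityKit

func enableOcclusion(on arView: ARView) {
    // Let real-world geometry from the scene mesh occlude virtual content.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
```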




        I am trying to recreate these features using SceneKit, but so far I've had no success. It appears that 'sceneUnderstanding' doesn't exist in SceneKit, and there doesn't seem to be any documentation on this.


        Any help would be appreciated.


        • Re: SceneUnderstanding missing from SceneKit?
          Bobjt Apple Staff (100 points)

          Unfortunately, SceneKit doesn't support any features of the LiDAR scanner. So, you'll have to apply information from the scene mesh (ARMeshAnchors) in SceneKit yourself.

            • Re: SceneUnderstanding missing from SceneKit?
              koushan Level 1 (0 points)

              Thanks for the response.

              Well, that's a shame; I suppose it's only a matter of time before SceneKit becomes obsolete.

              Do you know of any guides or articles explaining how to create SCNGeometry using the ARMeshAnchors?

              ARMeshAnchors use ARGeometrySource & ARGeometryElement, and so far I haven't been able to convert them into SCNGeometrySource and SCNGeometryElement.

                • Re: SceneUnderstanding missing from SceneKit?
                  gchiste Apple Staff (320 points)

                  Here is a convenient class which wraps ARMeshGeometry to provide an SCNGeometry of the mesh (note this does not include the normals):


                      class ARSCNMeshGeometry {

                          let scnGeometry: SCNGeometry

                          init(meshAnchor: ARMeshAnchor) {
                              let meshGeometry = meshAnchor.geometry

                              // Vertices source
                              let vertices = meshGeometry.vertices
                              let verticesSource = SCNGeometrySource(buffer: vertices.buffer, vertexFormat: vertices.format, semantic: .vertex, vertexCount: vertices.count, dataOffset: vertices.offset, dataStride: vertices.stride)

                              // Indices element (note: Data does not copy the face buffer)
                              let faces = meshGeometry.faces
                              let facesData = Data(bytesNoCopy: faces.buffer.contents(), count: faces.buffer.length, deallocator: .none)
                              let facesElement = SCNGeometryElement(data: facesData, primitiveType: .triangles, primitiveCount: faces.count, bytesPerIndex: faces.bytesPerIndex)

                              scnGeometry = SCNGeometry(sources: [verticesSource], elements: [facesElement])
                          }
                      }

                  Using that class, you can get the SCNGeometry from the ARMeshGeometry for each mesh anchor.


                  Whenever an ARMeshAnchor is added to the session (use session(_:didAdd:) to check) you should create an SCNNode and add the geometry to it.

                  Then add the SCNNode to an SCNScene.


                  Whenever an ARMeshAnchor is updated (use session(_:didUpdate:) to check) you should update the corresponding node’s geometry with the anchor’s new geometry.

                  You should also set the node’s simdWorldTransform to the ARMeshAnchor’s transform at this time, since ARMeshGeometry vertices are located relative to their mesh anchor's transform.