high vertex count for ModelEntity for custom geometry shader

I am translating an app I created from Metal to RealityKit, because hand-rolling raycasting and transformations in a pure Metal AR application proved too difficult. Instead of rendering the camera feed and point-cloud data onto an MTKView via a command encoder, I need to use an ARView (via RealityKit) so that RealityKit can power the spatial transformations of the AR objects.

I am trying to use CustomMaterial in RealityKit to run my Metal geometry and surface shaders on top of a plane. However, the standard plane only has 4 vertices; for my geometry modifier to do anything useful, I need a couple thousand vertices on the plane.
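
For reference, the material setup looks roughly like this (a sketch; the shader function names and the unlit lighting model are placeholders, not necessarily what the app uses):

// Sketch: building a CustomMaterial with a geometry modifier.
// "myGeometryModifier" and "mySurfaceShader" are placeholder names of
// Metal functions compiled into the app's default library.
import Metal
import RealityKit

func makeCustomMaterial() throws -> CustomMaterial {
    let device = MTLCreateSystemDefaultDevice()!
    let library = device.makeDefaultLibrary()!
    let geometryModifier = CustomMaterial.GeometryModifier(named: "myGeometryModifier",
                                                           in: library)
    let surfaceShader = CustomMaterial.SurfaceShader(named: "mySurfaceShader",
                                                     in: library)
    return try CustomMaterial(surfaceShader: surfaceShader,
                              geometryModifier: geometryModifier,
                              lightingModel: .unlit)
}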

However, when I build my own custom mesh using the code from codingxr.com (below), the app lags badly. I need lots of vertices so that I can access them in the geometry modifier.

Is there a way to create a performant mesh with many thousands of vertices in RealityKit?

Or is there a way to make Metal and RealityKit work in the same scene, so that I can render to an MTKView but still take advantage of the power of ARView?

import RealityKit
import simd

private func buildMesh(numCells: simd_int2, cellSize: Float) -> ModelEntity {
    var positions: [simd_float3] = []
    var textureCoordinates: [simd_float2] = []
    var triangleIndices: [UInt32] = []
    
    let size: simd_float2 = [Float(numCells.x) * cellSize, Float(numCells.y) * cellSize]
    // Offset is used to make the origin in the center
    let offset: simd_float2 = [size.x / 2, size.y / 2]
    var i: UInt32 = 0
    
    for row in 0..<numCells.y {
        for col in 0..<numCells.x {
            let x = (Float(col) * cellSize) - offset.x
            let z = (Float(row) * cellSize) - offset.y
            
            positions.append([x, 0, z])
            positions.append([x + cellSize, 0, z])
            positions.append([x, 0, z + cellSize])
            positions.append([x + cellSize, 0, z + cellSize])
            
            textureCoordinates.append([0.0, 0.0])
            textureCoordinates.append([1.0, 0.0])
            textureCoordinates.append([0.0, 1.0])
            textureCoordinates.append([1.0, 1.0])
            
            // Triangle 1
            triangleIndices.append(i)
            triangleIndices.append(i + 2)
            triangleIndices.append(i + 1)
            
            // Triangle 2
            triangleIndices.append(i + 1)
            triangleIndices.append(i + 2)
            triangleIndices.append(i + 3)
            
            i += 4
        }
    }
    
    var descriptor = MeshDescriptor(name: "proceduralMesh")
    descriptor.positions = MeshBuffer(positions)
    descriptor.primitives = .triangles(triangleIndices)
    descriptor.textureCoordinates = MeshBuffer(textureCoordinates)
    
    var material = PhysicallyBasedMaterial()
    material.baseColor = .init(tint: .white, texture: .init(try! .load(named: "base_color")))
    material.normal = .init(texture: .init(try! .load(named: "normal")))
    let mesh = try! MeshResource.generate(from: [descriptor])
    
    return ModelEntity(mesh: mesh, materials: [material])
}
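
Note that the snippet above creates four unique vertices per cell, none shared with neighboring cells. For comparison, a shared-vertex variant cuts the buffer to (nx + 1) * (ny + 1) vertices while keeping the same grid density; a sketch using the same MeshDescriptor API (the UVs here span the whole grid rather than repeating per cell, which may or may not suit the shaders):

// Sketch: the same grid with shared vertices — (nx + 1) * (ny + 1) vertices
// instead of 4 * nx * ny, with the same triangle winding as above.
private func buildSharedMesh(numCells: simd_int2, cellSize: Float) -> ModelEntity {
    let nx = Int(numCells.x)
    let ny = Int(numCells.y)
    var positions: [simd_float3] = []
    var textureCoordinates: [simd_float2] = []
    var triangleIndices: [UInt32] = []

    let size: simd_float2 = [Float(nx) * cellSize, Float(ny) * cellSize]
    let offset: simd_float2 = [size.x / 2, size.y / 2]

    // One vertex per grid intersection, shared by up to four cells.
    for row in 0...ny {
        for col in 0...nx {
            positions.append([Float(col) * cellSize - offset.x,
                              0,
                              Float(row) * cellSize - offset.y])
            textureCoordinates.append([Float(col) / Float(nx), Float(row) / Float(ny)])
        }
    }

    // Two triangles per cell, indexing into the shared grid.
    let rowStride = UInt32(nx + 1)
    for row in 0..<ny {
        for col in 0..<nx {
            let i = UInt32(row) * rowStride + UInt32(col)
            triangleIndices += [i, i + rowStride, i + 1]
            triangleIndices += [i + 1, i + rowStride, i + rowStride + 1]
        }
    }

    var descriptor = MeshDescriptor(name: "proceduralMeshShared")
    descriptor.positions = MeshBuffer(positions)
    descriptor.primitives = .triangles(triangleIndices)
    descriptor.textureCoordinates = MeshBuffer(textureCoordinates)

    let mesh = try! MeshResource.generate(from: [descriptor])
    return ModelEntity(mesh: mesh, materials: [PhysicallyBasedMaterial()])
}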

Hi! I don't think there's a great way to make your high-vertex-count plane more efficient to render right now; the CustomMaterial API is quite expensive as-is. However, you could look at using DrawableQueue and the postProcess APIs to render something with Metal that is composited into your ARView.
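
A minimal sketch of that hook (assuming an existing arView; the callback must write sourceColorTexture into targetColorTexture, which a plain blit satisfies — a real pass would encode custom Metal work on the same command buffer):

// Sketch: minimal ARView post-process hook. A straight blit is the
// no-op case; replace it with your own encoders as needed.
arView.renderCallbacks.postProcess = { context in
    guard let blit = context.commandBuffer.makeBlitCommandEncoder() else { return }
    blit.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
    blit.endEncoding()
}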

Hi! Thanks for your reply!

Indeed, I discovered TextureResource.DrawableQueue. This is very promising. Currently, my app flow is like this: initiate the ARView, register a setup callback with the Metal device to set up the library, and call a custom function that sets up a custom MTLRenderPipelineState with my shader functions attached:

pointCloudRenderPipelineDescriptor?.vertexFunction = pointCloudVertexFunction
pointCloudRenderPipelineDescriptor?.fragmentFunction = pointCloudFragmentFunction
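
Spelled out, that setup is roughly the following (a sketch; everything beyond the two lines above — shader function names, pixel format — is specific to my project):

// Sketch of the setup callback described above; the shader names and
// the color pixel format are my project's choices, not fixed by any API.
func setUpPointCloudPipeline(device: MTLDevice) throws -> MTLRenderPipelineState {
    let library = device.makeDefaultLibrary()!
    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.vertexFunction = library.makeFunction(name: "pointCloudVertexFunction")
    descriptor.fragmentFunction = library.makeFunction(name: "pointCloudFragmentFunction")
    descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
    return try device.makeRenderPipelineState(descriptor: descriptor)
}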

On ARView (RealityKit) init, I also attach the DrawableQueue to my TextureResource like this: textureResource?.replace(withDrawables: self.colorDrawableQueue).

In the main draw function, I pass a depth MTLTexture to the vertex shader and a color MTLTexture to the fragment shader.

I am not sure how DrawableQueue enables dynamic rendering. If I'm updating the textureResource only once on init (i.e. textureResource?.replace(withDrawables: self.colorDrawableQueue)), how does the texture update dynamically? Don't I need to replace the texture every frame?
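
For context, my current understanding of the per-frame pattern is something like this (a sketch; the queue, pipeline state, and draw calls are placeholders from my own code):

// Sketch: replace(withDrawables:) attaches the queue once; after that each
// frame pulls a fresh drawable, renders into its texture, and presents it.
func drawFrame(queue: TextureResource.DrawableQueue,
               commandQueue: MTLCommandQueue,
               pipelineState: MTLRenderPipelineState) {
    guard let drawable = try? queue.nextDrawable(),   // throws if no drawable is free yet
          let commandBuffer = commandQueue.makeCommandBuffer() else { return }

    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = drawable.texture   // the queue's backing texture
    pass.colorAttachments[0].loadAction = .clear
    pass.colorAttachments[0].storeAction = .store

    guard let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: pass) else { return }
    encoder.setRenderPipelineState(pipelineState)
    // ... set the depth/color textures and encode the point-cloud draws here ...
    encoder.endEncoding()

    commandBuffer.commit()
    drawable.present()   // publishes this frame's texture to the TextureResource
}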

Any detailed or advanced advice on DrawableQueue and dynamic texture updating would be great! BTW, I am familiar with the Metal APIs. I was stumped for a while because I needed a replacement for MTKView.currentDrawable, and TextureResource.DrawableQueue seems to be the appropriate one. I'm just looking for some low-level examples, documentation, or advice. Thanks a mill!

Also, is it realistic to have more than one drawable queue in one render pass? Because I need to call TextureResource.Drawable.present() just after I call:

renderEncoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()
drawable.present()
[secondDrawable.present()] ???
[thirdDrawable.present()] ???