ARMeshGeometry to Model I/O

Hi community,

I am basically trying to export the ARMeshGeometry generated by the new scene reconstruction API on the latest iPad Pro to an .obj file.

ARMeshGeometry exposes an MTLBuffer each for the vertices and the indices, which ultimately need to be converted to MDLMeshBuffers.


The attached code successfully generates an .obj file, but it contains only one vertex. Can anybody help me?

I am very new to Metal and Model I/O.


Thanks and stay healthy 🙂


guard let frame = arView.session.currentFrame else {
    return
}

let meshAnchors = frame.anchors.compactMap({ $0 as? ARMeshAnchor })

DispatchQueue.global().async {
    
     // Just the first mesh for testing purposes
    let anchor = meshAnchors[0]
    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError( "Failed to get the system's default Metal device." )
    }

    let vertices = anchor.geometry.vertices
    let faces = anchor.geometry.faces
    let allocator = MTKMeshBufferAllocator(device: device)
    
    // Convert buffers to MDLMeshBuffer
    let data = Data.init(bytesNoCopy: vertices.buffer.contents(), count: vertices.count, deallocator: .none)
    let vertexBuffer = allocator.newBuffer(MemoryLayout<SIMD3<Float>>.stride * vertices.count, type: .vertex)
    vertexBuffer.fill(data, offset: vertices.offset)

    // Convert Index-Buffer to MDLMeshBuffer
    let indexData = Data.init(bytesNoCopy: anchor.geometry.faces.buffer.contents(), count: faces.count, deallocator: .none)
    let indexBuffer = allocator.newBuffer(MemoryLayout<UInt32>.stride * faces.count, type: .index)
    indexBuffer.fill(indexData, offset: 0)
    
    // Create submesh for indexes
    let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                             indexCount: faces.count,
                             indexType: .uInt16,
                             geometryType: .triangles,
                             material: nil)
    

    let vertexFormat = MTKModelIOVertexFormatFromMetal(anchor.geometry.vertices.format)
    let vertexDescriptor = MDLVertexDescriptor()
    vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                        format: vertexFormat,
                                                        offset: 0,
                                                        bufferIndex: 0)
    let mdlMesh = MDLMesh(vertexBuffer: vertexBuffer,
                          vertexCount: vertices.count,
                          descriptor: vertexDescriptor,
                          submeshes: [submesh])

    let asset = MDLAsset()
    asset.add(mdlMesh)
    
    // Export MDLAsset to file
    let fileManager = FileManager.default
    var fileURL: URL

    do {
        let documentDirectory = try fileManager.url(for: .documentDirectory, in: .userDomainMask, appropriateFor:nil, create:false)
        fileURL = documentDirectory.appendingPathComponent("export.obj")
        try FileManager.default.removeItem(at: fileURL)
        
        try asset.export(to: fileURL)
    
    } catch let error as NSError {
        print("Error: \(error.domain)")
    }
}

Replies

Not sure if this is the only problem, but I think that indexCount should be faces.count * faces.indexCountPerPrimitive.
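
In code, the change I mean looks something like this; a minimal sketch against the faces and indexBuffer from the original post, with the index type also set to .uInt32 to match what ARMeshGeometry actually stores:

// Every face is a triangle, so the submesh needs
// faces.count * indexCountPerPrimitive indices in total.
let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                         indexCount: faces.count * faces.indexCountPerPrimitive,
                         indexType: .uInt32,   // the face indices are 32-bit, not 16-bit
                         geometryType: .triangles,
                         material: nil)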

Did you find a solution to this? I've been experimenting with this for a day, trying to generate an .obj file from the ARMeshGeometry, without success. Whatever I try, it ends up with only one vertex.

However, if I add normals to the mesh (addNormals(withAttributeNamed:creaseThreshold:)), they end up (seemingly correctly) in the .obj file, but there is still only one vertex.
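
For completeness, the call I mean is just this one-liner on the MDLMesh from the post above (the crease threshold value here is only an example):

// Have Model I/O compute per-vertex normals before exporting;
// the 0.5 crease threshold is an arbitrary example value.
mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.5)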

Thanks for providing the starting point here; you piqued my interest.
There are a few things you need to consider here:

  • The Data() constructor takes the number of bytes, not objects, so you need to multiply the count by the stride.
  • SIMD3<Float> is 4 floats wide, not 3. The underlying data in this case is tightly packed triples of floats (see the snippet after this list).
  • The indices are uInt32, not uInt16.

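To make the second point concrete, here is a quick check, assuming the anchor from the original post; the exact packed stride comes from the geometry itself, so treat the 12 as typical rather than guaranteed:

// SIMD3<Float> is padded out to four floats...
print(MemoryLayout<SIMD3<Float>>.stride)     // 16
// ...while the mesh's vertex source reports the packed layout,
// which is why the descriptor below uses vertices.stride instead.
print(anchor.geometry.vertices.stride)       // typically 12
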

Here's my working version, with these issues corrected:

import ARKit
import MetalKit
import ModelIO

extension ARMeshGeometry {
    func toMDLMesh(device: MTLDevice) -> MDLMesh {
        let allocator = MTKMeshBufferAllocator(device: device)

        // Sizes passed to Data are in bytes, so multiply the element count by the stride.
        let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)

        // Each face contributes indexCountPerPrimitive 32-bit indices.
        let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)

        // Describe the tightly packed float3 positions using the geometry's own stride.
        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                            format: .float3,
                                                            offset: 0,
                                                            bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

        return MDLMesh(vertexBuffer: vertexBuffer,
                       vertexCount: vertices.count,
                       descriptor: vertexDescriptor,
                       submeshes: [submesh])
    }
}
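
And roughly how I drive it; this is only a sketch that mirrors the original post (same arView, same export.obj file name), converts every mesh anchor, and doesn't apply each anchor's transform, so the meshes land in their anchor-local coordinates:

guard let frame = arView.session.currentFrame,
      let device = MTLCreateSystemDefaultDevice() else { return }

let meshAnchors = frame.anchors.compactMap { $0 as? ARMeshAnchor }

DispatchQueue.global().async {
    // Convert every mesh anchor and collect the results in a single asset.
    let asset = MDLAsset()
    for anchor in meshAnchors {
        asset.add(anchor.geometry.toMDLMesh(device: device))
    }

    do {
        let documents = try FileManager.default.url(for: .documentDirectory,
                                                    in: .userDomainMask,
                                                    appropriateFor: nil,
                                                    create: false)
        try asset.export(to: documents.appendingPathComponent("export.obj"))
    } catch {
        print("Export failed: \(error)")
    }
}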