In the following code, I create a triangle and export it to USDC using the MDLAsset export function. The exported USDC file is correct, but I don't see the material component; it seems the material is not being exported. Did I miss something?
//new asset
MDLAsset *asset = [[MDLAsset alloc] initWithBufferAllocator:nil];
//------------------ vertex descriptor -------------------------------------------------------------------------
MDLVertexAttribute *atrPos = [[MDLVertexAttribute alloc] initWithName:MDLVertexAttributePosition format:MDLVertexFormatFloat3 offset:0 bufferIndex:0];
MDLVertexAttribute *atrNor = [[MDLVertexAttribute alloc] initWithName:MDLVertexAttributeNormal format:MDLVertexFormatFloat3 offset:12 bufferIndex:0];
MDLVertexAttribute *atrCol = [[MDLVertexAttribute alloc] initWithName:MDLVertexAttributeColor format:MDLVertexFormatFloat4 offset:24 bufferIndex:0];
//layout: stride = 12 (position) + 12 (normal) + 16 (color) = 40 bytes
MDLVertexBufferLayout *lay = [[MDLVertexBufferLayout alloc] initWithStride:40];
//add attributes and layout to the descriptor
MDLVertexDescriptor *vdc = [[MDLVertexDescriptor alloc] init];
vdc.attributes = [NSMutableArray arrayWithObjects:atrPos, atrNor, atrCol, nil];
vdc.layouts = [NSMutableArray arrayWithObjects:lay, nil];
//-------------- material ---------------------------------------------------------------------------------------------------
MDLScatteringFunction *scatteringFunction = [MDLPhysicallyPlausibleScatteringFunction new];
MDLMaterial *material = [[MDLMaterial alloc] initWithName:@"matTest" scatteringFunction:scatteringFunction];
//--------------------- mesh ------------------------------------------------------------------------------------------------
//vertex (position, normal, and color RGBA)
static const float triangleVertexData[] =
{
0.0, 0.5, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0,
-0.5, -0.5, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0,
0.5, -0.5, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0
};
//vertex buffer
int numVertices = 3;
int lenBufferForVertices = sizeof(triangleVertexData);
//nil allocator above means the default MDLMeshBufferDataAllocator, so type against the MDLMeshBuffer protocol rather than MTKMeshBuffer
id<MDLMeshBuffer> meshBufferForVertices = [[asset bufferAllocator] newBuffer:lenBufferForVertices type:MDLMeshBufferTypeVertex];
NSData *nsData_vertex = [NSData dataWithBytes:triangleVertexData length:lenBufferForVertices];
[meshBufferForVertices fillData:nsData_vertex offset:0];
//index
static uint16_t indices[] = {0, 1, 2};
//index buffer
int numIndices = 3;
int lenBufferForIndices = numIndices * sizeof(uint16_t);
id<MDLMeshBuffer> meshBufferForIndices = [[asset bufferAllocator] newBuffer:lenBufferForIndices type:MDLMeshBufferTypeIndex];
NSData *nsData_indices = [NSData dataWithBytes:indices length:lenBufferForIndices];
[meshBufferForIndices fillData:nsData_indices offset:0];
//submesh
MDLSubmesh *submesh = [[MDLSubmesh alloc] initWithName:@"triangle"
indexBuffer:meshBufferForIndices
indexCount:numIndices
indexType:MDLIndexBitDepthUInt16
geometryType:MDLGeometryTypeTriangles
material:material];
//mesh
MDLMesh *mdlMesh = [[MDLMesh alloc] initWithVertexBuffer:meshBufferForVertices
vertexCount:numVertices
descriptor:vdc
submeshes:[NSArray arrayWithObjects:submesh, nil]];
//add mesh object to asset
[asset addObject:mdlMesh];
//export to usdc
BOOL res = [asset exportAssetToURL:[NSURL fileURLWithPath:filePath]];
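For reference, a minimal sketch of one variant I considered, assuming the USD exporter only writes materials that carry concrete property values (the property name and the red color below are placeholders, not something the documentation prescribes; "material" is the MDLMaterial created above):
//assumption: the exporter may need concrete properties on the material to emit anything
MDLMaterialProperty *baseColor = [[MDLMaterialProperty alloc] initWithName:@"baseColor"
                                                                  semantic:MDLMaterialSemanticBaseColor
                                                                    float3:(vector_float3){1.0f, 0.0f, 0.0f}];
[material setProperty:baseColor];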
I am developing a fully immersive application in Metal for Apple Vision Pro. I would like to know how I can access tap and double-tap events from Objective-C.
Hello.
When I make a call to this function, ar_skeleton_create_neutral_pose_hand_skeleton, it is supposed to create a hand skeleton in a neutral pose. However, when I obtain the matrix for each of the joints using the function ar_skeleton_get_anchor_from_joint_transform_for_joint, it returns the identity matrix for all the joints. Is there something I'm missing?
Thanks in advance.
Hello.
I'm working with Metal in the AVP simulator. I want to perform tests with the hand tracking system, but it's generating the following error: "<ar_hand_tracking_provider_t: 0x60000296c1c0> is not supported on this device." The question is: Is it possible to simulate any type of hand tracking interaction in the AVP simulator?
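For completeness, a minimal sketch of the availability check I'm adding, assuming ar_hand_tracking_provider_is_supported, ar_hand_tracking_configuration_create, and ar_hand_tracking_provider_create are the right calls here (I haven't been able to verify their behavior in the simulator):
#import <ARKit/ARKit.h>

//guard provider creation with the capability query
if (ar_hand_tracking_provider_is_supported()) {
    ar_hand_tracking_configuration_t config = ar_hand_tracking_configuration_create();
    ar_hand_tracking_provider_t provider = ar_hand_tracking_provider_create(config);
    //... run the provider on an ar_session ...
} else {
    NSLog(@"Hand tracking is not supported on this device/simulator.");
}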
Thanks in advance.
Hello.
I'm working with Metal on Apple Vision Pro, and I assumed I could use mesh shaders to work with meshlets. But when I create the render pipeline, I get the following error message: "device does not support mesh shaders". The test is on the simulator, and my question is: will Apple Vision Pro support mesh shaders on physical devices?
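In the meantime, a minimal sketch of the capability check I'm using, assuming MTLGPUFamilyApple7 is the GPU family that gates mesh shader support (my reading of the Metal feature set tables, not verified on device):
#import <Metal/Metal.h>

id<MTLDevice> device = MTLCreateSystemDefaultDevice();
//assumption: Apple7 is the family that introduces mesh shader support
if ([device supportsFamily:MTLGPUFamilyApple7]) {
    //safe to build the mesh render pipeline here
} else {
    NSLog(@"Mesh shaders unavailable on this device (e.g., the simulator).");
}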
Thanks.
Hello. I want to use my own 3D format and, therefore, custom shaders in Metal. Are there any restrictions in visionOS?