How to access custom Mesh buffers inside a custom ShaderGraphMaterial?
My app is built on SceneKit/Metal for iOS, iPadOS, and tvOS, and users have generated tons of content. To bring that content to visionOS with fidelity, I have to port a particle emitter system. I've been able to successfully reuse my Metal code via Compositor Services, but I'd like to get it working in RealityKit for all the UX/UI affordances it provides. To that end, I've also succeeded in getting the particle geometry rendering via a Component/System that replaces the mesh contents in real time.

The last major step is to find a way to color and texture each particle via a ShaderGraphMaterial. Like any good particle emitter system, particle colors can change and vary over time. In Metal, the shader looks like this:

    fragment half4 CocosFragmentFunctionDefaultTextureColor(const CocosFragData in [[stage_in]],
                                                            texture2d<half> cc_MainTexture [[texture(0)]],
                                                            sampler cc_MainTextureSampler [[sampler(0)]])
    {
        return in.color * cc_MainTexture.sample(cc_MainTextureSampler, in.texCoord);
    }

Basically, I multiply a texture sample by a per-vertex color. Fairly simple stuff in GL shader-speak.

So, how do I achieve this via ShaderGraphMaterial? In another post, I saw that I can pass in vertex colors via a custom mesh buffer, like so:

    let vertexColor: MeshBuffers.Semantic = MeshBuffers.custom("vertexColor", type: SIMD4<Float>.self)
    let meshDescriptor = MeshDescriptor()
    meshDescriptor[vertexColor] = ...

Unfortunately, that doesn't appear to work for me. I'm sure I missed a step, but what I really want/need is a way to access this custom buffer from inside a ShaderGraphMaterial and multiply it against a sample of the texture. How?

Any pointers, sample code, or a sample Reality Composer Pro project would be most appreciated!
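
For reference, here's a trimmed-down sketch of how I'm rebuilding the particle mesh each frame inside my System, in case it helps someone spot what I'm missing. The vertexColor semantic is the one from the snippet above; the subscript assignment at the end is my best guess at how the custom buffer is meant to be attached, so that line in particular may be wrong:

    import RealityKit
    import simd

    // Custom per-vertex color semantic, as suggested in the other post.
    let vertexColorSemantic = MeshBuffers.custom("vertexColor", type: SIMD4<Float>.self)

    func makeParticleMesh(positions: [SIMD3<Float>],
                          uvs: [SIMD2<Float>],
                          colors: [SIMD4<Float>],
                          indices: [UInt32]) throws -> MeshResource {
        var descriptor = MeshDescriptor(name: "particles")
        descriptor.positions = MeshBuffer(positions)        // one quad per particle
        descriptor.textureCoordinates = MeshBuffer(uvs)
        descriptor.primitives = .triangles(indices)

        // The custom color buffer -- this is the data I can't reach from the
        // ShaderGraphMaterial side. (Assignment pattern copied from the other
        // post; it may need a different wrapper type.)
        descriptor[vertexColorSemantic] = MeshBuffer(colors)

        return try MeshResource.generate(from: [descriptor])
    }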
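
And this is roughly how I'm loading the material and feeding it the particle texture. ParticleMaterial, mainTexture, and particleAtlas are just placeholder names for whatever the Reality Composer Pro graph, its texture input, and my texture atlas end up being called:

    import RealityKit
    import RealityKitContent   // the Reality Composer Pro package in my project

    func makeParticleMaterial() async throws -> ShaderGraphMaterial {
        // Load the graph authored in Reality Composer Pro.
        var material = try await ShaderGraphMaterial(named: "/Root/ParticleMaterial",
                                                     from: "ParticleMaterial.usda",
                                                     in: realityKitContentBundle)

        // Bind the particle texture to the graph's texture input.
        let texture = try TextureResource.load(named: "particleAtlas")
        try material.setParameter(name: "mainTexture", value: .textureResource(texture))
        return material
    }

The regenerated mesh and this material then get swapped back onto the particle entity:

    entity.components.set(ModelComponent(mesh: mesh, materials: [material]))

What I can't figure out is the node (or naming convention) inside the shader graph that would read the "vertexColor" buffer so I can multiply it against the texture sample.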