runtime texture input for ShaderGraphMaterial?

For MaterialX Shader Graph materials, the sample hard-codes two textures that are blended at runtime ( https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro#Build-materials-in-Shader-Graph )

Can I instead generate textures at runtime and set them as dynamic inputs on the material, or must every texture the material uses be known when the material is created? If setting textures procedurally is possible, how is it done? The example only shows a material with hard-coded textures.

EDIT: It looks like the answer is "yes", since setParameter accepts a textureResource value: https://developer.apple.com/documentation/realitykit/materialparameters/value/textureresource(_:)?changes=l_7

However, how do you turn a MTLTexture into a TextureResource?
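For context, here is the kind of call I have in mind once a TextureResource exists. This is only a sketch: the input name "DiffuseTexture" is a placeholder for whatever the Shader Graph texture input is actually named, and the CGImage is assumed to come from some runtime source.

```swift
import RealityKit
import CoreGraphics

// Assumes `material` is a ShaderGraphMaterial loaded from a Reality Composer
// Pro package with a texture input named "DiffuseTexture" (hypothetical name),
// and `cgImage` is a CGImage produced at runtime.
func applyRuntimeTexture(to material: inout ShaderGraphMaterial,
                         from cgImage: CGImage) throws {
    // One-time conversion: CGImage -> TextureResource.
    let texture = try TextureResource.generate(from: cgImage,
                                               options: .init(semantic: .color))
    // Bind the texture to the named Shader Graph input.
    try material.setParameter(name: "DiffuseTexture",
                              value: .textureResource(texture))
}
```

This covers the one-shot case via TextureResource.generate(from:options:), but it goes through a CGImage rather than an MTLTexture, hence the question below.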


Replies

May I have clarification on what the fastest way would be to convert an MTLTexture into a TextureResource to plug into the RealityKit APIs? This MTLTexture would be updated per frame.
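For the per-frame case, one approach worth trying is TextureResource.DrawableQueue: back the TextureResource with a drawable queue, then each frame blit the source MTLTexture into the next drawable and present it. This is a sketch under stated assumptions; the pixel format, usage flags, placeholder image, and the point at which present() is called are choices I have not verified against this exact use case.

```swift
import RealityKit
import Metal
import CoreGraphics

// Sketch: a TextureResource whose contents can be replaced every frame
// from an MTLTexture, via TextureResource.DrawableQueue.
final class DynamicTexture {
    let drawableQueue: TextureResource.DrawableQueue
    let resource: TextureResource

    init(width: Int, height: Int) throws {
        // Descriptor values (pixel format, usage, mipmaps) are assumptions.
        let descriptor = TextureResource.DrawableQueue.Descriptor(
            pixelFormat: .bgra8Unorm,
            width: width,
            height: height,
            usage: [.shaderRead, .renderTarget],
            mipmapsMode: .none)
        drawableQueue = try TextureResource.DrawableQueue(descriptor)

        // A 1x1 placeholder image just to create the resource;
        // its contents are replaced by the drawable queue below.
        let context = CGContext(data: nil, width: 1, height: 1,
                                bitsPerComponent: 8, bytesPerRow: 4,
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
        let placeholder = context.makeImage()!
        resource = try TextureResource.generate(from: placeholder,
                                                options: .init(semantic: .color))
        resource.replace(withDrawables: drawableQueue)
    }

    // Call once per frame with the freshly rendered MTLTexture; the caller
    // commits the command buffer. Presenting in the completed handler is
    // one option; presenting right after commit may also work.
    func update(from source: MTLTexture, commandBuffer: MTLCommandBuffer) {
        guard let drawable = try? drawableQueue.nextDrawable(),
              let blit = commandBuffer.makeBlitCommandEncoder() else { return }
        blit.copy(from: source, to: drawable.texture)
        blit.endEncoding()
        commandBuffer.addCompletedHandler { _ in drawable.present() }
    }
}
```

The resulting `resource` would then be passed to setParameter on the ShaderGraphMaterial once, with only the drawable contents changing per frame.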