Understanding MaterialX, USDShaders and Material workflows from Blender and other tools

Hi,

I've been exploring a project with visionOS, and I'm quite confused about the capabilities and workflows for using custom materials in RealityKit and Reality Composer Pro for visionOS.

Ideally I would be able to create / load / modify a model and its materials in Blender, export to openUSD, and have it load fully in RCP, but this hasn't been the case. Instead, different aspects of the material don't seem to be exported correctly, which has led me to investigate MaterialX, openUSD, and Metal, and how they work in visionOS, RealityKit and Reality Composer Pro.

MaterialX was announced as a primary format for working with 3D materials, but the .mtlx file format doesn't appear to be directly compatible with RCP - specifically when trying materials provided in the AMD OpenGPU MaterialX Library. (Note: AFAIK, Blender does not currently support MaterialX.) Downloading a material provides a folder with the textures and a corresponding .mtlx file, but RCP (Xcode 15.6 beta) currently ignores this file. Similarly, trying to load it using ShaderGraphMaterial fails with 'Error in prim' and no other details that I can see.
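For context, here's roughly how I'm attempting the load (a sketch, not a confirmed workaround - the prim path, file name, and `realityKitContentBundle` are placeholders from the package Reality Composer Pro generates for a project; pointing this at a bare .mtlx file is what produces the 'Error in prim'):

```swift
import RealityKit
// `realityKitContentBundle` is exported by the Swift package that Reality
// Composer Pro generates alongside the project (name is project-specific).
import RealityKitContent

func applyGraphMaterial(to entity: ModelEntity) async {
    do {
        // Load a material authored in RCP's Shader Graph by its prim path
        // inside a USD file bundled with the app.
        let material = try await ShaderGraphMaterial(
            named: "/Root/MyMaterial",   // prim path (hypothetical)
            from: "Scene.usda",          // file inside the RCP bundle
            in: realityKitContentBundle
        )
        entity.model?.materials = [material]
    } catch {
        // Loading a bare .mtlx (or an unsupported shader) fails here,
        // with an opaque "Error in prim" and little else.
        print("Material load failed: \(error)")
    }
}
```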

It also appears that there is a way of bundling MaterialX files within an openUSD file (especially implied by the error about prims), but I haven't been able to work out how this is done, or whether it's the correct approach. Unpacking the Apple-provided materials in RCP from usdz to usda, these appear to define the shaders in openUSD and reference the RCP MaterialX Preview Shader (presumably created using the Shader Graph). However, the official MaterialX.org and OpenUSD sites also reference a USD / MaterialX plugin for enabling compatibility.

I've also followed along with the introductory tutorial on the built-in ShaderGraph, and I find it difficult to understand and quite different from Blender's Shader Nodes, but it currently appears to be the primary promoted way to create and work with materials.

Finally, I had expected that CustomMaterials using Metal shaders would be available, since Metal was mentioned for Fully Immersive Spaces, and 'Explore Advanced Rendering with RealityKit 2' from WWDC21 covers custom shaders. But CustomMaterial is not listed as available on visionOS, and according to an answer here it's not currently planned (although the documentation here still mentions Metal briefly).

Overall, what are the suggestions for workflows with materials for RealityKit on visionOS?

  • Is there a fully compatible path from Blender -> openUSD -> Reality Composer Pro? Do I need to export materials and models from Blender individually and rebuild them in RCP using the ShaderGraph?
  • Can I utilise existing MaterialX materials in RealityComposerPro, and if so, how?
  • Are there any other good resources for getting comfortable with and understanding the nodes within the ShaderGraph? What WWDC talks would be good to review for this?

Really appreciate any guidance!

Answered by Graphics and Games Engineer in 763349022

Thanks for the great question, @spacebarkid ! I'll try my best to answer, with the full caveat that the state of feature support in Blender (and other DCCs) is broad and constantly evolving, so hopefully I don't miss anything.

Is there a fully compatible path from Blender -> openUSD -> Reality Composer Pro? Do I need to export materials and models from Blender individually and rebuild them in RCP using the ShaderGraph?

Yes, but not with MaterialX currently. Blender's material export into USD is a subject of active discussion and development, as @heckj mentioned. At this point I would recommend just plugging textures into the shader inputs and avoiding the advanced Blender shader graph features, because there currently isn't a workflow on the Blender side that maps those nodes.

I should add that one of the great new features in Blender 4.x is the ability for add-ons to extend Blender's USD export. This means add-on developers may be able to create these workflows sooner than a solution can be integrated into the Blender core. There isn't anything to that effect yet, but it does open up a realm of opportunities.

Can I utilise existing MaterialX materials in RealityComposerPro, and if so, how?

I think it's important to distinguish between MaterialX as a graph (as can be represented inside USD) and MaterialX as a format (.mtlx XML files). Reality Composer Pro currently supports MaterialX graphs inside of USD files, but does not support .mtlx files.
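To make that distinction concrete, here's a minimal sketch of a MaterialX graph embedded directly in a .usda file (names and values are illustrative; the node id and output connection follow the pattern visible when unpacking RCP's own materials to usda):

```usda
#usda 1.0

def Material "RedMaterial"
{
    # The material's surface output connects to a MaterialX shader node.
    token outputs:mtlx:surface.connect = </RedMaterial/Surface.outputs:out>

    def Shader "Surface"
    {
        # MaterialX node definition for the USD Preview Surface shading model.
        uniform token info:id = "ND_UsdPreviewSurface_surfaceshader"
        color3f inputs:diffuseColor = (0.8, 0.1, 0.1)
        token outputs:out
    }
}
```

No separate .mtlx file is involved here: the graph lives entirely inside the USD layer, which is the form Reality Composer Pro reads.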

Both Houdini 19.5+ and Maya 2024+ should be able to author MaterialX graphs inside USD files today. My personal hope is that Blender will also do the same once the solutions in flight develop further.

With regard to AMD's great material library: unfortunately it is only available as .mtlx files. Additionally, those materials use the Autodesk Standard Surface shader, which Reality Composer Pro does not support, so a direct conversion would not help either.

But in the interest of providing information:

  1. If you want to convert the AMD shaders over, most of them are just textures connected into the final shading model, so you should be able to plug the textures into your Blender or ShaderGraph shaders and get results quickly.

  2. If you had .mtlx files using a shader type that Reality Composer Pro supports, you could use the USD tooling to convert them. This requires a build of the USD libraries with MaterialX enabled. With that in place, add a Material prim in your USD file that references the .mtlx file, assign the material to a mesh, and flatten it with usdcat --flatten input.usda -o output.usda
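The referencing step in option 2 might look roughly like this (an untested sketch; it assumes a USD build with the usdMtlx plugin, which translates a referenced .mtlx file into prims under a /MaterialX scope, and the file name is hypothetical):

```usda
#usda 1.0

def "ConvertedMaterials" (
    # Reference the .mtlx file as if it were a USD layer; the usdMtlx
    # plugin exposes its translated contents under the /MaterialX prim.
    prepend references = @./material.mtlx@</MaterialX>
)
{
}
```

After binding one of the resulting materials to a mesh, the usdcat flatten step bakes everything into a single self-contained USD file.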

For the given materials, I'd recommend doing the first method.

Are there any other good resources for getting comfortable with and understanding the nodes within the ShaderGraph? What WWDC talks would be good to review for this?

I'd recommend watching the following two sessions:

  • https://developer.apple.com/videos/play/wwdc2023/10083/
  • https://developer.apple.com/videos/play/wwdc2023/10202/

There's also documentation here: https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro/


Hopefully those answers are useful.

As an aside, one way to set up your workflow so you can author materials in Reality Composer Pro, while iterating on your meshes in Blender is to do the following:

  1. Author your mesh in Blender and export it into your Reality Composer Pro's 'rkassets' folder.
  2. In your Reality Composer Pro scene, reference in that asset that you made.
  3. Assign a material to that reference.

Now you can keep exporting your mesh into the same location, and see it update, but the material binding will remain. I find that kind of workflow helps when I'm constructing shaders that may be specific for my Vision Pro applications.

Cheers, and let me know if you have any follow-up questions.

  • Dhruv

Yup, materials exported from Blender within the model are majorly borked sometimes, I just try to recreate them in the node editor.

I’ve been hitting the same questions and quandaries - and like @ThiloJaeggi, the only consistent pattern I’ve managed is recreating shader effects in Reality Composer Pro and applying them there. Annoying AF after the effort expended in Blender's shader nodes, but I do get the complications with export.

Related to that, there is work pending to export Blender shader nodes (or more specifically, some subset of them) as MaterialX, but as far as I can see it’s stalled in discussions of how to handle this inside Blender when it comes to their internal renderers (Eevee and Cycles), which currently don’t have MaterialX support.

I’m just starting to slowly increment through nodes to experiment with what can and can’t be exported, as I’d really prefer to use Blender as my “DCC” tool of choice.
