Custom render pipeline & shader confusion

According to the video, rendering for immersive apps is controlled by RealityKit.

Does this mean the render pipeline cannot be customized, for example with a RendererFeature or by modifying the render pipeline directly?

For custom shaders, it seems the shader graph is converted to MaterialX. The MaterialX standard surface shader supports Subsurface and Coat. Does RealityKit support them? If so, how can these properties be mapped through the shader graph, given that the URP Lit shader graph has no built-in support for Subsurface or Coat?
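For reference, the MaterialX standard_surface node does expose these as plain inputs (names as in the MaterialX 1.38 specification; the values here are only illustrative), so a document using them would look something like:

```xml
<?xml version="1.0"?>
<materialx version="1.38">
  <!-- standard_surface with subsurface scattering and a clear coat -->
  <standard_surface name="sss_coat_srf" type="surfaceshader">
    <input name="base_color" type="color3" value="0.8, 0.6, 0.6" />
    <input name="subsurface" type="float" value="0.5" />
    <input name="subsurface_color" type="color3" value="0.9, 0.3, 0.3" />
    <input name="coat" type="float" value="1.0" />
    <input name="coat_roughness" type="float" value="0.1" />
  </standard_surface>
  <surfacematerial name="sss_coat_mtl" type="material">
    <input name="surfaceshader" type="surfaceshader" nodename="sss_coat_srf" />
  </surfacematerial>
</materialx>
```

My question is whether inputs like these survive the shader graph conversion, since URP Lit has nowhere to author them.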

Currently, Metal custom rendering (and custom rendering in general) is not allowed on visionOS, except in fully immersive VR mode :(. For fully immersive mode, you have to use Compositor Services. There's documentation and a WWDC talk ( https://developer.apple.com/videos/play/wwdc2023/10089/ ), but as far as I know no sample project yet.
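As a rough sketch of what the Compositor Services entry point looks like, based on that WWDC session (the RenderEngine type and its runRenderLoop method are placeholders for your own Metal frame loop, not part of the framework):

```swift
import SwiftUI
import CompositorServices

@main
struct FullyImmersiveApp: App {
    var body: some Scene {
        // A fully immersive space whose content you draw yourself with
        // Metal instead of letting RealityKit render it.
        ImmersiveSpace(id: "ImmersiveSpace") {
            CompositorLayer { layerRenderer in
                // Drive the frame loop from a dedicated thread, pulling
                // frame timing and drawables from layerRenderer.
                let renderThread = Thread {
                    let engine = RenderEngine(layerRenderer) // placeholder type
                    engine.runRenderLoop()                   // placeholder method
                }
                renderThread.name = "Render Thread"
                renderThread.start()
            }
        }
    }
}
```

This only works for fully immersive VR; there is no equivalent hook for passthrough mode today.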

For passthrough mode, there is reason to believe that the standalone full AR passthrough mode could be updated in the future to support custom rendering too, but it's not a given. Check out the discussion I had here, starting at this post, to understand the current limitations:

https://developer.apple.com/forums/thread/731506?answerId=755464022#755464022

I’d suggest filing feature requests for custom rendering support, because I also think it’s super important not to be limited to the default RealityKit renderer in passthrough mode.

They want to see use cases. Personally, I think the need for custom rendering is a given.
