According to the video, rendering for immersive apps is handled by RealityKit.
Does that mean the render pipeline cannot be customized, for example with a RendererFeature or by modifying the render pipeline directly?
For custom shaders, it seems the shader graph is converted to MaterialX. The MaterialX standard surface shader supports Subsurface and Coat. Does RealityKit support them? If so, how can these properties be mapped from a shader graph, given that the URP Lit shader graph has no built-in support for Subsurface or Coat?
I see the error below.
but incompatible platform (have 'iOS', need 'xrOS-sim')
I want to port my metaverse app, which uses WebRTC, to visionOS, but I've run into a problem.
I've already patched com.unity.webrtc so that it works in the Unity Editor, but I can't compile libwebrtc for visionOS.
Does anyone have experience with this? Any help would be appreciated. Thank you.
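For context on the error above: the linker checks the platform recorded in each object's LC_BUILD_VERSION load command, which you can inspect with `lipo -info` and `otool -l` on the prebuilt library (paths below are hypothetical). The numeric platform values printed by `otool` come from Apple's mach-o/loader.h; a small helper of my own (`decode_platform` is not a real tool) decodes the ones relevant here:

```shell
#!/bin/sh
# Inspection commands (run on macOS against your own library path):
#   lipo -info libwebrtc.a
#   otool -l arm64_object.o | grep -A5 LC_BUILD_VERSION
#
# Decode the 'platform' field that otool prints. Values are from
# Apple's mach-o/loader.h: 2 = iOS, 7 = iOS simulator,
# 11 = visionOS (xrOS), 12 = visionOS simulator (xrOS-sim).
decode_platform() {
  case "$1" in
    2)  echo "iOS" ;;
    7)  echo "iOS simulator" ;;
    11) echo "visionOS (xrOS)" ;;
    12) echo "visionOS simulator (xrOS-sim)" ;;
    *)  echo "other ($1)" ;;
  esac
}

decode_platform 2   # the 'have iOS' side of the error
decode_platform 12  # what an xrOS-sim link step needs
```

If the library's slices all report platform 2, that explains the "have 'iOS', need 'xrOS-sim'" message: the binary must be rebuilt for the visionOS simulator, not just re-linked.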