You can definitely do the raymarching part, but you’re right that there’s no way to output a custom depth value from a surface shader. If this is on a non-visionOS platform, you could try drawing your SDF in a full-screen post-processing pass with ARView’s PostProcessContext, which would give you access to the depth buffer produced by the rest of your scene to blend with as needed.
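A minimal sketch of that approach, assuming a Metal compute kernel (the kernel name `sdfRaymarch` and the helper function are illustrative, not part of the API) that raymarches the SDF and depth-tests its hits against the scene’s depth texture:

```swift
import RealityKit
import Metal

// Sketch: hook RealityKit's post-process callback and hand the scene's
// color and depth textures to a Metal compute kernel that raymarches the SDF.
// In real code, build the pipeline state once and cache it rather than
// recreating it every frame.
func installSDFPass(on arView: ARView) {
    arView.renderCallbacks.postProcess = { context in
        guard
            let library = try? context.device.makeDefaultLibrary(bundle: .main),
            let kernel = library.makeFunction(name: "sdfRaymarch"),
            let pipeline = try? context.device.makeComputePipelineState(function: kernel),
            let encoder = context.commandBuffer.makeComputeCommandEncoder()
        else { return }

        encoder.setComputePipelineState(pipeline)
        // sourceDepthTexture holds the depth produced by the rest of the
        // scene, so the kernel can compare its own hit depths against it.
        encoder.setTexture(context.sourceColorTexture, index: 0)
        encoder.setTexture(context.sourceDepthTexture, index: 1)
        encoder.setTexture(context.targetColorTexture, index: 2)

        let w = pipeline.threadExecutionWidth
        let h = pipeline.maxTotalThreadsPerThreadgroup / w
        let threadsPerGroup = MTLSize(width: w, height: h, depth: 1)
        let grid = MTLSize(width: context.targetColorTexture.width,
                           height: context.targetColorTexture.height,
                           depth: 1)
        encoder.dispatchThreads(grid, threadsPerThreadgroup: threadsPerGroup)
        encoder.endEncoding()
    }
}
```

Inside the kernel you’d march the SDF per pixel, compare the hit distance against the scene depth sample, and write either your shaded result or the source color into the target texture.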
Most of those look normal—if the volume of logging is a problem for you, please file RealityKit feedback. The last one (makeRenderPipelineState failed) might indicate an actual error; it would be worth checking whether your AR content is successfully casting shadows on the real-world environment, and if it’s not, definitely file feedback about that 🙂
I don’t see anything obviously wrong with your code. Depending on the size of your scene and how things are positioned, it could be an issue with your projection matrix’s “far” distance being too close, which would cause anything beyond (currently) 10 units to get clipped away. Can you share a screenshot of what “80%” looks like? Does it change as you move the camera?
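If you’re letting RealityKit build the projection (i.e. a non-AR camera entity rather than a hand-rolled matrix), the clipping range lives on the camera component, so one quick thing to try is widening it — a sketch, with illustrative values:

```swift
import RealityKit

// Sketch: in a non-AR ARView, the camera's clipping range comes from
// PerspectiveCameraComponent. Anything beyond `far` (in scene units)
// is clipped away, so pushing it out rules that cause in or out.
let camera = PerspectiveCamera()
camera.camera.near = 0.1
camera.camera.far = 100   // try something well past your scene's extent
```

If the clipped region grows or shrinks as the camera moves, that’s a strong hint it’s the far plane rather than the geometry itself.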