[Newbie] Why does my ShaderGraphMaterial appear distorted?

Disclaimer: I am new to all things 3D. There could be a variety of things wrong with what I'm doing that are not unique to RealityKit. Any domain info would be appreciated.


So I'm following what I think are the recommended steps to import a shader-graph material from Reality Composer Pro and apply it to another ModelEntity.

I do the following:

// Load the scene that contains the material, then find the model that carries it.
guard let entity = try? Entity.load(named: "Materials", in: RealityKitContent.realityKitContentBundle) else { return model }

guard let materialEntity = entity.findEntity(named: "materialModel") as? ModelEntity else { return model }

I then configure a property on it like so:

// Pull the ShaderGraphMaterial off the source model and set its exposed parameter.
guard var material = materialEntity.model?.materials[0] as? ShaderGraphMaterial else { return model }

try material.setParameter(name: "BaseColor", value: .color(matModel.matCoreUIColor))

I then apply it.
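Concretely, "applying it" just means writing the modified material back onto the target entity's ModelComponent. Something along these lines, where model is the ModelEntity my function returns (the names are from my project, so treat this as a sketch):

// Replace the target entity's materials with the configured shader-graph material.
// "model" is the ModelEntity being returned from this function.
model.model?.materials = [material]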

This is what my texture looks like in Reality Composer Pro:

I notice that my rendered object has distortions in the actual RealityView. Note the diagonal lines that appear "stretched".

What could be causing this? I thought node-based shaders were supposed to be more resilient to this kind of distortion. I'm not sure if I've hit a bug or if I'm using it wrong.

FWIW, this is a shader based on Apple's felt material shader. My graph looks like this:

Thanks

Accepted Reply

So, it turns out that models have internal properties that dictate how image-based materials apply to them. You set these properties by "UV-unwrapping" the model, which defines which areas of the texture map to which surfaces of the model.

For one reason or another I thought these UV properties were part of the material itself, but that would mean you couldn't use a material on multiple models, and that would be silly.

I fixed up the UV mapping of my model with Blender's "Project from View (Bounds)" tool, and it now looks the way I expect.
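As a side note, you should be able to sanity-check whether a mesh has texture coordinates at all without leaving RealityKit by walking MeshResource.contents. This is an untested sketch, where model is the ModelEntity that receives the material in my original snippet:

// Walk the mesh and report whether each part carries UV (texture coordinate) data.
// If a part has no UVs, an image-based material has nothing to map against.
if let mesh = model.model?.mesh {
    for meshModel in mesh.contents.models {
        for part in meshModel.parts {
            if let uvs = part.textureCoordinates {
                print("Part \(part.id) has \(uvs.count) texture coordinates")
            } else {
                print("Part \(part.id) has no texture coordinates")
            }
        }
    }
}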

Replies

I've been looking into this more in Reality Composer Pro, and I'm seeing these distortions in RCP on the model that's programmatically receiving the texture.

I guess this could be a modeling issue, as the texture applies fine to the sphere in Reality Composer Pro.
