Hey, I am having issues getting MaterialX shaders that I've authored in Houdini to work properly in Reality Composer Pro.
The shader is very simple: it starts with a tiled image node whose output is connected to the diffuse color of the preview surface node. This node is called mtxltileimage2.
When I create a tiled image node directly in RCP and configure it with the same parameter values, the texture shows up correctly. This node is called TiledImage.
One difference I can identify is that the second node has a grey icon whereas the first node has a blue icon. Could this be related to the issue?
Here is the USD viewer output for the two variants of the tiled image node.
Any pointers, corrections of misconceptions, and help would be greatly appreciated. My goal is to be able to author these shaders in Houdini and import them into RCP. I'm trying to figure out the right pipeline for this workflow.
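For context, this is roughly how I expect to consume the exported material at runtime once the import works. A minimal sketch: the file name "WallShader.usda" and the material path "/Root/WallMaterial" are placeholders for my actual asset.

import RealityKit

// Minimal sketch: load a MaterialX-based material from a USD file exported from Houdini.
// File name and material prim path are placeholders.
func loadWallMaterial() async throws -> ShaderGraphMaterial {
    try await ShaderGraphMaterial(named: "/Root/WallMaterial",
                                  from: "WallShader.usda",
                                  in: Bundle.main)
}

// Apply the loaded material to a model entity.
func applyWallMaterial(to entity: ModelEntity) async throws {
    let material = try await loadWallMaterial()
    entity.model?.materials = [material]
}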
Given that one can add custom components and expose them via RCP, how do I go about implementing my components/system so that when I make a parameter change, it gets applied to the entity in the RCP viewport?
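For reference, this is the kind of component/system pairing I have in mind. A minimal sketch: the component name SpinComponent and its speed parameter are placeholders.

import RealityKit

// Placeholder component whose parameter I'd like to expose and edit in RCP.
struct SpinComponent: Component, Codable {
    var speed: Float = 1.0
}

// System that reads the component's parameter every frame.
struct SpinSystem: System {
    static let query = EntityQuery(where: .has(SpinComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            // Rotate around the y axis at the speed stored in the component.
            entity.transform.rotation *= simd_quatf(angle: spin.speed * Float(context.deltaTime),
                                                    axis: [0, 1, 0])
        }
    }
}

// Registration, e.g. at app launch:
// SpinComponent.registerComponent()
// SpinSystem.registerSystem()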
When I call queryDeviceAnchor in my Billboard system (similar to the one in the Diorama sample app), I get transform updates, but I'm unsure how to process them.
Is it a bug that I receive these updates? The documentation says that ARKit data is only provided in a Full Space, so I would expect this not to work at all.
But if that is the case, why am I getting deviceAnchor values in this situation?
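For context, this is roughly where I'm calling it and what I've tried so far. A minimal sketch: the ARKitSession that runs the provider is managed elsewhere in my app, and the BillboardComponent is a simple marker component.

import ARKit
import RealityKit
import QuartzCore

// Marker component for entities that should face the device.
struct BillboardComponent: Component, Codable {}

struct BillboardSystem: System {
    static let query = EntityQuery(where: .has(BillboardComponent.self))

    // Assumes the ARKitSession running this provider is started elsewhere in the app.
    static let worldTracking = WorldTrackingProvider()

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        // Query the device pose for the current frame time.
        guard let deviceAnchor = Self.worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else { return }
        let devicePosition = deviceAnchor.originFromAnchorTransform.columns.3

        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            // Orient each billboarded entity toward the device position.
            entity.look(at: [devicePosition.x, devicePosition.y, devicePosition.z],
                        from: entity.position(relativeTo: nil),
                        relativeTo: nil)
        }
    }
}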
It appears that custom fonts don't work in SwiftUI ViewAttachments on visionOS. I can confirm that the fonts work in other parts of the app that aren't inside attachments.
Is this a known bug, or is there a fundamental misunderstanding on my part?
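For reference, this is roughly how I'm applying the font inside an attachment. A minimal sketch: the font name is a placeholder.

import SwiftUI
import RealityKit

struct FontDebugView: View {
    var body: some View {
        RealityView { content, attachments in
            // Add the SwiftUI attachment entity to the scene.
            if let label = attachments.entity(for: "label") {
                content.add(label)
            }
        } attachments: {
            // The same custom font renders fine outside of attachments.
            Text("Hello from an attachment")
                .font(.custom("MyCustomFont-Regular", size: 36)) // placeholder font name
                .tag("label")
        }
    }
}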
In the screenshot I've attached below, I would expect the blue box to be perpendicular to the floor. It is the yAxisEntity in my code, which I instantiate with a mesh of height 3. Instead, it runs parallel to the floor, which is what I'd expect of the z axis.
Here is my code:
import SwiftUI
import RealityKit

struct ImmerisveContentDebugView: View {
    @Environment(ViewModel.self) private var model

    // Anchor that attaches content to a detected vertical wall plane.
    @State var wallAnchor: AnchorEntity = {
        return AnchorEntity(.plane(.vertical, classification: .wall, minimumBounds: SIMD2<Float>(0.1, 0.1)))
    }()

    // Orange sphere marking the anchor origin.
    @State var originEntity: Entity = {
        let originMesh = MeshResource.generateSphere(radius: 0.2)
        return ModelEntity(mesh: originMesh, materials: [SimpleMaterial(color: .orange, isMetallic: false)])
    }()

    // Red box stretched along the x axis.
    @State var xAxisEntity: Entity = {
        let line = MeshResource.generateBox(width: 3, height: 0.1, depth: 0.1)
        return ModelEntity(mesh: line, materials: [SimpleMaterial(color: .red, isMetallic: false)])
    }()

    // Blue box stretched along the y axis.
    @State var yAxisEntity: Entity = {
        let line = MeshResource.generateBox(width: 0.1, height: 3, depth: 0.1)
        return ModelEntity(mesh: line, materials: [SimpleMaterial(color: .blue, isMetallic: false)])
    }()

    // Green box stretched along the z axis.
    @State var zAxisEntity: Entity = {
        let line = MeshResource.generateBox(width: 0.1, height: 0.1, depth: 3)
        return ModelEntity(mesh: line, materials: [SimpleMaterial(color: .green, isMetallic: false)])
    }()

    var body: some View {
        RealityView { content in
            content.add(wallAnchor)
            wallAnchor.addChild(originEntity)
            wallAnchor.addChild(xAxisEntity)
            wallAnchor.addChild(yAxisEntity)
            wallAnchor.addChild(zAxisEntity)
        }
    }
}
And here is what the simulator renders.
I have an app that launches into an immersive space with a mixed immersion style.
It appears that the RealityView has bounds that resemble a window. I would expect the bounds not to exist because it's an ImmersiveSpace.
Why do they exist, and how can I remove them?
This is the entire code:
import SwiftUI
import RealityKit

@main
struct RealityKitDebugViewApp: App {
    var body: some Scene {
        // The app launches directly into an immersive space.
        ImmersiveSpace {
            ContentView()
        }
    }
}

struct ContentView: View {
    @State var logMessages = [String]()

    var body: some View {
        RealityView { content, attachments in
            let root = Entity()
            content.add(root)

            // Look up the SwiftUI attachment by its tag and parent it to the root entity.
            guard let debugView = attachments.entity(for: "entity_debug_view") else { return }
            debugView.position = [0, 0, 0]
            root.addChild(debugView)
        } update: { content, attachments in
        } attachments: {
            Color.blue
                .tag("entity_debug_view")
        }
        .onAppear(perform: {
            self.logMessages.append("Hello World")
        })
    }
}
As the title says. I don't believe this was covered in the Unity sessions and I can't find documentation.