Posts

Post not yet marked as solved
2 Replies
794 Views
First of all, I apologize for such a general question, but my code is far too long to post. However, I have narrowed down my problems and am seeking advice for any solutions or workarounds that may help.

I recently updated my physics simulation code from using Metal Performance Shaders for ray tracing to using the (fairly) new Metal ray tracing routines directly. A few notes on my program: I perform the ray tracing entirely in a separate thread using compute kernels -- there is no rendering based on the ray tracing results. The compute kernels are called repeatedly, and each call ends with a waitUntilCompleted() command. The main thread runs a SwiftUI interface with some renderings that use RealityKit to display the scene (i.e., the vertices) and the rays that traverse the scene via Metal ray tracing. This is purely a Mac program and has no iOS support.

The problem is that there seems to be some conflict between the RealityKit rendering and the ray tracing compute kernels: I get a "GPU Soft Fault" when I run the intersect command in Metal, and after this soft fault my ray tracing results are completely bogus. I have figured out a workaround, which is to refit my acceleration structures semi-regularly, but this solution is inelegant and probably not sustainable. The problem gets worse the more I render in my RealityKit display UI (rendered as a SwiftUI view), so I am now confident that the problem is some "collision" between the GPU resources needed by my program and by RealityKit. I have not been able to find any information on what a "GPU Soft Fault" actually is, although I suspect it is a memory violation.

I suspect that I need to use fences to cordon off my ray tracing compute kernel from other things that use Metal (i.e., RealityKit); however, I am just not sure whether this is the case or how to accomplish it. Again, I apologize for the vague question, but I am really stuck. I have confirmed that every Metal buffer I pass to my compute kernel is correct. I did this confirmation by making my object a simple cube and having only one instance of that cube. Something happens to either corrupt the acceleration structure data or make it inaccessible during certain times when RealityKit needs to use the GPU.

Any advice would be appreciated. I have not submitted a bug report since I am still not sure whether this is just my lack of advanced knowledge of multiple actors requiring GPU use or whether there is something more serious here. Thanks in advance, -Matt
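For reference, here is a minimal sketch of the refit workaround described above; it is not the actual simulation code. All names are placeholders, and it assumes the acceleration structure was originally built from the same descriptor with the .refit usage option set.

import Metal

// Minimal sketch: refit an existing acceleration structure in place.
// Assumes the structure was built from `descriptor` with usage = .refit.
func refitInPlace(_ accel: MTLAccelerationStructure,
                  descriptor: MTLPrimitiveAccelerationStructureDescriptor,
                  device: MTLDevice,
                  queue: MTLCommandQueue) {
    let sizes = device.accelerationStructureSizes(descriptor: descriptor)
    guard let scratch = device.makeBuffer(length: sizes.refitScratchBufferSize,
                                          options: .storageModePrivate),
          let commandBuffer = queue.makeCommandBuffer(),
          let encoder = commandBuffer.makeAccelerationStructureCommandEncoder()
    else { return }

    // Passing nil as the destination refits the source structure in place.
    encoder.refit(sourceAccelerationStructure: accel,
                  descriptor: descriptor,
                  destinationAccelerationStructure: nil,
                  scratchBuffer: scratch,
                  scratchBufferOffset: 0)
    encoder.endEncoding()
    commandBuffer.commit()
    commandBuffer.waitUntilCompleted()
}

This is only the stopgap described in the post, not a fix for the underlying resource conflict.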
Posted by rad.bobby.
Post not yet marked as solved
1 Reply
317 Views
The Swift documentation states that when a == b, then a.hashValue == b.hashValue (but the opposite is not necessarily true). However, is there ever a case where the first statement isn't true? To make the argument concrete, I have been considering Foundation.Measurement. It is clear that:

import Foundation

let a = Measurement(value: 1.0, unit: UnitLength.meters)
let b = Measurement(value: 100.0, unit: UnitLength.centimeters)

a == b   // true, because obviously 100 cm is equal to 1 meter

However, it is less clear to me that the following should hold:

a.hashValue == b.hashValue   // true -- this really is TRUE
a.value                      // 1.0
b.value                      // 100.0

So this seems like a case where a and b should be equal but have different hash values, because they have different units. I am running into this because I am finding Measurement completely useless in SwiftUI when equality holds and two different (but equal) values have different units. I want to know people's thoughts. I am considering posting something in the Swift recommendations for changes, but I don't know enough about this to be sure that it isn't an already-discussed problem. -Matt
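For reference, a minimal sketch of how a wrapper type can keep the "a == b implies a.hashValue == b.hashValue" contract across units by hashing a normalized value. The Distance type and the conversion to meters are purely illustrative assumptions, not how Foundation actually implements Measurement's hashing:

import Foundation

// Illustrative wrapper: hashing the value converted to a base unit makes
// 1 m and 100 cm hash identically, consistent with them comparing equal.
struct Distance: Hashable {
    var measurement: Measurement<UnitLength>

    static func == (lhs: Distance, rhs: Distance) -> Bool {
        lhs.measurement == rhs.measurement
    }

    func hash(into hasher: inout Hasher) {
        hasher.combine(measurement.converted(to: .meters).value)
    }
}

Whether SwiftUI should additionally distinguish equal values with different units is the separate question raised above.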
Posted by rad.bobby.
Post not yet marked as solved
3 Replies
632 Views
I am creating a RealityKit scene that will contain over 12,000 duplicate cubes arranged in a circle (see image below). This is for some high-energy physics simulations I am doing. I build this scene by creating a single cube and cloning it many times, so there is a single MeshResource and Material even though there are a lot of entities. I have confirmed this by checking with Swift's === operator. Even so, the program is unworkably slow. Any suggestions or tricks that could help with this type of scene? Using a single geometry was the trick to getting SceneKit to work fast with scenes like this. I've been updating my software to RealityKit because I far prefer the structure of RealityKit over SceneKit.
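For context, the setup described above would look roughly like the sketch below. The sizes, radius, and material are placeholder values rather than the actual simulation code; the point is that every entity returned by clone(recursive:) shares the prototype's MeshResource and material.

import Foundation
import RealityKit

// Rough sketch: one prototype cube cloned around a circle, so all 12,000
// entities share a single MeshResource and a single material.
let mesh = MeshResource.generateBox(size: 0.01)                 // placeholder size
let material = SimpleMaterial(color: .gray, isMetallic: false)  // placeholder material
let prototype = ModelEntity(mesh: mesh, materials: [material])

let root = AnchorEntity()
let count = 12_000
for i in 0..<count {
    let angle = Float(i) / Float(count) * 2.0 * .pi
    let copy = prototype.clone(recursive: false)
    copy.position = SIMD3<Float>(cos(angle), 0, sin(angle)) * 0.5   // 0.5 m radius
    root.addChild(copy)
}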
Posted by rad.bobby.
Post not yet marked as solved
1 Reply
453 Views
I am creating a ModelEntity with a programmatic mesh:

var mat = PhysicallyBasedMaterial()
mat.baseColor = .init(tint: .lightGray)
mat.roughness = 0.0
mat.metallic = 1.0

let newEnt = ModelEntity(mesh: try! .generate(from: [hex.generateMeshDescriptor()]), materials: [mat])
stages.worldStage.addChild(newEnt)

where hex.generateMeshDescriptor() generates the custom mesh. On the screen, everything looks great; the material looks exactly like I expect it to. However, a simple print shows that my ModelEntity has a single material in its list and it's a SimpleMaterial, not a PhysicallyBasedMaterial. This causes my program to crash, since I use PhysicallyBasedMaterial's emissiveColor and emissiveIntensity later on to highlight the objects.

print(newEnt.components[ModelComponent.self]!.materials)

shows

[RealityKit.SimpleMaterial(__resource: RealityKit.__MaterialResource, ... ...]

I even try to force the material to be a PhysicallyBasedMaterial and it doesn't stick; it's always somehow converted to a SimpleMaterial. I figured I would ask on the forums before I submit a bug report in case there is something basic that I am not understanding.

macOS Monterey Beta 8. iMac Pro. Xcode 13 Beta 5. Thanks in advance. -Matt
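For context, the later highlighting code that crashes presumably does something along the lines of this hypothetical helper (the function and names are illustrative, not the actual code); with the bug present, the stored material comes back as a SimpleMaterial, so the cast fails or a force-cast would crash.

import RealityKit

// Hypothetical highlight helper: this only works if the stored material
// really is a PhysicallyBasedMaterial.
func highlight(_ entity: ModelEntity) {
    guard var pbr = entity.model?.materials.first as? PhysicallyBasedMaterial else {
        return   // with the bug described above, we end up here instead
    }
    pbr.emissiveColor = .init(color: .yellow)
    pbr.emissiveIntensity = 2.0
    entity.model?.materials = [pbr]
}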
Posted by rad.bobby.
Post not yet marked as solved
0 Replies
375 Views
I cannot tell whether I have found a bug in simdRotate() or whether I simply don't understand pivots very well. When I have a node and I change the pivot position (or angle) of that node, the routine SCNNode.simdRotate(by: simd_quatf, aroundTarget: simd_float3) doesn't seem to work the way I would expect. The following Mac Playground illustrates my problem/question:

import SceneKit
import PlaygroundSupport

// Make a new SCNView that responds to mouse events
class TstView: SCNView {
    var cone: SCNNode? = nil

    override func mouseDown(with event: NSEvent) {
        print("\(event)")
        cone?.simdRotate(by: simd_quatf(angle: 0.0, axis: SIMD3<Float>(0.0, 1.0, 0.0)),
                         aroundTarget: simd_float3(0.0, 0.0, 0.0))
    }
}

// Simple cone
let geo = SCNCone(topRadius: 3.0, bottomRadius: 0.0, height: 5.0)
let cone = SCNNode(geometry: geo)

// Make the pivot point the apex of the cone
cone.simdPivot = simd_float4x4(SCNMatrix4MakeTranslation(0.0, -2.5, 0.0))

// Make a sphere to show the 0,0,0 point
let sphere = SCNSphere(radius: 0.2)
let centerNode = SCNNode(geometry: sphere)

let scene = SCNScene()

// Add my nodes
scene.rootNode.addChildNode(cone)
scene.rootNode.addChildNode(centerNode)

// Starting point for the cone
cone.simdPosition = SIMD3<Float>(-15.0, 0.0, 0.0)

let view = TstView(frame: CGRect(x: 10.0, y: 10.0, width: 800.0, height: 800.0))

// Make the cone accessible in TstView
view.cone = cone
view.autoenablesDefaultLighting = true
view.backgroundColor = NSColor.cyan
view.scene = scene

// Start a live preview of that view
PlaygroundPage.current.liveView = view

When you click the mouse on the liveView, the cone starts walking forward!? Even though my angle of rotation is 0.0 radians. If you comment out the cone.simdPivot = ... line, then it works as I expect. Am I not understanding something here? Even with a modified pivot, I would think a rotation of 0 shouldn't change anything.

Thanks for your help, -Matt
Posted by rad.bobby.
Post marked as solved
1 Reply
711 Views
I am trying to create ModelIO meshes from SceneKit data. As a simple example, I see in the documentation for MDLMesh:

"Objects from SceneKit
init(scnGeometry: SCNGeometry, bufferAllocator: MDLMeshBufferAllocator?)
Creates a mesh from the specified SceneKit geometry, using the specified allocator.
init(scnGeometry: SCNGeometry)
Creates a mesh from the specified SceneKit geometry."

So I am at a loss as to why the following code doesn't work:

import Foundation
import ModelIO
import SceneKit
import MetalKit

class WhyNotWorking {
    init() {
        let box = SCNBox(width: 10.0, height: 10.0, length: 10.0, chamferRadius: 0.0)
        let mesh = MDLMesh(scnGeometry: box, bufferAllocator: nil)
    }
}

This gives me an error:

Expression type 'MDLMesh' is ambiguous without more context.

Looking into the MDLMesh.h file, I don't see any reference to the initializer noted in the documentation. Am I missing something here?
Posted by rad.bobby.
Post not yet marked as solved
1 Reply
944 Views
First, some background. I am running some physics simulations where I need to add physical properties (not display properties) to SCNNodes that contain the elemental material, e.g., Pb (lead), H2O (water), Al2O3 (aluminum oxide), etc. It looks to me like init(geometry: SCNGeometry?) is a designated initializer because I don't see the word convenience in front of it. So I thought this would be easy:

class PhysicalObject: SCNNode {
    var physicalMaterial: Material

    public init(geometry inGeom: SCNGeometry?, withMaterial: Material) {
        physicalMaterial = withMaterial
        super.init(geometry: inGeom)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}

This tells me that I "Must call a designated initializer of the superclass 'SCNNode'". I don't know why this is the case, since I don't see the word convenience in front of that SCNNode initializer.

Question 1: Why doesn't the above work? I am assuming I am wrong about init(geometry:) being a designated initializer, but then how do we tell from Apple's documentation?

So I found a workaround: I simply called super.init() and set the geometry manually. Not pretty, but it works. My only issue now is that I sometimes want to create a PhysicalObject from an SCNNode via

public init(nodeToCopy: SCNNode, withMaterial: Material) { ... }

For example, I might read in a .scn file and look for node names like "Object Pb 8.2". The "Object" tells me that this node should be a PhysicalObject, the "Pb" tells me the material is lead, and "8.2" is the density.

Question 2: How can I possibly initialize a PhysicalObject from an SCNNode when I am stuck calling the SCNNode.init() routine? I would have to copy all the important properties of the node over... geometry, transforms, children, etc. I am at a total loss as to how to accomplish this. When I initialize the scene with

let scene = SCNScene(named: "art.scnassets/input.scn")

it creates a hierarchy of SCNNodes, and I need to figure out how to convert some of them to PhysicalObjects. The clone() method isn't an option since it clones to an SCNNode, so I'm stuck at the same problem of how to convert an SCNNode to a PhysicalObject.

Thanks in advance to all the experts for your assistance, -Matt
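To make Question 2 concrete, the property-copying approach would look roughly like the sketch below. Material here is a placeholder standing in for the simulation's own physics type, and the set of copied properties (geometry, transform, name, children) is only illustrative; it is exactly the tedium the question asks how to avoid.

import SceneKit

// Placeholder for the simulation's own physics material type.
struct Material {
    var symbol: String
    var density: Double
}

class PhysicalObject: SCNNode {
    var physicalMaterial: Material

    // Copy the interesting properties of an existing node by hand.
    init(nodeToCopy: SCNNode, withMaterial material: Material) {
        physicalMaterial = material
        super.init()                              // a designated SCNNode initializer
        geometry = nodeToCopy.geometry            // geometry is shared, as with clone()
        simdTransform = nodeToCopy.simdTransform
        name = nodeToCopy.name
        for child in nodeToCopy.childNodes {
            addChildNode(child.clone())           // children are copied as plain SCNNodes
        }
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}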
Posted by rad.bobby.