Posts

Post not yet marked as solved
5 Replies
2.8k Views
I wanted to make a very simple AR viewer based on Reality Composer and RealityKit, following Apple's video "Building Apps with RealityKit" (WWDC session 605). I started with an Xcode project containing the basic Box AR scene anchored to the floor; it has a Reality Composer project loaded with a box. That part is working fine. I modified the RC project and added my own USDZ asset.

What I would like to know is just the super basics:
- How to register a tap gesture to trigger my Reality Composer setup (run an animation)
- How to modify a material, e.g. changing its color
- How to add an OcclusionMaterial to an entity

So far I got to this point by watching Apple's video:
- I cannot seem to get a printout of where I tap on screen. I thought it would be fairly simple.
- I cannot seem to see my object with the occlusion material. I mean, it is probably already in the scene, but I am not sure where exactly it is anchored.

What exactly is wrong with my code?

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // Load the "Box" scene from the "Experience" Reality file
        let boxAnchor = try! Experience.loadBox()

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)

        // Occlusion plane
        let planeMesh = MeshResource.generatePlane(width: 2.5, depth: 2.5)
        //let planeMesh = MeshResource.generateBox(size: 2.0)
        let material = OcclusionMaterial()
        let occlusionPlane = ModelEntity(mesh: planeMesh, materials: [material])
        occlusionPlane.position.y = 1.5

        // Add to anchor
        boxAnchor.addChild(occlusionPlane)
    }

    // Hit testing
    @IBAction func onTap(_ sender: UITapGestureRecognizer) {
        let tapLocation = sender.location(in: arView)
        if let entity = arView.entity(at: tapLocation) {
            print(entity.name)
        }
    }
}
```
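A likely reason the tap handler prints nothing, assuming a standard Reality Composer "Experience" file: `ARView.entity(at:)` hit-tests against collision shapes, and entities loaded from a Reality Composer scene have no `CollisionComponent` unless physics/collision was enabled. A minimal sketch of that fix (the gesture is registered in code here, so no storyboard wiring is assumed; `TapViewController` and `handleTap` are hypothetical names):

```swift
import UIKit
import RealityKit

class TapViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        let boxAnchor = try! Experience.loadBox()
        arView.scene.anchors.append(boxAnchor)

        // entity(at:) only finds entities that carry a CollisionComponent,
        // so generate simple collision shapes for the whole hierarchy
        // loaded from Reality Composer.
        boxAnchor.generateCollisionShapes(recursive: true)

        // Register the tap gesture in code instead of the storyboard.
        let tap = UITapGestureRecognizer(target: self,
                                         action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc func handleTap(_ sender: UITapGestureRecognizer) {
        let tapLocation = sender.location(in: arView)
        if let entity = arView.entity(at: tapLocation) {
            print("Tapped:", entity.name)
        }
    }
}
```

As for the invisible occlusion plane: that is expected behavior, since `OcclusionMaterial` renders nothing itself and only hides virtual content behind it; at `y = 1.5` it sits 1.5 m above the box anchor.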
Post not yet marked as solved
1 Replies
1.5k Views
Hi all, I am testing USD (Universal Scene Description) export from Xcode, hoping to export a SceneKit scene into something that works inside Reality Composer. I thought it would be straightforward, but not quite.

1. I started in Xcode with the default ship.scn and simply added a bunch of random geometry. With the box, I was actually hoping to have AR with a special occlusion shader... but never mind that.

2. I exported as USD from Xcode and converted the USD file into USDZ using usdzip. I got the ship, but without its texture. Digging further into the USD file, I realized I might need the image texture file I had specified inside Xcode, so I copied the image into a folder whose name matches the path and ran usdzip again on the same file. Now I get the material with proper textures! Ideally, I would expect Xcode to export USD with all the textures included, but never mind, this can be an advantage. Below is how it looks in AR Quick Look on macOS (the texture seems to work).

NOTE: Fancy shaders do not seem to transfer with USD. I guess that does not matter for now; I am hoping that in the future Apple makes it clear whether an occlusion shader or other custom shaders (like glowing particles) are allowed inside Reality Composer.

3. I tested the resulting USDZ inside Reality Composer and also on iOS. On iOS, the ship apparently does not have the textures... same with Reality Composer.

At this stage, I think I found a bug. iOS 13 beta 3 still translates this incorrectly. Or maybe it is an Xcode or usdzip bug?
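For reference, SceneKit can also write a scene straight to .usdz via `SCNScene.write(to:options:delegate:progressHandler:)`, which may sidestep the manual Xcode-export-plus-usdzip round trip described above. A minimal macOS sketch (the ship.scn and output paths are hypothetical, and whether textures survive may still depend on the OS version):

```swift
import SceneKit

// Load the SceneKit scene to convert (hypothetical path).
let scene = try SCNScene(url: URL(fileURLWithPath: "ship.scn"), options: nil)

// Writing to a URL with a .usdz extension asks SceneKit to package
// the geometry and its textures into a single USDZ archive.
let destination = URL(fileURLWithPath: "ship.usdz")
let success = scene.write(to: destination,
                          options: nil,
                          delegate: nil,
                          progressHandler: nil)
print(success ? "Exported ship.usdz" : "Export failed")
```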
Post not yet marked as solved
0 Replies
432 Views
If I have a USDZ asset for Reality Composer, say a piece of furniture whose material comes from the USDZ, and I want the ability to control its color etc., is that possible via Reality Composer alone, or do I need RealityKit?
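As far as I know, Reality Composer does not expose the materials of an imported USDZ for editing, so changing the color at runtime needs RealityKit. A minimal sketch, assuming a hypothetical chair.usdz bundled with the app as a single model entity:

```swift
import RealityKit
import UIKit

// Load the USDZ as a ModelEntity (hypothetical asset name "chair").
let chair = try ModelEntity.loadModel(named: "chair")

// Replace every material on the model with a simple tinted one.
// Note: this discards the original USDZ textures.
let tinted = SimpleMaterial(color: .red, isMetallic: false)
chair.model?.materials = chair.model?.materials.map { _ in tinted } ?? [tinted]
```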