Reality Composer


Prototype and produce content for AR experiences using Reality Composer.

Reality Composer Documentation


Posts under Reality Composer tag

69 Posts
Post not yet marked as solved
0 Replies
342 Views
Hi, I'm working on an AR app. With Reality Composer and Xcode 14 I triggered custom behaviors with just myScene.notifications.myBox.post(), where myScene came from let myScene = try! Experience.loadBox(). Now in Xcode 15 I don't have an Experience; instead, with the .reality file I have to use entities, so: let objectAR = try! Entity.load(named: "myProject.reality"). How can I trigger my previously exported Reality Composer custom behavior from that?
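A possible workaround, resting on an assumption: the generated Experience code posted an internal "RealityKit.NotificationTrigger" notification, and posting it manually appears to fire the same behaviors. A minimal sketch, where "myBox" is the trigger name from the post and arView is assumed to exist:

    import Foundation
    import RealityKit

    let objectAR = try! Entity.load(named: "myProject.reality")
    let anchor = AnchorEntity()
    anchor.addChild(objectAR)
    arView.scene.addAnchor(anchor)

    // Undocumented: mirrors what the generated notifications.myBox.post()
    // did under the hood, so it may break in future releases.
    if let scene = objectAR.scene {
        NotificationCenter.default.post(
            name: Notification.Name("RealityKit.NotificationTrigger"),
            object: nil,
            userInfo: [
                "RealityKit.NotificationTrigger.Scene": scene,
                "RealityKit.NotificationTrigger.Identifier": "myBox"
            ]
        )
    }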
Posted. Last updated.
Post not yet marked as solved
1 Reply
509 Views
Hello, we're currently working on a project that was finished in Reality Composer, but then we noticed the pink (missing) material after changing scenes on iOS 17 devices. So we updated to macOS 14 and Xcode 15 beta to use Reality Composer Pro, and we're currently stuck on how to set up the animations and tap triggers to play an animation from the USDZ model in the scene. Once the animation finishes, it should trigger the next scene. In Reality Composer this was done through behaviors, with simple drag and drop. Now it seems we need to do it with components, which I don't mind, but I don't see many resources on how to set this up properly. Is there a way to do behaviors like in Reality Composer? Extra: is there a way to use alpha PNGs, or to drag PNGs into the scene like in Reality Composer?
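Reality Composer Pro has no behaviors panel in these betas, so the drag-and-drop trigger has to be rebuilt in code. A minimal sketch of the tap-to-play-then-advance pattern on visionOS; the scene name "Scene1", the RealityKitContent package, and advanceScene() are placeholder assumptions:

    import SwiftUI
    import RealityKit
    import RealityKitContent

    struct AnimatedSceneView: View {
        @State private var model = Entity()
        @State private var subscription: EventSubscription?

        var body: some View {
            RealityView { content in
                if let loaded = try? await Entity(named: "Scene1", in: realityKitContentBundle) {
                    model = loaded
                    // Make the entity tappable: input target plus a collision shape.
                    model.components.set(InputTargetComponent())
                    model.components.set(CollisionComponent(shapes: [.generateBox(size: [0.3, 0.3, 0.3])]))
                    content.add(model)
                    // Advance once the animation finishes.
                    subscription = content.subscribe(to: AnimationEvents.PlaybackCompleted.self, on: model) { _ in
                        // advanceScene()  // hypothetical: swap in the next scene here
                    }
                }
            }
            .gesture(TapGesture().targetedToEntity(model).onEnded { _ in
                if let animation = model.availableAnimations.first {
                    model.playAnimation(animation)
                }
            })
        }
    }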
Posted. Last updated.
Post not yet marked as solved
2 Replies
857 Views
With Xcode 11 through 14, Reality Composer for Mac has been included with Xcode. Apple's own documentation states: "You automatically get Reality Composer for macOS when you install Xcode 11 or later. The app is one of the developer tools bundled with Xcode. From the menu, choose Xcode > Open Developer Tool, and select Reality Composer." This is simply not the case with Xcode 15. In fact, we cannot even open existing .rcproject files such as the classic Experience.rcproject found in Apple's own sample code Creating a Game with Reality Composer. As an AR developer, this makes my current project virtually unworkable. It's my understanding that when Reality Composer Pro is finally released, it will only be compatible with visionOS apps, as opposed to iOS apps built with RealityKit; this was certainly the case with early iterations. Apple's Creation tools for spatial apps still mention Reality Composer, but it is only available for iOS and iPadOS. Taking away such a critical tool without notice is crippling, to say the least. Any official announcement on the future state of RealityKit development would be welcome.
Posted by Natooka. Last updated.
Post not yet marked as solved
0 Replies
309 Views
I'm interested in evaluating the physics capabilities of RealityKit and visionOS. I assumed that I could create entities, add the PhysicsBodyComponent, and "simulate" and tweak settings interactively, but that's not my experience. Is something like this possible with Beta 8?
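Interactive simulation and tweaking isn't exposed; physics has to be configured in code and observed at runtime. For reference, a minimal sketch of that setup (sizes and materials are arbitrary):

    import RealityKit

    // A dynamic 20 cm box that falls under gravity once it is added to a
    // scene that simulates physics (an ARView, or a RealityView with a
    // static floor entity for it to land on).
    let box = ModelEntity(mesh: .generateBox(size: 0.2),
                          materials: [SimpleMaterial(color: .red, isMetallic: false)])
    box.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
    box.components.set(PhysicsBodyComponent(massProperties: .default,
                                            material: .default,
                                            mode: .dynamic))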
Posted. Last updated.
Post marked as solved
2 Replies
517 Views
Hi, I have made a simple textured 3D planet in Reality Composer Pro, and the question is how I can visualize it in the Vision Pro simulator. I already have everything installed, and the simulator works fine, but I don't know how to take what I make in Reality Composer Pro and run it in the Vision Pro simulator. Any comment will be very appreciated. Thank you.
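The usual route, which the visionOS app template sets up automatically: the Reality Composer Pro project is added to the app as a Swift package, and the scene is loaded from that package's bundle inside a RealityView. A minimal sketch, assuming the default package name RealityKitContent and the default scene name "Scene"; run the app scheme against the Apple Vision Pro simulator destination:

    import SwiftUI
    import RealityKit
    import RealityKitContent  // the Reality Composer Pro package target

    struct PlanetView: View {
        var body: some View {
            RealityView { content in
                // "Scene" is the default scene name in a new RCP project.
                if let planet = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                    content.add(planet)
                }
            }
        }
    }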
Posted by mauroad. Last updated.
Post not yet marked as solved
3 Replies
1.5k Views
Hello, I created a simple cube in Blender with two animations: one moves the cube up and down, and the second rotates it in place. I exported this file in glb format and tried to convert it using Reality Converter, but unfortunately I can only see one animation. Is this a limitation of Reality Converter? Can I include more than one animation? The original glb file has both animations inside; as you can see from the screenshot, I checked the file using an online glb viewer and there is no problem, both animations are there. The converter unfortunately sees only the last one created. Any reason or explanation? I believe this is a limitation of Reality Converter. Regards
Posted by dm1886. Last updated.
Post not yet marked as solved
1 Reply
701 Views
Hi! While waiting for scene understanding to work in the visionOS simulator, I am trying to bounce a virtual object off a real-world wall or other real-world object on iOS ;). I load my virtual objects from a Reality Composer file where I set them to participate in physics with the dynamic motion type. With this I am able to have them collide with each other nicely, and occlusion also works, but they go right through walls and other real-world objects rather than bouncing off... I've tried a couple of variations of the following code:

    func makeUIView(context: Context) -> ARGameView {
        // Let the reconstructed scene mesh occlude and collide with virtual content.
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.environment.sceneUnderstanding.options.insert(.physics)
        arView.debugOptions.insert(.showSceneUnderstanding)

        // Configure the session manually to enable scene reconstruction.
        arView.automaticallyConfigureSession = false
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        config.sceneReconstruction = .meshWithClassification
        arView.session.run(config)

        if let myScene = try? Experience.loadMyScene() {
            ...
            arView.scene.anchors.append(myScene)
        }
        return arView
    }

I have found several references that this should "just work", e.g. in https://developer.apple.com/videos/play/tech-talks/609. What am I missing? Testing on an iPhone 13 Pro Max with iOS 16.5.1 🤔
Posted by RK123. Last updated.
Post not yet marked as solved
0 Replies
459 Views
I'm trying to make a material that has flecks of glitter in it. The main technique I've found to achieve this effect is to use a Voronoi diagram as a normal map, with various amounts of embellishment on top. The shader graph editor has a Worley noise node, which is related but produces the "spider web" version of a Voronoi diagram instead of flat polygons of consistent color. Is there a trick for converting this Worley texture into a vanilla Voronoi diagram, or am I missing something else obvious? Or is what I want not currently possible?
Posted by fzndvl. Last updated.
Post not yet marked as solved
1 Reply
761 Views
In Reality Composer Pro, I've set up a Custom Material that receives an Image File as an input. When I manually select an image and upload it to Reality Composer Pro as the input value, I can easily drive the surface of my object/scene with this image. However, I am unable to drive the value of this "cover" parameter via shaderGraphMaterial.setParameter(name:, value:) in Swift, since there seems to be no way to supply an image as a value of type MaterialParameters.Value. When I print out shaderGraphMaterial.parameterNames I see both "color" and "cover", so I know this parameter is exposed. Is this a feature that will be supported soon, or is there a workaround? I assume that if something can be created as an input to a Custom Material (in this case an Image File), there should be an equivalent way to drive it via Swift. Thanks!
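One thing worth checking, offered tentatively since the betas move quickly: MaterialParameters.Value has a .textureResource case, so the image can be supplied as a TextureResource rather than as an image directly. A minimal sketch with placeholder names:

    import RealityKit

    // Assumes the shader graph input is named "cover" and "coverImage"
    // is an image in the app bundle; runs in an async context.
    func applyCover(to modelEntity: ModelEntity) async throws {
        guard var material = modelEntity.model?.materials.first as? ShaderGraphMaterial else { return }
        let texture = try await TextureResource(named: "coverImage")
        try material.setParameter(name: "cover", value: .textureResource(texture))
        modelEntity.model?.materials = [material]
    }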
Posted by omkar1799. Last updated.
Post not yet marked as solved
3 Replies
1.4k Views
I'm just starting to think about diving into augmented reality. Any suggestions on where I should start? For example, are there any good tutorials or white papers out there to get started with the basics and crank out some MVPs?
Posted. Last updated.
Post not yet marked as solved
1 Reply
667 Views
When I create a USDZ file from the original Reality Composer (non-Pro) and view it in the visionOS simulator, the transforms and rotations don't look the same. For example, a simple Tap and Flip behavior does not rotate the same way in visionOS. Should we regard Reality Composer as discontinued software and only work with Reality Composer Pro? Hopefully Apple will combine the features from the original RC into the new RC Pro!
Posted by Mstel. Last updated.
Post not yet marked as solved
0 Replies
490 Views
What would be the best way to go about recognizing a 3D physical object, then anchoring digital 3D assets to it? I would also like to use occlusion shaders and masks on the assets. There's a lot of info out there, but the current best practices keep changing and I'd like to start in the right direction! If there is a tutorial or demo file that someone can point me to, that would be great!
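On iOS the established route is ARKit object detection with a scanned .arobject reference. A minimal sketch, assuming a reference object named "Sculpture" in an "AR Resources" asset catalog group (all names are placeholders, and arView is assumed to exist):

    import ARKit
    import RealityKit

    // Detect the scanned physical object...
    let config = ARWorldTrackingConfiguration()
    config.detectionObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "AR Resources", bundle: nil) ?? []
    arView.session.run(config)

    // ...and anchor virtual content to it once it is found.
    let anchor = AnchorEntity(.object(group: "AR Resources", name: "Sculpture"))
    if let overlay = try? Entity.load(named: "overlayModel") {  // hypothetical asset
        anchor.addChild(overlay)
    }
    arView.scene.addAnchor(anchor)

    // Let real-world geometry occlude the virtual assets.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)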
Posted by Gnarboss. Last updated.
Post not yet marked as solved
0 Replies
472 Views
Just trying to figure out how to make an occlusion mask in Reality Composer or Reality Converter. I use Blender to make my assets and want to be able to make something similar to a portal, which would require an occlusion mask.
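If code is an option alongside those tools, RealityKit's OcclusionMaterial is the usual building block: a mesh that renders nothing itself but hides whatever is behind it. A minimal sketch (mesh and placement are placeholders):

    import RealityKit

    // An invisible plane that masks out anything rendered behind it,
    // the usual trick for portal-style effects.
    let maskMesh = MeshResource.generatePlane(width: 2, height: 2)
    let mask = ModelEntity(mesh: maskMesh, materials: [OcclusionMaterial()])
    // Surround the portal opening with the mask so the scene "inside"
    // is only visible through the gap left in it.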
Posted by Gnarboss. Last updated.
Post not yet marked as solved
1 Reply
664 Views
Hi everyone, I'm having difficulty detecting custom components from Reality Composer in my xrOS app using QueryPredicate(where:). I've got it working with a PhysicsBodyComponent, so I know it's my custom component that's not being recognized somehow. I'd also like to note all the issues my Intel-based iMac is having with Reality Composer, like crashing when adding a particle emitter, and visionOS crashing when I download textures to use. So I'm constantly paranoid that it's a system-level problem. Anyway, thanks in advance for any help. Cheers, Noah
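One common cause, offered as a guess: custom components defined in a Reality Composer Pro package have to be registered before the scene loads, or queries won't match them. A minimal sketch where MyCustomComponent stands in for the real component:

    import RealityKit

    // Register once, early (e.g. in the App initializer), before any
    // scene content is loaded.
    MyCustomComponent.registerComponent()

    // Later, once content is loaded, query entities carrying it.
    let query = EntityQuery(where: .has(MyCustomComponent.self))
    if let scene = rootEntity.scene {  // rootEntity: any loaded entity
        scene.performQuery(query).forEach { entity in
            print("Found:", entity.name)
        }
    }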
Posted. Last updated.