Reality Composer Pro

Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.

Posts under Reality Composer Pro tag

200 Posts

How to trigger actions by OnCollision in Behaviors Component
It's all about using notifications to trigger actions from RCP's new Timeline system. After watching "Compose interactive 3D content in Reality Composer Pro," I'm starting to wonder why Entity.applyTapForBehaviors has to be called in code to trigger content in a Behaviors Component at all, given that in the Behaviors Component we already chose OnTap to let a "Tap Notification" trigger our action on a selected target object. By that logic, after selecting the OnCollision trigger I would expect to write something like CollisionEvent.entityA.applyCollisionForBehaviors, which doesn't exist. And of course the collision on my object doesn't trigger the action, because I only set things up in RCP, not in code. Setting aside that another post has pointed out we could use the Behaviors Component's OnNotification trigger for now, I found that I could keep the OnTap trigger and call Entity.applyTapForBehaviors inside my subscribed collision Began event, which actually works better than OnCollision. So what are the design principles here? And how can I fire a collision notification so that my Behaviors Component's OnCollision trigger actually works?
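For reference, a minimal sketch of the workaround described above, assuming the entity's Behaviors Component in RCP still uses an OnTap trigger; the scene and entity names are placeholders, not from the original post:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct CollisionTriggeredTimelineView: View {
    @State private var collisionSubscription: EventSubscription?

    var body: some View {
        RealityView { content in
            // Hypothetical scene and entity names from Reality Composer Pro.
            guard let scene = try? await Entity(named: "Scene", in: realityKitContentBundle),
                  let target = scene.findEntity(named: "TriggerTarget") else { return }
            content.add(scene)

            // When a collision begins on the target, reuse the OnTap trigger
            // to start the Timeline authored in the Behaviors Component.
            collisionSubscription = content.subscribe(to: CollisionEvents.Began.self, on: target) { event in
                _ = event.entityA.applyTapForBehaviors()
            }
        }
    }
}
```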
0 replies · 0 boosts · 335 views · Aug ’24
Hand Anchor Error
The entity in my RealityView contains tracking components that let it track different places on the hand. However, I found that apart from the index fingertip, the thumb tip, the palm, and the wrist, no other locations can be tracked properly (such as the tip of the middle finger). How can I solve this? (I think it may be a bug in the beta.)
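For anyone hitting the same limitation, a minimal sketch of anchoring to an arbitrary hand joint in code rather than through RCP; it assumes the joint(for:) hand location introduced alongside visionOS 2, and the sphere is just a placeholder marker:

```swift
import SwiftUI
import RealityKit

struct MiddleFingerMarkerView: View {
    var body: some View {
        RealityView { content in
            // Placeholder marker entity to visualize the tracked joint.
            let marker = ModelEntity(mesh: .generateSphere(radius: 0.005),
                                     materials: [SimpleMaterial(color: .cyan, isMetallic: false)])

            // Built-in locations such as .indexFingerTip, .thumbTip, .palm, and
            // .wrist work today; .joint(for:) targets any HandSkeleton joint.
            marker.components.set(AnchoringComponent(
                .hand(.right, location: .joint(for: .middleFingerTip))
            ))
            content.add(marker)
        }
    }
}
```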
1 reply · 0 boosts · 359 views · Aug ’24
Can't Tap
I am attempting to execute actions after tapping an entity in a RealityView using the Behaviors component. I have added the Input Target component and the tap gesture as follows: TapGesture().targetedToAnyEntity().onEnded { value in _ = value.entity.applyTapForBehaviors() } However, during testing, I have observed that the entity does not appear to recognize the tap gesture. Could you kindly provide any relevant documentation or guidance on this matter?
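A common cause is that the tapped entity has an Input Target component but no collision shapes, so hit-testing never reaches it. A minimal sketch under that assumption; the scene and entity names are hypothetical:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct TapBehaviorsView: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle),
               let tappable = scene.findEntity(named: "TappableModel") {
                // Gestures only hit entities that have both an input target
                // and collision shapes.
                tappable.components.set(InputTargetComponent())
                tappable.generateCollisionShapes(recursive: true)
                content.add(scene)
            }
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    _ = value.entity.applyTapForBehaviors()
                }
        )
    }
}
```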
1 reply · 1 boost · 382 views · Aug ’24
Reality Composer Pro 2.0 shader graphs can't be loaded on visionOS 1
Using Reality Composer Pro 2.0, I created a simple shader graph that displays a texture on an unlit surface: On visionOS 2 beta, I can successfully use ShaderGraphMaterial(named:from:in:) to load that shader graph material and assign it to a model entity. However, on visionOS 1.2 and earlier, either in Simulator or on the device, ShaderGraphMaterial(named:from:in:) fails and I see the following logged to the console: If, using Reality Composer Pro 1.0, I experimentally open the same project and delete and recreate exactly the same nodes above, then ShaderGraphMaterial(named:from:in:) works as expected on visionOS 1.2. Is it a known issue that Reality Composer 2 can't be used with visionOS 1? Is this intentional behavior? I've submitted feedback as FB14828873, including a sample project and repro steps. If possible, I would appreciate guidance from an Apple engineer, like "This is a known issue for [list of node types]" or "Reality Composer Pro 2 is not supported for visionOS 1 development, please refer to [documentation]" or "We recommend [workaround]." Thank you.
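For context, the loading pattern in question looks roughly like this; the material path and scene file name are placeholders, not the poster's actual project:

```swift
import RealityKit
import RealityKitContent

// Loads a Shader Graph material authored in Reality Composer Pro and applies it.
// "/Root/UnlitTextureMaterial" and "MaterialsScene.usda" are placeholder names.
func applyUnlitTextureMaterial(to model: ModelEntity) async {
    do {
        let material = try await ShaderGraphMaterial(named: "/Root/UnlitTextureMaterial",
                                                     from: "MaterialsScene.usda",
                                                     in: realityKitContentBundle)
        model.model?.materials = [material]
    } catch {
        // On visionOS 1.x this is where the load reportedly fails.
        print("Failed to load shader graph material: \(error)")
    }
}
```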
6 replies · 0 boosts · 812 views · Oct ’24
VisionOS system Boundary
Hello, I want to ask for help from the visionOS devs inside Apple: is it possible to extend or disable (toggle) the Play Space boundary, which is currently 1.5 meters or 10 feet? It really is a shame that with such a great display and so much computing power we can't run any room-scale VR. I'm currently working on an undergrad thesis for which I chose the AVP, but I didn't know about this boundary until I had built my room in Unity and put it onto my device. Is it possible to cut us some slack regarding the boundary? Many thanks.
3 replies · 0 boosts · 379 views · Aug ’24
Getting build errors for my Package.swift Swift Package Manifest
Using Xcode 15.4, macOS Sonoma 14.6.1, and Reality Composer Pro 1.0. Lately (I believe since updating to Xcode 15.4, but I'm not sure), I'm getting build errors that I didn't get before. The error says that "the package manifest at /Package.swift cannot be accessed (/Package.swift doesn't exist in file system)". When I click for more information and bring up the Resolve Packages log, it shows a red X error, "Resolving Package Graph Failed". I'm not sure why I'm suddenly getting this error. My Reality Composer Pro package is part of my project and listed in my project navigator as it has been. The Package.swift file is in the same location as it has always been, and I believe it is correct. I tried removing the package from my project navigator and then dragging and dropping it back in, but I'm still getting the error. Note: this error only appears on a working branch for a new version I'm working on. The error does not appear on my production (main) branch, even though it is the same package and location in Finder and appears the same way in the main branch's project navigator. I get the same errors on Xcode 16 beta 5.
4 replies · 0 boosts · 591 views · Aug ’24
How can I create a laser shader?
I am trying to make a shader that resembles a laser, like this: I've been experimenting with a basic Fresnel shader to start, but the Fresnel shader has a problem at high viewing angles, where the top has a very different color from the rest of the capsule. This might work for a laser shader once inverted and fine-tuned: However, when viewed from the top, it doesn't look so good anymore: Ideally, the purple edge is always ONLY on the edge, and the rest of the surface is the light pink color, no matter the viewing angle. How can I accomplish this to create something that looks like a laser?
0 replies · 0 boosts · 400 views · Aug ’24
Collision Detection after Object Tracking
So I am tracking 2 objects in my scene and spawning a tiny arrow on each of them (this part is working as intended). Inside my scene I have added Collision components and Physics Body components to each of the arrows. I want to detect when a collision occurs between the 2 arrow entities. I have made the collision boxes big enough that they should definitely overlap, yet I am not able to detect when the collision occurs. This is the code that I use for the scene:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct DualObjectTrackingTest: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            if let immersiveContentEntity = try? await Entity(named: "SceneFind.usda", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                print("Collision check started")
            }
        } update: { content in
            if let arrow = content.entities.first?.findEntity(named: "WhiteArrow") as? ModelEntity {
                let subscription = content.subscribe(to: CollisionEvents.Began.self, on: arrow) { collisionEvent in
                    print("Collision has occurred")
                }
            }
        }
    }
}
```

All I see in my console logs is "Collision check started", and whenever I move the 2 objects close enough that the collision boxes overlap, I don't see any updates in the logs. Can anyone give me some further guidance/resources on this? Thanks again!
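One likely culprit in the code above is that the subscription created in the update: closure is stored in a local constant and recreated on every update, so it never persists. A minimal sketch of subscribing once in the make closure and keeping the EventSubscription alive; the scene-wide (unscoped) subscription is an assumption for illustration:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct DualObjectTrackingTest: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            if let immersiveContentEntity = try? await Entity(named: "SceneFind.usda", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)

                // Subscribe once for the whole scene and keep the token in @State
                // so the subscription stays alive.
                subscription = content.subscribe(to: CollisionEvents.Began.self) { event in
                    print("Collision between \(event.entityA.name) and \(event.entityB.name)")
                }
            }
        }
    }
}
```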
5 replies · 0 boosts · 531 views · Aug ’24
Button in the attachment not clickable after adding BillboardComponent
I created some attachments by following Apple's Diorama example, and things have been working fine. I wanted to add a BillboardComponent to my attachments, so I added it this way:

```swift
guard let attachmentEntity = attachments.entity(for: component.attachmentTag) else { return }
guard attachmentEntity.parent == nil else { return }
var billBoard = BillboardComponent()
billBoard.rotationAxis = [0, 1, 0]
attachmentEntity.components.set(billBoard)
content.add(attachmentEntity)
attachmentEntity.setPosition([0.0, 0.5, 0.0], relativeTo: entity)
```

My attachment view is like this:

```swift
Text(name)
    .matchedGeometryEffect(id: "Name", in: animation)
    .font(titleFont)
Text(description)
    .font(descriptionFont)
Button("Done") {
    viewModel.arrows.remove(at: 0)
}
}
```

If I remove the BillboardComponent, the button click works fine, but with the BillboardComponent the button click doesn't work (it doesn't even highlight when I look at it) in certain directions. How can I resolve this issue?
1 reply · 0 boosts · 301 views · Aug ’24
Entity Coordinates in Object Tracking
A second post on the same topic, as I feel I may have overcomplicated the earlier one. I am essentially performing object tracking inside Reality Composer Pro and adding a digital entity to the tracked object. I now want to get the coordinates of this digital entity inside Xcode. Secondly, can I track more than one object inside the same scene? For example, if I want to find a spanner and a screwdriver among a bunch of tools laid out on a table, spawn an arrow on top of the spanner and the screwdriver, and then get the coordinates of the arrows that I spawn, how can I go about this?
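For the coordinates part, a minimal sketch of reading the world-space position of an entity that RCP attached to the tracked object; the entity name "SpannerArrow" is a placeholder:

```swift
import RealityKit

// Reads the world-space position of an arrow entity placed in the RCP scene.
// "SpannerArrow" is a hypothetical entity name.
func spannerArrowWorldPosition(in scene: Entity) -> SIMD3<Float>? {
    guard let arrow = scene.findEntity(named: "SpannerArrow") else { return nil }
    // relativeTo: nil returns the position in world (scene) coordinates.
    return arrow.position(relativeTo: nil)
}
```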
3 replies · 0 boosts · 408 views · Aug ’24
Tracked object coordinates in program
Hey, as a follow-up to my earlier posts about object tracking on visionOS 2: I'm doing some experimentation, and my use case requires me to track the coordinates of a digital entity that I attach to my reference object (positioned relative to it). Can something like this be done? Right now, all I'm doing is putting my reference object in my scene and then positioning the 3D content that I want to show at the corresponding locations on the reference object. I then load the scene in a RealityView block via my SwiftUI code. I want to know whether I can also extract and use the coordinates of the digital entity that I have placed (post object-tracking) and then make some manipulations via code, for example: if the physical coordinates of the digital entity are in a certain x, y, z range, trigger a function or bring up an alert message in a tile. Is something like this possible, and if so, can you help me understand the different aspects of this problem with some sample/reference code? So far I've done most of the object-tracking tasks via Reality Composer Pro, but this task will require quite a bit of programming as well, and I'm kind of lost as to how to start. Thanks for any help that y'all can give me!
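One way to approach the "trigger when the entity enters a region" idea is to poll the entity's world position each frame with a scene-update subscription; a minimal sketch, where the entity name, the coordinate range, and the alert are all assumptions for illustration:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct TrackedRegionView: View {
    @State private var updateSubscription: EventSubscription?
    @State private var showAlert = false

    var body: some View {
        RealityView { content in
            guard let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
            content.add(scene)

            // Check the tracked entity's world position once per frame.
            updateSubscription = content.subscribe(to: SceneEvents.Update.self) { _ in
                guard let marker = scene.findEntity(named: "TrackedMarker") else { return }
                let p = marker.position(relativeTo: nil)
                // Hypothetical region of interest, in meters.
                if (0.2...0.4).contains(p.x) && (0.9...1.1).contains(p.y) {
                    showAlert = true
                }
            }
        }
        .alert("Marker entered the target region", isPresented: $showAlert) {
            Button("OK") { }
        }
    }
}
```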
1 reply · 0 boosts · 295 views · Aug ’24
RealityKit ShaderGraphMaterial parameters in Reality Composer Pro
I have a custom material using Shader Graph in Reality Composer Pro, and I am trying to rig up sliders to control values in the shader. I am able to read the values from the Shader Graph without a problem, and I can even update them when setting them from the LLDB command line and then read the changed values back. But the changes are not reflected in the graphics. Is there some sort of update() method or something that is required for changed parameter values to take effect? On a related note, I am trying to understand what the MaterialParameters.Handle property is and why one would access a MaterialParameter via the handle versus just the name.
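Since RealityKit materials are value types, a parameter change only shows up once the modified material is written back to the entity's ModelComponent. A minimal sketch of that round trip; the parameter name "sliderValue" is a placeholder:

```swift
import RealityKit

// Writes a new value to a Shader Graph parameter and re-applies the material.
// ShaderGraphMaterial is a value type, so edits to a copy are not visible
// until the copy is assigned back to the model.
func setSliderValue(_ value: Float, on entity: ModelEntity) throws {
    guard var model = entity.components[ModelComponent.self],
          var material = model.materials.first as? ShaderGraphMaterial else { return }

    try material.setParameter(name: "sliderValue", value: .float(value))

    model.materials = [material]
    entity.components.set(model)
}
```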
1 reply · 0 boosts · 472 views · Aug ’24
Is it true that without code, Reality Composer Pro cannot make a vertical door that swings with gravity?
I tested all variations. The checkboxes in the Physics Body component in Reality Composer Pro 2 (beta 4) are absolute and not parent-relative. Also, regardless of what I set the center of mass to, the object always rotates around its center of mass, even though the local rotation is correctly at the origin (imported from Blender). Thus I can get the door to turn, but never to swing, because it always rotates around its center of mass. Tell me if this is expected behaviour or if there is a simple way to make this work.
8 replies · 0 boosts · 665 views · Aug ’24
The Timeline cannot be looped after enlarging the window.
I have developed code that initiates the Timeline in the Reality Composer Pro scene every 12.93 seconds:

```swift
RealityView { … }
    .onAppear { startTimer() }
    .onDisappear { stopTimer() }

func startTimer() {
    timer = Timer.scheduledTimer(withTimeInterval: 12.93, repeats: true) { _ in
        action()
    }
}

func stopTimer() {
    timer?.invalidate()
}

func action() {
    print("SunUpDown")
    NotificationCenter.default.post(
        name: NSNotification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": scene as Any,
            "RealityKit.NotificationTrigger.Identifier": "SunUpDown"
        ]
    )
}
```

Upon receiving the "SunUpDown" notification, the Timeline is executed. Everything functioned normally while I was running the scene, and it kept looping, until I attempted to zoom in on the window and discovered that it stopped looping. Could you please provide an explanation for this behavior? Note: the window type is volumetric, and the parameter of the defaultWorldScaling modifier is dynamic.
1 reply · 0 boosts · 398 views · Aug ’24
HoverState in RealityView
In a scenario involving one of the entities in a Reality Composer Pro environment, I intend for this entity to display a blue material when the user looks at it. To achieve this, I have added the following Shader Graphs to the materials associated with this entity: Additionally, I have added the HoverEffectComponent to the entity in the RealityView code:

```swift
RealityView { content in
    if let model = try? await Entity(named: "WorldScene", in: realityKitContentBundle) {
        let hoverEffect = HoverEffectComponent(.shader(.default))
        model.components.set(hoverEffect)
        content.add(model)
    }
}
```

However, when I hover on (look at) this entity, I cannot observe any visual reaction. Could you please provide guidance on how to resolve this issue?
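Hover effects only apply to entities that can be hit-tested, so one likely fix is to give the target entity an InputTargetComponent and collision shapes, and to set the effect on that specific entity rather than on the loaded scene root. A minimal sketch under those assumptions; the child entity name is hypothetical:

```swift
RealityView { content in
    if let model = try? await Entity(named: "WorldScene", in: realityKitContentBundle) {
        content.add(model)

        // Apply the hover effect to the specific entity, not the scene root,
        // and make sure that entity is hit-testable.
        if let target = model.findEntity(named: "HighlightTarget") {
            target.components.set(InputTargetComponent())
            target.generateCollisionShapes(recursive: true)
            target.components.set(HoverEffectComponent(.shader(.default)))
        }
    }
}
```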
1 reply · 0 boosts · 454 views · Aug ’24
visionOS Move Bug
I was making a gesture to let the goose (character) walk, but I ran into two problems. 1: I added collision and physics body components to the goose and to the entities it collides with, but those physics shapes cannot completely block the goose's way. For example, with a tree in front of it, once the goose is blocked it will still pass through the tree or end up on top of it if it moves a little faster. 2: Because my knowledge is still incomplete, I cannot properly control the goose's movement on the z-axis. I want the user's drag forward and back to move it along the z-axis, but I can only drive it by dragging up and down (y-axis), as in gooseOriginalPosition.z + Float(translation.height / 10000). I hope you can give me some guidance. This is the complete code:

```swift
@State var goose: Entity?
@State var isDraggingGoose = false
@State var gooseOriginalPosition = SIMD3<Float>(repeating: 0)

RealityView { content in
    if let model = try? await Entity(named: "WorldScene", in: realityKitContentBundle) {
        content.add(model)
    }
    if let gooseEntity = try? await Entity(named: "Goose", in: realityKitContentBundle) {
        gooseEntity.scale = SIMD3<Float>(repeating: 0.3)
        content.add(gooseEntity)
        goose = gooseEntity
    }
}
.simultaneousGesture(DragGesture()
    .targetedToAnyEntity()
    .onChanged { value in
        handleDrag(value)
    }
    .onEnded { _ in
        isDraggingGoose = false
        gooseTimer?.invalidate()
    })

func handleDrag(_ value: EntityTargetValue<DragGesture.Value>) {
    guard let goose = goose else { return }
    if !isDraggingGoose {
        isDraggingGoose = true
        gooseOriginalPosition = goose.position(relativeTo: nil)
    }
    let translation = value.gestureValue.translation
    let newPosition = SIMD3<Float>(
        gooseOriginalPosition.x + Float(translation.width / 10000),
        gooseOriginalPosition.y,
        gooseOriginalPosition.z + Float(translation.height / 10000) // I want this to be driven by a z-axis drag.
    )
    goose.setPosition(newPosition, relativeTo: nil)
}
```
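For the second problem, one approach is to use the gesture's 3D locations and convert them into scene space, so that forward/back hand motion maps onto the entity's z-axis. A minimal sketch of a reworked drag handler under that assumption, keeping the names from the post:

```swift
// Sketch of a drag handler that uses the gesture's 3D locations, so moving the
// hand forward/back moves the goose in depth instead of only up/down.
func handleDrag(_ value: EntityTargetValue<DragGesture.Value>) {
    guard let goose = goose else { return }
    if !isDraggingGoose {
        isDraggingGoose = true
        gooseOriginalPosition = goose.position(relativeTo: nil)
    }
    // Convert the drag's start and current 3D locations into world space and
    // apply their difference to the goose's original position.
    let start = value.convert(value.gestureValue.startLocation3D, from: .local, to: .scene)
    let current = value.convert(value.gestureValue.location3D, from: .local, to: .scene)
    goose.setPosition(gooseOriginalPosition + (current - start), relativeTo: nil)
}
```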
0 replies · 0 boosts · 301 views · Aug ’24