Reality Composer Pro


Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.

Posts under Reality Composer Pro tag

200 Posts

Reality Composer Pro timeline management, but using code
Is it possible to manage timeline behavior entirely from code? I am exploring the Compose interactive 3D content in Reality Composer Pro sample project after watching the related video, but the example only uses Behaviors authored in RCP to trigger timeline actions. I was wondering whether it is possible to somehow retrieve a timeline controller that gives me access to its state, just as AnimationPlaybackController does for single animations. What I would like to achieve is to play, pause, and read the timestamp of a timeline, in order to synchronize different users over SharePlay.
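A minimal sketch of one possible code-only approach, assuming the Reality Composer Pro timeline is exported into the entity's AnimationLibraryComponent (visionOS 2 / RealityKit); the entity and timeline names are placeholders, so verify against your own scene:

```swift
import RealityKit

// Hedged sketch: fetch an RCP timeline by name and play it from code.
// Assumes the timeline surfaces through AnimationLibraryComponent on the authored entity.
func playTimeline(named name: String, on entity: Entity) -> AnimationPlaybackController? {
    guard let library = entity.components[AnimationLibraryComponent.self],
          let timeline = library.animations[name] else { return nil }
    // The returned controller supports pause()/resume() and exposes a seekable `time`,
    // which could be shared over SharePlay to keep participants aligned.
    return entity.playAnimation(timeline, transitionDuration: 0, startsPaused: false)
}
```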
1 reply · 0 boosts · 207 views · 3w
How to use CubeMap in Reality Composer Pro?
I followed the documentation at https://developer.apple.com/documentation/shadergraph/realitykit/cube-image-(realitykit). I have a .ktx file and use a CubeImage node to load it, followed by a Convert node, but the result shows black. I checked it on my Vision Pro and it is still black, and I don't know why. Am I doing something wrong? P.S. I also tried loading the .ktx file with an Image node, and it shows one image, so I believe the .ktx file itself is fine. I also checked that on the Vision Pro.
2 replies · 0 boosts · 268 views · Oct ’24
Synchronize 3D object animation inside a volume with SharePlay
I have been experimenting with some experiences in which I would like to use SharePlay to allow the app to be used by multiple users. Currently I have managed to share a volume containing a Reality Composer Pro scene; the scene contains some entities with an animation. So far I have been able to correctly share the volume and its content, with the animation playing without problems, but once I activate SharePlay, different users see different moments of the animation, or no animation at all. Is there a way to synchronize animations between all the users, no matter when someone entered the SharePlay session, aside from communicating the animation time once someone joins?
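A minimal sketch of one common pattern, not a confirmed solution: have one participant periodically broadcast its playback position over the GroupSessionMessenger so late joiners can seek to it. The message type and the delay compensation below are assumptions for illustration:

```swift
import Foundation
import GroupActivities
import RealityKit

// Hypothetical sync message: the host shares its playback position plus a timestamp.
struct AnimationSyncMessage: Codable {
    let playbackTime: TimeInterval
    let sentAt: Date
}

// Host side: broadcast the current animation time to all participants.
func broadcast(_ controller: AnimationPlaybackController,
               over messenger: GroupSessionMessenger) async throws {
    try await messenger.send(AnimationSyncMessage(playbackTime: controller.time, sentAt: .now))
}

// Joiner side: seek the local controller, roughly compensating for transit delay.
func apply(_ message: AnimationSyncMessage, to controller: AnimationPlaybackController) {
    controller.time = message.playbackTime + Date.now.timeIntervalSince(message.sentAt)
}
```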
0 replies · 0 boosts · 241 views · Oct ’24
Remove RCP entity in onReceive or by function
This is a follow-up question to this post. There are several entities I created in RCP whose animations I want to play in a sequence (the order is dynamic). My plan is to have onReceive detect the notification, remove the current entity from the scene, and add the next entity from the queue. Somehow, when I call removeFromParent()/removeChild(), either the entity is not removed, or everything is removed and the app crashes. Any tips?
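A minimal sketch of the intended flow, using hypothetical names (EntitySequencer is not from the original post); the key point is to call removeFromParent() only on the entity that just finished, not on a shared parent:

```swift
import RealityKit

// Hedged sketch: keeps a queue of entities and swaps them one at a time under a shared parent.
final class EntitySequencer {
    private var queue: [Entity]
    private var current: Entity?
    private let parent: Entity

    init(parent: Entity, entities: [Entity]) {
        self.parent = parent
        self.queue = entities
    }

    // Call this from .onReceive when the "animation finished" notification arrives.
    func advance() {
        current?.removeFromParent()                 // removes only the entity that just finished
        guard !queue.isEmpty else { return }
        let next = queue.removeFirst()
        parent.addChild(next)
        if let animation = next.availableAnimations.first {
            next.playAnimation(animation)
        }
        current = next
    }
}
```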
1 reply · 0 boosts · 175 views · Oct ’24
OnReceive from RCP not functioning
A timeline in RCP posts a notification with the identifier "Completed" when it finishes playing. I am trying to receive it in the following way:

    extension Notification.Name {
        static let notifyOnAnimationCompleted = Notification.Name("Completed")
    }

    // in the view
    private let AnimationCompleted = NotificationCenter.default.publisher(for: .notifyOnAnimationCompleted)

    RealityView { ... }
        .onReceive(AnimationCompleted) { _ in
            print("End")
        }

This was working back in July, but it never prints "End" now.
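One possible cause, offered as an assumption rather than a confirmed diagnosis: the timeline's Notification action may be delivered under the shared RealityKit.NotificationTrigger name, with the authored identifier carried in userInfo, instead of under a notification literally named "Completed". A sketch of receiving it that way:

```swift
import SwiftUI
import RealityKit

// Hedged sketch: listen for the shared trigger name and filter on the identifier
// authored in the Reality Composer Pro timeline ("Completed" in this case).
struct TimelineListenerView: View {
    private let timelineNotifications = NotificationCenter.default
        .publisher(for: Notification.Name("RealityKit.NotificationTrigger"))

    var body: some View {
        RealityView { content in
            // ... load the scene that owns the timeline here ...
        }
        .onReceive(timelineNotifications) { notification in
            let identifier = notification.userInfo?["RealityKit.NotificationTrigger.identifier"] as? String
            if identifier == "Completed" { print("End") }
        }
    }
}
```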
1 reply · 0 boosts · 254 views · Oct ’24
Send messages to the scene
I saw an onNotification trigger in the Behaviors configuration of Reality Composer Pro, which asks me to enter a notification name; in other words, it expects Swift code in Xcode to send a message with that name. Could you show me how to use Swift in Xcode to send a message containing that notification name? (There is an answer in https://developer.apple.com/forums/thread/756978, but it doesn't work.) Also, in the Reality Composer Pro timeline there is a Notification action, which is used to send messages to Swift. How can Swift detect whether the Notification action has sent a message? (There is an answer in https://developer.apple.com/videos/play/wwdc2024/10102/, but it doesn't work.) I have asked this question before (https://developer.apple.com/forums/thread/756978). Those answers worked at the time, but they no longer work on the latest system. I hope you can help me. Thank you.
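A minimal sketch of both directions as I understand them, hedged because the exact notification names and userInfo keys may differ across OS versions; the scene entity and identifier below are placeholders:

```swift
import Foundation
import RealityKit

// Direction 1 (assumed mechanism): Swift -> RCP onNotification behavior.
// Post "RealityKit.NotifyAction" with the target scene and the name entered in the behavior.
func notifyScene(owning entity: Entity, identifier: String) {
    guard let scene = entity.scene else { return }
    NotificationCenter.default.post(
        name: Notification.Name("RealityKit.NotifyAction"),
        object: nil,
        userInfo: [
            "RealityKit.NotifyAction.Scene": scene,
            "RealityKit.NotifyAction.Identifier": identifier
        ]
    )
}

// Direction 2 (assumed mechanism): RCP timeline Notification action -> Swift.
// Observe "RealityKit.NotificationTrigger" and read the identifier from userInfo.
func observeTimelineNotifications() -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: Notification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        queue: .main
    ) { notification in
        let identifier = notification.userInfo?["RealityKit.NotificationTrigger.identifier"] as? String
        print("Timeline posted:", identifier ?? "unknown")
    }
}
```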
1 reply · 0 boosts · 357 views · Oct ’24
Physics
Given my limited knowledge of physics, I would appreciate insights from anyone with a solid understanding of the subject. I have added a physics component to an entity in Reality Composer Pro, but I am seeking guidance on how to achieve the following:
- Make an object float in the air (with a slight downward motion reminiscent of the moon's surface)
- Enable the object to move at a slow pace
- Implement a strong rebound force
I would be grateful if you could suggest appropriate values for these parameters. Thank you for your assistance.
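A hedged starting point rather than definitive values: lower gravity for the floaty, moon-like fall (assuming PhysicsSimulationComponent.gravity is available on your OS version), damping for slow movement, and high restitution for a strong rebound. All numbers are guesses to tune by eye:

```swift
import RealityKit

// Hedged sketch: moon-like gravity via a custom simulation space, slow motion via damping,
// strong rebound via restitution. PhysicsSimulationComponent.gravity is an assumption
// (visionOS 2 / iOS 18); older systems may need forces applied each frame instead.
func configureFloatyPhysics(simulationRoot: Entity, object: ModelEntity) {
    var simulation = PhysicsSimulationComponent()
    simulation.gravity = [0, -1.62, 0]               // roughly lunar gravity, in m/s²
    simulationRoot.components.set(simulation)

    let bouncy = PhysicsMaterialResource.generate(friction: 0.3, restitution: 0.9)
    var body = PhysicsBodyComponent(massProperties: .default, material: bouncy, mode: .dynamic)
    body.linearDamping = 2.0                          // slows linear drift
    body.angularDamping = 2.0                         // slows spinning
    object.components.set(body)
}
```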
1 reply · 0 boosts · 334 views · Sep ’24
Collisions are not detected if the entity is a child of a hand AnchorEntity
I have created an AnchorEntity for my index fingertip and then created a model entity (a sphere) as a child of it. This model entity has a collision component and a physics body component; I tried both dynamic and kinematic modes for the physics body. I have also created a plane from a cube that has a collision component and a static physics body, and I have subscribed to CollisionEvents.Began on this plane, storing the subscription in an EventSubscription state variable:

    @State private var collisionSubscription: EventSubscription?

Then I subscribed as follows:

    collisionSubscription = content.subscribe(to: CollisionEvents.Began.self,
                                              on: self.boxTopCollision) { collisionEvent in
        print("something collided with the box top")
    }

The collision event fires when I place the sphere directly above the plane and let gravity cause the collision, but when the sphere is a child of the anchor entity, the collision events don't fire. I tried adding the collision and physics body components directly to the anchor entity, and that doesn't work either. I also created another sphere with a physics body, a collision component, and an input target component, and manipulated it with a drag gesture. While the manipulation is happening and the sphere touches the plane, the events don't fire; but once the gesture ends and the sphere is in contact with the plane, the event fires. I am confused as to why this is happening. All I want is a collider on my fingertip that detects collisions with this plane. How can I make this work? Is there some unstated rule that a physics body being manipulated manually cannot trigger collision events? For more context: I am using SpatialTrackingSession with the .hand tracking configuration, and I am successfully able to track the fingertip.
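One possible explanation, stated as an assumption rather than a confirmed rule: entities under an AnchorEntity can end up in their own isolated physics simulation, so they never generate collision events against bodies in the main scene. A sketch that opts out of that isolation (AnchoringComponent.physicsSimulation, visionOS 2) and keeps the fingertip body kinematic:

```swift
import RealityKit

// Hedged sketch: a fingertip collider that shares the main scene's physics simulation.
// AnchoringComponent.physicsSimulation is assumed to control the per-anchor isolation.
func makeFingertipCollider() -> AnchorEntity {
    let fingertipAnchor = AnchorEntity(.hand(.right, location: .indexFingerTip))
    fingertipAnchor.anchoring.physicsSimulation = .none      // opt out of the isolated simulation

    let tip = ModelEntity(mesh: .generateSphere(radius: 0.005))
    tip.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.005)]))
    tip.components.set(PhysicsBodyComponent(mode: .kinematic)) // driven by tracking, not by forces
    fingertipAnchor.addChild(tip)
    return fingertipAnchor
}
```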
3 replies · 0 boosts · 315 views · Sep ’24
Vision Pro app crashes when scene loads in ImmersiveSpace
Hello, I am getting the following errors in the console and my app crashes. The screen goes dark, the Apple logo appears, and the app crashes:

    apply fence tx failed (client=0x61dbbfd7) [0xfffffecc (ipc/mig) server died]
    [C:3] Error received: Connection interrupted.
    Failed to commit transaction (client=0x94097449) [0x10000003 (ipc/send) invalid destination port]
    Failed to commit transaction (client=0xe9684b50) [0x10000003 (ipc/send) invalid destination port]
    [C:3-1] Error received: Connection interrupted.
    (the same "Failed to commit transaction ... invalid destination port" message repeats for about ten more client IDs)
    apply fence tx failed (client=0x61dbbfd7) [0x10000003 (ipc/send) invalid destination port]
    nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
    (the line above repeats nine times)
    nw_protocol_socket_reset_linger [C1:2] setsockopt SO_LINGER failed [22: Invalid argument]
    apply fence tx failed (client=0x61dbbfd7) [0x10000003 (ipc/send) invalid destination port]
    Failed to set override status for bind point component member. (three times)
    Message from debugger: Terminated due to signal 9

Could you please tell me what the reason is and how I can resolve it? When I launch the app two or three times, it works fine from that point onwards, but this happens from time to time when debugging.
0 replies · 0 boosts · 272 views · Sep ’24
USDZ models from RealityKitContentBundle losing textures (show all black)
In my Vision Pro app, I'm facing a problem loading USDZ models from a RealityKit content package created with Reality Composer Pro. It was working fine until I added more models to the package. As I added more models with large textures, the app started showing them with texturing problems. When I load the models from the RealityView using Entity(named:in:), the mesh loads correctly but appears all black, with no textures. However, when I load the same USDZ directly from the main bundle using ModelEntity(named:in:), it loads fine. I know that large textures can cause memory issues, but for one single model that shouldn't be enough to cause a memory overflow on the Vision Pro. This USDZ model is about 40 MB, with something around 800 MB of texture memory (from the Reality Composer Pro Statistics tab). I've built experiences on Vision Pro with much heavier models, and they only show the same texture issues once more than three huge models are enabled in the Reality scene, which is not the case here. The un-textured model appears right from the beginning, so it seems to me this is not a runtime issue on the device, but rather some issue in the packaging process from Reality Composer Pro to Xcode to the device. Am I correct? I'm also using a simple Mac mini with an M2 but only 8 GB of RAM; maybe that's the issue? As I still want to use Reality Composer Pro to build more dev-friendly and interesting applications, I'd really appreciate some guidance here! Thanks in advance!
1 reply · 0 boosts · 280 views · Oct ’24
Failed to find and load USDZ from RealityKitContentBundle
I'm developing a visionOS app and I'm trying to load a ModelEntity from a USDZ file that lives inside my custom RealityKit package called R2UVisionOficial, but it keeps giving me a resourceNotFound error.

    import RealityKit
    import R2UVisionOficial
    import ARKit

    /* more code */

    do {
        let newEntity: Entity
        //...
        // Loads entity from USDZ inside package
        newEntity = try await ModelEntity(named: "Salas", in: r2UVisionOficialBundle)
        //...
        return newEntity
    } catch {
        print("wtManager >>> **** FAILED to load entity:", error.localizedDescription)
        throw error
    }

I'm sure I have the Salas.usdz file in the root folder of my package and that I'm using the correct paths. However, I keep getting the error: Failed to find resource with name "Salas" in bundle. Funnily enough, when I load a USDA (scene) from the same package, it works fine, so I guess this has something to do with ModelEntity or USDZ files. Can you please help me? P.S. This issue is similar to https://developer.apple.com/forums/thread/746842?answerId=780415022#780415022
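One workaround worth trying, offered as an assumption rather than a confirmed fix: load the asset with the plain Entity initializer (the path that already works for the .usda scenes in the same package) and pull the model out of the returned hierarchy:

```swift
import RealityKit
import R2UVisionOficial

// Hedged sketch: load as a generic Entity, then locate the ModelEntity inside it.
func loadSalasEntity() async throws -> Entity {
    let root = try await Entity(named: "Salas", in: r2UVisionOficialBundle)
    if let model = root as? ModelEntity ?? root.findEntity(named: "Salas") as? ModelEntity {
        return model
    }
    return root   // fall back to the whole hierarchy if no ModelEntity is found
}
```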
1 reply · 0 boosts · 328 views · Sep ’24
Does Vision Pro support facial recognition apps?
Hello everyone, I'm a Computer Science student. My supervisor has given me some topics for my final year project, and one of them involves using Vision Pro for facial recognition—specifically, identifying a designated face to display specific information. As a developer, my understanding of Vision Pro is quite limited. I've done some research online and found that Unity and Xcode are used as development tools. Traditionally, facial recognition is done using OpenCV. However, I've come across articles stating that Apple, due to security reasons, cannot implement facial recognition. I’d like to ask if that’s true. Also, with VisionOS 2 featuring object tracking and image tracking, could these methods potentially replace facial recognition?
1 reply · 1 boost · 288 views · Sep ’24
Xcode 16 / Reality Composer Pro 2 segmentation fault issue
After installing macOS Sequoia, Xcode 16, and Reality Composer Pro 2, my Apple Vision Pro projects (which worked perfectly with Xcode 15) started giving me a Tool terminated by signal 'Segmentation fault: 11' error while compiling RealityKitContent assets. This happens only when I try to build the project with .usdz models exported from Blender; with the sample models from the Apple website it builds with no errors. Is there any solution?
1 reply · 0 boosts · 348 views · Sep ’24
Getting vector3 or vector4 per-vertex data into Shader Graph Editor
Hi, I'm trying to understand how I can get 3- or 4-channel per-vertex data into the Shader Graph editor. From my tests, it seems that:
- the "Geometric Property" node does not give access to 4-channel data,
- "Geometric Property (vector3)" does not give me access to custom properties besides the ones defined in the MaterialX core,
- the "Texture Coordinates" node has a vector4f mode (yay!), but according to the MaterialX spec, texcoords must have 2 or 3 channels, and I can't get 4-channel data to show up there either.
My assumption so far is that I must be missing some "magic": for example, do the primvars in a file have to be in a specific order, independent of their names? Or do their names matter? (E.g. the convention would be primvars:st, primvars:st1, and so on.) Unfortunately the forum doesn't allow me to attach any USDZ or ZIP files or GDrive links; if there's a way to share a test file I'm happy to do so!
0 replies · 0 boosts · 326 views · Sep ’24
Weird Reality Composer Pro Loop Timeline bug
TL;DR: The timeline does not play its animation when Repeat Forever is checked.

Hi! I have created a timeline for my model that runs the built-in Emphasize animation, and I added a behavior to the model with an OnAddedToScene trigger whose action runs that timeline. It works perfectly on my device, but I want the timeline to loop. I realized there is no loop option on the timeline itself, but I noticed that I can loop it if I nest it inside another timeline (the Loop checkbox then shows up). So I did that and pointed the model's behavior at the outer timeline, but now the model doesn't play the animation at all.

Note: I am not making a visionOS app, but an iOS app leveraging ARKit and RealityKit.
Environment: iPhone 13 Pro Max with iOS 18.0
Code:

    struct ARViewContainer: UIViewRepresentable {
        func makeUIView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)
            arView.session.run(ARWorldTrackingConfiguration())   // run() needs a configuration
            Task {
                do {
                    let anchor = AnchorEntity(plane: .horizontal)
                    let emojiScene = try await Entity(named: "SunglassesScene", in: bubbleAR)
                    anchor.addChild(emojiScene)
                    arView.scene.addAnchor(anchor)
                } catch {
                    print("Failed to load models: \(error)")
                }
            }
            return arView
        }

        func updateUIView(_ uiView: ARView, context: Context) {}
    }

Thank you!
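As a possible workaround (an assumption, not a confirmed fix for the nested-timeline issue): loop the animation from code once the scene is loaded, for example right after addChild in the snippet above. Whether the RCP timeline shows up in availableAnimations depends on how the scene is authored, so treat this as a sketch:

```swift
import RealityKit

// Hedged sketch: loop whatever animation the loaded entity exposes, repeating forever.
func loopFirstAnimation(of entity: Entity) {
    if let timeline = entity.availableAnimations.first {
        entity.playAnimation(timeline.repeat(duration: .infinity), transitionDuration: 0)
    }
}
```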
0 replies · 0 boosts · 328 views · Sep ’24
How to display spatial images or videos in a SwiftUI view
Hi! I am building a visionOS app. I want to embed spatial videos or pictures into my UI, but I have run into problems and have not found a way to implement this. Here is what I have tried:
- Using AVPlayerViewController to play a spatial video; it only displays the video spatially when modalPresentationStyle = .fullScreen. Once embedded in a SwiftUI view, it shows as a normal 2D image.
- The method from https://developer.apple.com/forums/thread/733813, using a shader graph to display spatial images; but the material can only be attached to an entity, and I don't know how to make it show up in a view.
- Using CAMetalLayer and a custom shader to display spatial images, but I couldn't find an equivalent of Unity's unity_StereoEyeIndex for rendering per-eye switching.
Does anyone have a good solution to my problem? Thank you!
0 replies · 0 boosts · 353 views · Sep ’24