Reality Composer


Prototype and produce content for AR experiences using Reality Composer.

Posts under Reality Composer tag

45 Posts

Xcode 16 cannot load AR scenes from .rcproject files
Ever since updating to Xcode 16, my AR app doesn't compile, because Xcode doesn't recognize the .rcproject files used to load the AR experiences in the iOS app. The .rcproject files were authored in Reality Composer on iPadOS. The expected behavior is described in this official Apple documentation article: https://developer.apple.com/documentation/realitykit/loading-entities-from-a-file

How do I submit a ticket to Apple?
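For reference, a minimal sketch of the file-based loading pattern the linked article describes, assuming the scene has been exported as a .reality file; the "Experience" resource name is illustrative, not from the post:

    import Foundation
    import RealityKit

    // Load an entity from a .reality (or .usdz) file bundled with the app.
    func loadExperience() throws -> Entity {
        guard let url = Bundle.main.url(forResource: "Experience", withExtension: "reality") else {
            throw CocoaError(.fileNoSuchFile)
        }
        return try Entity.load(contentsOf: url)
    }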
Replies: 0 · Boosts: 0 · Views: 22 · Activity: 2h
Xcode 16.2 Beta fails to build visionOS app RealityContent assets
I've just updated to macOS 15.2 Beta and Xcode 16.2 Beta and noticed I can't build my visionOS app as before. I have a separate /Volumes/Development APFS volume for Xcode projects, plus a Relative setting for DerivedData so that it is located in each project's directory – in this case, a subfolder on the volume mentioned. When attempting to build the app in either Xcode 16.1 or 16.2 Beta with the visionOS 2.1 SDK, I get the following error:

    Finished processSwiftFiles() with a failure 'You don’t have permission to save the file “CustomComponentUSDInitializers.usda” in the folder “RealityAssetsGenerated”.'
    Sandbox: realitytool(6216) deny(1) file-write-create /Volumes/Development/travel-visionos/DerivedData/Tripomatic/Build/Intermediates.noindex/RealityKitContent.build/Debug-xrsimulator/RealityKitContent_RealityKitContent.build/DerivedSources/RealityAssetsGenerated/CustomComponentUSDInitializers.usda.sb-02874eb3-PS4qfZ
    Failure creating schema - 'You don’t have permission to save the file “CustomComponentUSDInitializers.usda” in the folder “RealityAssetsGenerated”.'

This definitely worked previously with 15.0/16.0 and 15.1/16.1, as I've even published the app to the App Store. It seems to be some shenanigans related to sandboxing and permissions for Xcode tools that I can hardly work around. I have Xcode added to both the Developer Tools and Full Disk Access sections in Privacy & Security, yet that doesn't change anything in relation to this issue.

When I move the project to a subfolder in my home directory, the build succeeds. The same applies to switching the DerivedData setting back to the default, so that DerivedData is generated in ~/Library/Developer for all projects opened in Xcode. Thus, either macOS or Xcode now has an issue with running realitytool to generate files on my dedicated volume.

Should I consider this an Xcode or macOS Beta issue? Report it via Feedback Assistant, maybe?
Replies: 0 · Boosts: 0 · Views: 151 · Activity: 2w
Reality Composer Project and Xcode 16
After upgrading to Xcode 16, my app, which uses project files imported from my iPad's Reality Composer app, now has two issues that I have found so far. I am using an ARView as a UIViewRepresentable with SwiftUI. (Prior to upgrading to Xcode 16, everything worked well.)

First, there are now several duplicate rcp_export.usdz resources in the "Copy Bundle Resources" build phase section. Even though each file is in a separate folder with a unique UUID, this was causing a compile error about duplicate files. I was able to open the RC project folder and delete the older rcp_project versions, which now allows the app to compile. I mention it because it may or may not be related to the second issue.

Second, Xcode isn't generating the project code for the rcproject, so when I call the RCProject.loadSceneAsync function I get an error that says "Cannot find 'RCProject' in scope".
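For context, a minimal sketch of the ARView-in-SwiftUI setup described above; the names are illustrative, and the commented call is the generated code that Xcode 16 no longer produces:

    import SwiftUI
    import RealityKit

    struct ARViewContainer: UIViewRepresentable {
        func makeUIView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)
            // Pre-Xcode 16, code generated from the .rcproject, e.g.
            // RCProject.loadSceneAsync { ... }, would be called here
            // to add the Reality Composer scene to arView.
            return arView
        }

        func updateUIView(_ uiView: ARView, context: Context) {}
    }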
Replies: 3 · Boosts: 5 · Views: 345 · Activity: Oct ’24
How to perform real-time specular reflection in visionOS
In RealityKit, I know that an HDR image is pre-calculated, and that through the settings of the ImageBasedLight component, a specified specular object can reflect the content of that HDR image. But if the mirror-like object is very large (for example, a large continuous glass door), then after assigning an IBL image to it, the reflection is obviously deformed as the object moves through space. That is because an IBL image captures the surrounding environment from a single point, while the glass door is an extended surface. Is there a truly real-time specular reflection setup in RealityKit that can reflect the model on the opposite side of the glass door?
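For reference, a minimal sketch of the single-point IBL setup the post describes, which is precisely what distorts on large surfaces; the "Lobby" resource name is illustrative:

    import RealityKit

    // Attach an image-based light, captured at a single point, to an entity.
    func applyIBL(to glassDoor: Entity) async throws {
        let environment = try await EnvironmentResource(named: "Lobby")
        glassDoor.components.set(
            ImageBasedLightComponent(source: .single(environment), intensityExponent: 1.0)
        )
        // The entity reflects the image-based light attached to itself.
        glassDoor.components.set(
            ImageBasedLightReceiverComponent(imageBasedLight: glassDoor)
        )
    }

Because the environment map is sampled as if it were infinitely far away, a flat reflector the size of a door cannot show parallax-correct reflections; that limitation is inherent to this setup rather than a misconfiguration.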
Replies: 0 · Boosts: 0 · Views: 322 · Activity: Sep ’24
Build Errors for Reality Composer Pro Packages in Xcode 16 Beta 6 for iOS 18
Using Xcode 15.4, I successfully built and ran my app using a Reality Composer Pro Version 1.0 package, and then successfully submitted that app version for release. Now, using Xcode 16 Beta 6, I've created a new branch of the repository for updating my app for iOS/iPadOS 18 and visionOS 2. However, once I created and switched to the new branch and did a build, I get build errors. They seem to concern the package manifest of the Reality Composer Pro package that is part of my app. When I go to the package file in my project navigator and click the Open in Reality Composer Pro button, my package opens in Reality Composer Pro 2.0, which makes sense since that is the version for Xcode 16. However, I don't know how to address or get rid of the build errors. I've attached an image of my build errors.
Replies: 5 · Boosts: 0 · Views: 507 · Activity: Sep ’24
Composing Interactive 3D Content example Build Failure
Hello, I downloaded the most recent Xcode 16.0 beta 6 along with the example project located here. Currently I am experiencing the following build failures:

    RealityAssetsCompile ...
    error: [xrsimulator] Component Compatibility: BlendShapeWeights not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error: [xrsimulator] Component Compatibility: EnvironmentLightingConfiguration not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error: [xrsimulator] Component Compatibility: AudioLibrary not available for 'xros 1.0', please update 'platforms' array in Package.swift
    error: [xrsimulator] Exception thrown during compile: compileFailedBecause(reason: "compatibility faults")
    error: Tool exited with code 1

I saw that a similar issue has been reported. As a test I downloaded that project, and it compiled as expected.
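For what it's worth, a minimal sketch of the change those errors ask for, assuming the sample's content package still declares a visionOS 1.0 floor; the package and target names follow the usual RealityKitContent convention and may differ in the sample:

    // swift-tools-version:6.0
    import PackageDescription

    let package = Package(
        name: "RealityKitContent",
        platforms: [
            // Was .visionOS(.v1); BlendShapeWeights, EnvironmentLightingConfiguration,
            // and AudioLibrary require a visionOS 2 platform floor.
            .visionOS(.v2)
        ],
        products: [
            .library(name: "RealityKitContent", targets: ["RealityKitContent"])
        ],
        targets: [
            .target(name: "RealityKitContent")
        ]
    )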
Replies: 2 · Boosts: 0 · Views: 403 · Activity: Aug ’24
Is there a way to make an .objcap file from a .USDZ file
I designed a 3D object and exported it as a USDZ. I also 3D printed said object. I want to use the object as a 3D trigger for an AR experience I am building.

My question: is there a process that would let me take the 3D .usdz file and convert it to an .arobject or an .objcap medium/low-density point cloud to use as an AR trigger? Because I do have the 3D print of the object, I used the "scan" option when setting up my scene, but the "resolution"/fidelity seems really low, and the results I get are just mediocre. I would love to take the 3D USDZ that I already have and use it to generate a file that can be used as a 3D trigger. Is this possible, or is there a process to do this?

I am able to take the 3D scan from Reality Composer (which is exported as an .objcap file), send it to Reality Converter on my Mac, and make a USDZ from it. I am looking for a way to go the other way: .usdz > .objcap or .arobject.

I am trying to make an experience that mimics projection mapping, but in AR. I have a 3D object I built and textured in Substance Painter. I also printed this object in a base gray color. I want to use the 3D print of the object as an AR trigger that would start a scene placing/overlaying/projection-mapping the textured 3D model over the gray 3D-printed model. Ideally the mapped 3D model would be spatially attached to the 3D print and move with it when the object is handled.
Replies: 1 · Boosts: 0 · Views: 467 · Activity: Aug ’24
RealityKit scene with the Entity Component System
I'm following the WWDC session on interactive 3D content in Reality Composer Pro and Apple's documentation: https://developer.apple.com/wwdc24/10102 https://developer.apple.com/documentation/realitykit/implementing-systems-for-entities-in-a-scene#Retrieve-entities-with-an-entity-query

However, this simple code declaring a dummy Component and System has a compile error:

/Users/Workspaces/repository/Packages/RealityKitContent/Sources/RealityKitContent/RobotComponent.swift:18:24 Static property 'query' is not concurrency-safe because non-'Sendable' type 'EntityQuery' may have shared mutable state

    class MySystem: System {
        // Define a query to return all entities with a MyComponent.
        private static let query = EntityQuery(where: .has(MyComponent.self))

        // Initializer is required. Use an empty implementation if there's no setup needed.
        required init(scene: Scene) { }

        // Iterate through all entities containing a MyComponent.
        func update(context: SceneUpdateContext) {
            for entity in context.entities(
                matching: Self.query,
                updatingSystemWhen: .rendering
            ) {
                // Make per-update changes to each entity here.
            }
        }
    }

I'm using Xcode beta 3 and the project targets visionOS 2.
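One possible workaround, sketched under the assumption that strict concurrency is objecting to the shared static EntityQuery: construct the query locally inside update(context:), so there is no shared mutable state for the compiler to flag. (Marking the stored property nonisolated(unsafe) is another commonly suggested escape hatch.)

    import RealityKit

    struct MyComponent: Component {}

    struct MySystem: System {
        init(scene: Scene) {}

        func update(context: SceneUpdateContext) {
            // Building the query per update avoids the static non-Sendable
            // EntityQuery that the compiler rejects.
            let query = EntityQuery(where: .has(MyComponent.self))
            for entity in context.entities(matching: query, updatingSystemWhen: .rendering) {
                // Make per-update changes to each entity here.
                _ = entity
            }
        }
    }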
Replies: 1 · Boosts: 0 · Views: 458 · Activity: Jul ’24
Can’t Figure Out How to Get My Earth Entity to Rotate on Its Axis
I can't figure out how to get my Earth entity to rotate on its axis. This is a follow-up to a previous Apple Developer forum post. How would I have the Earth (parent) entity rotate CCW underneath the orbiting starship child? I tried adding the following code block to the RealityView, but it is not working:

    if let rotatingEarth = starshipEntity.findEntity(named: "Earth") {
        rotatingEarth.transform.rotation = simd_quatf.init(angle: 360, axis: SIMD3(x: 0, y: 1, z: 0))
        if let animation = try? AnimationResource.generate(with: rotatingEarth as! AnimationDefinition) {
            rotatingEarth.playAnimation(animation)
        }
    }

Any advice on getting the Earth to rotate? I tried reviewing the Hello World WWDC23 project code, but I was unable to understand its complexity and how that sample got the Earth to rotate. I want to do this for visionOS 1.2. I realize there are some new animation and other capabilities coming in visionOS 2.0, but I want to try to address this issue in the currently released visionOS version.
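A sketch of one approach that avoids baked animations entirely: rotate the entity a small step on every frame from a scene-update subscription. Two notes on the snippet above: simd_quatf(angle:axis:) takes radians, so angle: 360 is roughly 57 full turns rather than one, and the as! AnimationDefinition cast will crash at runtime because an Entity is not an AnimationDefinition. The sketch below assumes a RealityView content object is in scope:

    import RealityKit

    var spinSubscription: EventSubscription?

    func startSpin(of earth: Entity, in content: RealityViewContent) {
        spinSubscription = content.subscribe(to: SceneEvents.Update.self) { event in
            // Rotate CCW around +Y at about 0.3 rad/s, scaled by the frame delta.
            let step = simd_quatf(angle: Float(event.deltaTime) * 0.3, axis: [0, 1, 0])
            earth.transform.rotation = step * earth.transform.rotation
        }
    }

Keeping the returned EventSubscription alive is required; if it is deallocated, the updates stop.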
Replies: 5 · Boosts: 0 · Views: 664 · Activity: Jul ’24
Cloud Service for Apple Vision Pro App
For all the AVP devs out there, what cloud service are you using to load content in your app with extremely low latency? I tried using CloudKit and it did not work well at all; latency was super bad :/ Firebase looks like the most promising option at this point? Wish Apple would create an ultra-low-latency cloud service for streaming high-quality content such as USDZ files and scenes made in Reality Composer Pro.
Replies: 1 · Boosts: 0 · Views: 508 · Activity: Jul ’24
3D Object Capture not working on iPhone 12 Pro
The 3D Object Capture feature doesn't seem to work on my iPhone 12 Pro. The circle that is supposed to show up when you begin to move around the object doesn't appear, so object capture doesn't even begin. It says ‘more light..’ or ‘move closer’. This doesn't happen on my iPhone 14 Pro; it works perfectly fine there, even with the same lighting. How can this be fixed?
Replies: 1 · Boosts: 0 · Views: 516 · Activity: Jul ’24
Multi-platform app for visionOS and iOS: How to include 3D models for both?
I created an app for visionOS, using Reality Composer Pro. Now I want to turn this app into a multi-platform app for iOS as well. RCP files are not supported on iOS, however. So I tried to use the "old" Reality Composer instead, but that doesn't seem to work either: Xcode 15 no longer includes it, and I read online that files created with Xcode 14's Reality Composer cannot be included in Xcode 15 projects. Also, Xcode 14 does not run on my M3 Mac with Sonoma. That's a bummer.

What is the recommended way to include 3D content in apps that support visionOS AND iOS?! (I also read that a solution might be to use USDZ for both. But what would that workflow look like? Are there samples out there that support both platforms? Please note that I want to set up the anchors myself, in code. I just need the composing tool to create the 3D content that will be placed on those anchors.)
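One possible shape of that workflow, sketched under the assumption that a plain USDZ in the app bundle is the shared format for both platforms; the "Robot" file name is illustrative:

    import RealityKit

    // Compiles for both iOS and visionOS: load a bundled USDZ and attach it
    // to a code-defined anchor, as described above.
    func makeAnchoredModel() async throws -> AnchorEntity {
        let model = try await Entity(named: "Robot")   // Robot.usdz in the main bundle
        let anchor = AnchorEntity(world: [0, 0, -1])   // one meter in front of the world origin
        anchor.addChild(model)
        return anchor
    }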
Replies: 1 · Boosts: 0 · Views: 700 · Activity: Jun ’24
Object Capture to USDZ Scaling
I'm trying to take an object capture and scale it. What I did so far: I created a Reality Composer project, inserted the .objcap file into the project, and then scaled it from 100% to 200%. I then exported it as a USDZ. It just won't show up in the Xcode preview now, and I'm not sure why. Is there any way to fix this? I'm going crazy trying to find a fix to make this work.
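An alternative sketch that sidesteps the re-export entirely: keep the capture at its authored size and scale it in code at load time. The "Capture" file name is illustrative:

    import RealityKit

    func loadScaledCapture() async throws -> Entity {
        let model = try await Entity(named: "Capture")   // Capture.usdz in the main bundle
        model.scale = SIMD3<Float>(repeating: 2.0)       // 200% of the authored size
        return model
    }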
Replies: 1 · Boosts: 0 · Views: 806 · Activity: Apr ’24