I have followed the tutorial below to build a Vision Pro app.
https://developer.apple.com/tutorials/develop-in-swift/create-3d-models-in-the-shared-space
When I run the app in the Vision Pro Simulator, the models appear much darker than the sample in the last picture at the link above.
How can I fix this? Do I need to create lights to illuminate the models? Lighting is not mentioned in the tutorial.
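For reference, here is the lighting sketch I put together after searching around (image-based lighting; "Sunlight" is a hypothetical HDR environment resource name), which may or may not be the intended fix:

import RealityKit

// A rough sketch of image-based lighting, which I understand can brighten
// models on visionOS. "Sunlight" is a hypothetical resource name.
func addImageBasedLight(to model: Entity) async throws {
    let resource = try await EnvironmentResource(named: "Sunlight")
    let light = ImageBasedLightComponent(source: .single(resource), intensityExponent: 1.0)
    model.components.set(light)
    // The receiver tells the model which entity's light to use.
    model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: model))
}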
OK, I am loading an object from a Reality Composer Pro scene that has two entities inside its hierarchy, both with a Physics Body and a Collision component, like this:
Root
  Outer Box Mesh
    Hinge + physics (static/kinematic) + collision
    Door + physics (dynamic) + collision
I tried to keep the physics/collision components only on the hinge and the door while I move the root or the outer box around in code. The behavior I see is that it either
moves the hinge and the door around relative to the top level (despite me checking the movement locking), or
starts rotating the root or outer box, even though I only set its position.
What is the correct setup in this case? What I want is to move the whole object around, settle it somewhere, and still have the door pinned at a fixed relative position with one degree of freedom around the hinge axis.
I know how to do it in code (see the sketch below), but I really want to use the built-in Reality Composer Pro settings/components. I am using the latest beta 4.
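For reference, this is roughly the code-side setup I am trying to reproduce with the built-in components; a minimal sketch (the pin names and the hinge axis are my own assumptions):

import RealityKit

// Sketch: pin the door to the hinge with a revolute joint, leaving one
// rotational degree of freedom around the hinge axis. Assumes the hinge
// body is kinematic and the door body is dynamic, as in my scene.
func attachDoor(_ door: Entity, to hinge: Entity) throws {
    // Orient both pins so their x-axis lies along the hinge axis (world y here).
    let axisOrientation = simd_quatf(from: [1, 0, 0], to: [0, 1, 0])
    let hingePin = hinge.pins.set(named: "hingePin", position: .zero, orientation: axisOrientation)
    let doorPin = door.pins.set(named: "doorPin",
                                position: hinge.position(relativeTo: door),
                                orientation: axisOrientation)
    let joint = PhysicsRevoluteJoint(pin0: hingePin, pin1: doorPin)
    try joint.addToSimulation()
}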
I am using a gesture like this inside a RealityView:
DragGesture()
    .targetedToAnyEntity()
    .onChanged { value in
        print("DragGesture")
        self.dragOffset = value.translation
        self.startTimer()
    }
    .onEnded { _ in
        self.dragOffset = .zero
        self.direction = "None"
        self.stopTimer()
    }
However, given how RealityView handles input, the gesture is not detected the way I expect. I think some modifier or conversion needs to be applied to value.translation, but I don't know which. Can you point me to the right ones? Thank you.
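From what I have read, two things matter here: the entity must carry input-target and collision components before targeted gestures fire at all, and the gesture value should be converted from SwiftUI space into RealityKit space. A sketch of my understanding (unverified):

// The entity must be able to receive input, or targeted gestures never fire.
entity.components.set(InputTargetComponent())
entity.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))

// In .onChanged, convert the 3D location from SwiftUI's local space
// into the entity's parent space before applying it.
DragGesture()
    .targetedToAnyEntity()
    .onChanged { value in
        if let parent = value.entity.parent {
            value.entity.position = value.convert(value.location3D, from: .local, to: parent)
        }
    }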
Hello all,
I'm developing an application for visionOS and I'm trying to implement two different animations:
First animation
Initially, I have a map that should not be visible. I would like to create an animation effect where it appears as if a drop of water falls in the center of the map and the expanding waves gradually reveal the entire map.
Is there a way to do this directly in SwiftUI, or do I need to bake the animation into my USDZ?
Second animation
I want an animation effect similar to a cinema screen opening from the center, gradually revealing a video that was initially hidden.
Is there a way to do this directly in SwiftUI?
Can someone help me with this topic?
Thanks ;)
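For the first effect, here is a minimal SwiftUI-only sketch of what I have in mind: an expanding circular mask that reveals the image from the center ("map" is a hypothetical asset name, and the actual ripple rings would still need extra work):

import SwiftUI

// Sketch: reveal an image with a circle mask that grows from the center,
// approximating an expanding-wave reveal.
struct RippleReveal: View {
    @State private var revealed = false

    var body: some View {
        Image("map") // hypothetical asset name
            .mask(Circle().scale(revealed ? 3 : 0.001))
            .animation(.easeOut(duration: 2), value: revealed)
            .onAppear { revealed = true }
    }
}

The second effect might work the same way, with a Rectangle mask whose horizontal scale grows from zero, like a screen opening from the center.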
How should I set up a WindowGroup window so that it resembles a curved screen?
In a demo shown at WWDC24, users can control a robot and make it walk by pinching and sliding. However, I haven't found any documentation or videos about this capability. If you know how it is done, please let me know. Thank you!
Hello. I am a designer developing a Vision Pro app, and I have run into two problems in my development process.
First, I am trying to import free 3D national-heritage content from Korea into Reality Composer Pro and place it in the app's interior space, but the textures are not imported correctly.
(Screenshots: how it looks in Reality Composer Pro vs. in the Simulator.)
In Reality Composer Pro the textures display correctly, but when I run the app in the Xcode Simulator the model appears white, with no textures. The content I imported is an .obj file; I applied all the textures as JPGs using Reality Converter and exported the result as a .usdz, but the issue persists.
I checked whether the problem occurs only in the Simulator, but it also occurs on the Vision Pro device. How can I resolve this?
Second, the following error appears in Xcode and the simulator does not run. I suspect it is due to the size of the object added to the scene, so I tried compressing it with Reality Converter, but the issue persists. Is there another way to resolve this?
[MTLDebugDevice newBufferWithBytesNoCopy:length:options:deallocator:]:700: failed assertion Buffer Validation
newBufferWith*:length 0x280cc000 must not exceed 256 MB.
When I run my visionOS app, RealityKitContent reports an error:
Tool terminated by signal 'Segmentation fault: 11'
The error points to a USDZ model I imported, but in the scene the model displays normally with no visible damage. Why does this error occur, and how can I check and repair the file?
Hey there,
I'm wondering if anyone knows how to build the shader graph shown in the WWDC24 video below. I tried a couple of things but couldn't get the same result; I can make it in Unity and Blender, but not in Reality Composer Pro.
Thank you.
https://developer.apple.com/wwdc24/10106
On visionOS 2 beta 3, Reality Composer Pro opens a cached copy of a scene (for example, a .usdc file I just changed) on the first try. Closing it and re-opening it shows the correct version.
Am I doing something wrong?
I am having a difficult time creating particle systems in Reality Composer Pro (visionOS beta 3). They start to flicker, and all particles disappear and reappear at semi-random intervals.
I can clearly see this happening with one effect that I put inside a small box consisting of four transparent walls and a solid floor. When I change the view angle, the particle system starts to flicker when viewed from below its emission height.
I tried all combinations of particle rendering (billboard, free, additive, etc.) and it does not change anything. I am using the default particle image.
Any help appreciated
Given that one can add custom components and expose them via Reality Composer Pro, how do I implement my component/system so that when I change a parameter, the change is applied to the entity in the RCP viewport?
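For context, here is a minimal sketch of the kind of component/system pair I mean (the names and the spin behavior are my own example):

import RealityKit

// A custom component. Once it lives in the RealityKitContent package and is
// registered, Reality Composer Pro lists it in the inspector for editing.
public struct SpinComponent: Component, Codable {
    public var speed: Float = 1.0
    public init() {}
}

// The system that consumes the component at runtime.
public struct SpinSystem: System {
    static let query = EntityQuery(where: .has(SpinComponent.self))

    public init(scene: RealityKit.Scene) {}

    public func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let spin = entity.components[SpinComponent.self] else { continue }
            entity.transform.rotation *= simd_quatf(angle: spin.speed * Float(context.deltaTime),
                                                    axis: [0, 1, 0])
        }
    }
}

// Registered at launch with SpinComponent.registerComponent() and
// SpinSystem.registerSystem(); what I can't tell is how to get parameter
// changes to take effect in the RCP viewport itself.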
I have an entity exported from Blender. After loading it in a RealityView, its "Body" and "Mesh" entities have no ModelComponent, but they do have material-binding references. How can I update their materials?
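For reference, this is the usual route I would take when a ModelComponent is present, which doesn't apply here; a sketch:

import RealityKit

// The standard way to swap materials; it fails for me because these
// entities carry no ModelComponent, only material bindings.
func recolor(_ entity: Entity) {
    if var model = entity.components[ModelComponent.self] {
        model.materials = [SimpleMaterial(color: .red, isMetallic: false)]
        entity.components.set(model)
    }
}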
I have read this thread about sending notifications to play animations built in RCP.
If I now want to pause and resume later, or stop and reset the timeline, is there a way to do so?
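When I start an animation myself in code, the returned playback controller gives this kind of control; what I am after is the equivalent for notification-driven timelines. A sketch of the code route (the animation resource is assumed to be loaded already):

import RealityKit

// AnimationPlaybackController supports pause/resume/stop when you start
// the animation yourself rather than via an RCP notification action.
let controller = entity.playAnimation(animation) // animation: AnimationResource
controller.pause()   // freeze at the current time
controller.resume()  // continue from the paused time
controller.stop()    // end playback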
I am using Model3D to display an RCP scene/model in my UI.
How can I get at the entities so I can set material properties to adjust the appearance?
I looked at the interfaces of Model3D and ResolvedModel3D and could not find a way to access the RCP scene or the RealityKit entities.
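The workaround I am considering is dropping down to RealityView, which does expose the entity hierarchy; roughly this (the scene name is hypothetical):

import RealityKit
import RealityKitContent

RealityView { content in
    // Unlike Model3D, loading the scene here hands back real entities.
    if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
        content.add(scene)
        // ...walk the scene's descendants here and adjust their materials...
    }
}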
I can't figure out how to get my Earth entity to rotate on its axis. This is a follow-up to a previous Apple Developer Forums post.
How would I have the Earth (parent) entity rotate counterclockwise underneath the orbiting starship child?
I tried adding the following code block to the RealityView, but it is not working:
if let rotatingEarth = starshipEntity.findEntity(named: "Earth") {
    // Angle is in radians; an Entity can't be cast to an animation definition.
    let spin = FromToByAnimation<Transform>(
        from: Transform(rotation: simd_quatf(angle: 0, axis: [0, 1, 0])),
        to: Transform(rotation: simd_quatf(angle: .pi, axis: [0, 1, 0])),
        duration: 10,
        bindTarget: .transform,
        repeatMode: .repeat) // each cycle is a half turn; chain two for a seamless full spin
    if let animation = try? AnimationResource.generate(with: spin) {
        rotatingEarth.playAnimation(animation)
    }
}
Any advice on getting the Earth to rotate?
I tried reviewing the Hello World WWDC23 project code, but I was unable to untangle how that sample gets the Earth to rotate.
I want to do this on visionOS 1.2. I realize new animation and other capabilities are coming in visionOS 2.0, but I want to address this in the currently released visionOS version.
When I download and install Xcode 16 beta, I get the following error and the app does not open. Xcode 15.4 opens fine on an Apple M2 Max running Sonoma 14.5.
"Loading a plug-in failed. The plug-in or one of its prerequisite plug-ins may be missing or damaged and may need to be reinstalled.".
Does anyone know how to resolve this? I am trying to explore Timelines in Reality Composer Pro, which needs Xcode 16 beta or later.
For years, the preliminary behaviors provided a way to trigger an action sequence (now called a timeline) when the user came close to an object.
I could not find an equivalent in the new Reality Composer Pro.
I was trying to load an entity with Entity(named: sceneName, in: realityKitContentBundle), which works for many of my .usda files. But this time I got an error:
Error loading asset from scene PinballTable.usda: failed to load '7058602595919186152 Scene (RealityFileAsset)Bundle/RealityKitContent-RealityKitContent-resources/RealityKitContent.reality/Scene_14.compiledscene' (Asset provider load failed: type 'RealityFileAsset' -- Failed to load compiled data for asset path 'Scene_14.compiledscene', due to error: Failed to deserialize asset data.)
Any ideas on why this won't work? I checked the size of my .usda file; it's around 42 KB, so I don't think file size is the cause. Because this scene references many other .usda files, I suspect the bundle cannot locate those references. To test that theory I exported the whole scene as a .usdz file (118 KB), but I am not sure the references are really the issue; this is all I have tried so far.
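For reference, the call site is essentially this, with the thrown error surfaced (inside a RealityView; the scene name matches my file):

import RealityKit
import RealityKitContent

RealityView { content in
    do {
        let scene = try await Entity(named: "PinballTable", in: realityKitContentBundle)
        content.add(scene)
    } catch {
        print("Failed to load PinballTable:", error) // prints the deserialization error above
    }
}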
visionOS: visionOS beta 2 / Simulator 1.1 (neither works)
Xcode: 15.4 / 16.0 beta (neither works)
Hey, I need help achieving realistic fog and clouds in immersive spaces. 3D planes with transparent fog/cloud textures work, but they cause rendering issues when many of them overlap. I also can't get a good result with particles.
Thanks in advance!
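For the particle attempt specifically, this is roughly what I have been trying; a minimal sketch (the values are guesses to tune, and I may be misusing the emitter):

import RealityKit

// Sketch: a soft, slow "fog" volume from a particle emitter, using few,
// large, long-lived particles.
var emitter = ParticleEmitterComponent()
emitter.emitterShape = .sphere
emitter.mainEmitter.birthRate = 20
emitter.mainEmitter.lifeSpan = 12
emitter.mainEmitter.size = 1.5 // meters

let fog = Entity()
fog.components.set(emitter)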