Discuss spatial computing on Apple platforms and how to design and build an entirely new universe of apps and games for Apple Vision Pro.


Manipulate objects in Reality Composer Pro
The documentation at https://developer.apple.com/documentation/visionos/designing-realitykit-content-with-reality-composer-pro states: "Reality Composer Pro treats your imported assets as read-only." This is a huge obstacle for me, as I need to make multiple adjustments to the scene. At one point I managed to import one of my assets into the scene so that I could manipulate it directly, but I can't figure out how I did it. Since I have to prepare further assets and would like to do this directly in Reality Composer Pro, I'm looking for a way to load them into the scene so I can manipulate them. Any idea how this can be done?
Replies: 5 · Boosts: 0 · Views: 320 · Activity: Oct ’24
Is dismissWindow actually asynchronous?
On visionOS, I have discovered that if dismissWindow is followed immediately by a call to openWindow, the new window does not open where the user is looking; instead, it appears at the same location as the dismissed window. However, if I open the new window after a small delay, or after UIScene's willDeactivateNotification, the new window correctly opens in front of the user. (I tested this inside an open immersive space.) Does this imply that dismissWindow is actually asynchronous, in the sense that it needs extra time to reset certain internal state before the next openWindow can be called? What is the best practice for closing a window and then opening a new one in front of the user's current head position?
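A minimal sketch of the delayed-reopen workaround described above, assuming a SwiftUI view with access to the window environment actions; the window IDs and the delay value are placeholders, not a confirmed best practice:

```swift
import SwiftUI

struct WindowSwitcher: View {
    @Environment(\.openWindow) private var openWindow
    @Environment(\.dismissWindow) private var dismissWindow

    var body: some View {
        Button("Switch Windows") {
            Task {
                dismissWindow(id: "oldWindow")   // placeholder window ID
                // Without a short pause, the new window reuses the old window's placement.
                try? await Task.sleep(for: .milliseconds(100))
                openWindow(id: "newWindow")      // placeholder window ID
            }
        }
    }
}
```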
Replies: 0 · Boosts: 0 · Views: 204 · Activity: Oct ’24
How to set the AttractionCenter for a ParticleEmitterComponent in a System with real time updates
I am trying to achieve an effect where the particles of a particle system are attracted to my hand entity. The hand entity is essentially an AnchorEntity that is tracking my right hand.

```swift
let particleEmitterEntities = context.entities(matching: particleEmitterQuery, updatingSystemWhen: .rendering)

for particleEmitterEntity in particleEmitterEntities {
    if var particleEmitter = particleEmitterEntity.components[ParticleEmitterComponent.self] {
        // Trying to get the world-space position of the hand.
        // I also tried a position relative to particleEmitterEntity.
        particleEmitter.mainEmitter.attractionCenter = rightHandEntity.position(relativeTo: nil)
        particleEmitterEntity.components[ParticleEmitterComponent.self] = particleEmitter
    } else {
        fatalError("Cannot find particle emitter")
    }
}
```

The particle attraction center doesn't seem to update. Another issue I am noticing: with this code in place, my particle system often doesn't show the particle image and just renders a placeholder square; when I comment the code out, I get the right particle image. I believe this is due to how often this loop runs to update the position of the attraction center. What is the right way to build an effect where the particles are attracted to my hand?
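For context, a minimal sketch of where a loop like the one above might live inside a registered System; HandAttractionSystem, the query, and the "RightHand" entity name are placeholders, and this does not address the placeholder-square rendering issue:

```swift
import RealityKit

// Sketch only: type and entity names are assumptions, not the poster's code.
struct HandAttractionSystem: System {
    static let query = EntityQuery(where: .has(ParticleEmitterComponent.self))

    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        // Look up the hand anchor by a placeholder name.
        guard let hand = context.scene.findEntity(named: "RightHand") else { return }
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard var emitter = entity.components[ParticleEmitterComponent.self] else { continue }
            emitter.mainEmitter.attractionCenter = hand.position(relativeTo: nil)
            entity.components.set(emitter)
        }
    }
}

// Register once, before the RealityView content loads:
// HandAttractionSystem.registerSystem()
```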
Replies: 3 · Boosts: 0 · Views: 269 · Activity: Oct ’24
How to add entities to a volume or immersive view programmatically?
I have created two scenes, one immersive and one volumetric, using Reality Composer Pro. In my test app I can view both and they render correctly. However, I would like to add entities programmatically. I am trying this:

```swift
var body: some View {
    RealityView { content in
        if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
            viewModel.rootEntity = scene
            content.add(scene)

            let anchorEntity = AnchorEntity(world: [0, 0, -0.5])
            let sphere = MeshResource.generateSphere(radius: 2.0)
            let material = SimpleMaterial(color: .red, roughness: 0.5, isMetallic: true)
            let modelEntity = ModelEntity(mesh: sphere, materials: [material])
            anchorEntity.addChild(modelEntity)
            content.add(anchorEntity)
        }
    }
}
```

However, the sphere does not appear in the volume. I also tried it in the immersive space and it does not appear there either. What am I missing?
Replies: 1 · Boosts: 0 · Views: 227 · Activity: Oct ’24
Tips for developing shader graphs in Reality Composer Pro
I am new to the graph editor and was able to achieve some results. However, I am noticing that my graphs are getting very tangled, confusing, and hard to debug. I was wondering: is it possible to define variables that store the value of computations and refer to them in other parts of the graph, without having to link them graphically? This would help tidy up the tangled mess I've created. In the "Explore materials in Reality Composer Pro" video, I saw that it is possible to create "instances", but I am not sure whether that is what I need. For example, does the shader compiler optimize them so that there is no need to recompute each instance? Is there any functionality for debugging the graph, trying out inputs and seeing what the numeric outputs would be?
Replies: 1 · Boosts: 0 · Views: 314 · Activity: Oct ’24
Quick Look viewer (3D/AR) breaking in-app
We have a native iOS app that supports the upload and display of USDZ files. It has been working great since the beta (late 2022) and live launch (late 2023) until now. Recently, however, we had reports from some users on Max-model phones (14 and 15 Pro Max at least). When they tap and launch a 3D file, the Quick Look player is triggered. So far so good. But for affected users the controls along the top of the player - the X (close), the AR | Object toggle, and the share button - move too high up the phone screen and get stuck (untappable) behind the phone's top status bar (time, camera indicator, connection, battery). This means that when they open a USDZ file in AR or 3D view, they have to hard-close the app to get out of it again.

This doesn't happen when they open a USDZ file from Files, Dropbox, etc. on their phone (which also uses the Quick Look player). The controls only move up and get stuck when launching a USDZ from within our app. I'm at a loss to figure out what might be causing this on some phones and not others, and why only when opening a USDZ file from our app. So far we have replicated this issue on a single iPhone 14 Pro Max and a 15 Pro Max, both running iOS 18+. We have tested on other 15 Pro Max devices on the same OS, as well as Pros, regular iPhones, and minis, and they are not experiencing the issue.

You would think that a USDZ file is a USDZ file and that your iPhone knows what to do with it and opens it in the Quick Look player regardless of where you open the file from. Why would the navigation items move if you open the USDZ file from within our app, and why only for some users? We will continue to troubleshoot and test, but I wanted to throw this out to the community in case anyone has experienced this or has any theories that would expedite our testing. Your thoughts are most appreciated!

Here is a video showing the expected (correct) behaviour: https://www.dropbox.com/scl/fi/0sp8s4opaf2m4gukkcbrk/How-opening-a-USDZ-should-behave_correct-behaviour.MP4?rlkey=tzzau9x91mwox66gsgguryhep&st=qiykmne9&dl=0 and a screenshot attached below of what is happening on one of the affected users' iPhone 15 Pro Max.
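For reference, a minimal sketch of how a USDZ is typically presented in-app with the Quick Look player, assuming a UIKit host; the class name and file URL are placeholders, not the app's actual code:

```swift
import QuickLook
import UIKit

final class USDZPreviewPresenter: NSObject, QLPreviewControllerDataSource {
    private let fileURL: URL  // placeholder: local URL of the downloaded .usdz

    init(fileURL: URL) {
        self.fileURL = fileURL
    }

    func present(from host: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        host.present(preview, animated: true)
    }

    // MARK: QLPreviewControllerDataSource
    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController, previewItemAt index: Int) -> QLPreviewItem {
        fileURL as NSURL  // NSURL conforms to QLPreviewItem
    }
}
```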
Replies: 1 · Boosts: 0 · Views: 302 · Activity: Oct ’24
USDZ with Blend Shapes Workflow Recommendations
Hi, since RealityKit 4 now supports Blend Shapes, I was wondering if there are any workflow or tooling recommendations for baking/exporting them into a USDZ. Are Blender or Cinema 4D capable of doing that out of the box? Should we look into NVIDIA Omniverse (https://docs.omniverse.nvidia.com/connect/latest/blender/manual.htm)? So far this topic seems very sparsely documented and I would appreciate any hints. Thank you!
Replies: 2 · Boosts: 0 · Views: 407 · Activity: Oct ’24
Deactivated entities in scene not found with entity.findEntity(named:)?
I have an .rkassets package from which I load my scene. In the scene, I'm using entity.findEntity(named:"..") to find entities to activate/deactivate. When I have entities deactivated in the *.usda, they are not found with this method. Further inspection suggests that the deactivated entities are not compiled into the build at all. Is there any setting that prevents these deactivated entities from being skipped during the build?
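A minimal sketch of the lookup and re-activation pattern in question, assuming the scene has already been loaded from the bundle; the entity name is a placeholder:

```swift
// Assumes `scene` was loaded with Entity(named:in:) from the .rkassets bundle;
// "Door" is a placeholder entity name.
if let door = scene.findEntity(named: "Door") {
    door.isEnabled = true   // re-activate an entity that was deactivated in the .usda
} else {
    // Entities deactivated in the .usda don't seem to be found here,
    // which is the behavior being asked about.
    print("Entity not found in the compiled scene")
}
```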
Replies: 1 · Boosts: 0 · Views: 187 · Activity: Oct ’24
Creating and applying a shader to change an entity's rendering in RealityKit
Hello, I am looking to create a shader to update an entity's rendering. As a basic example, say I want to recolour an entity but leave its original textures showing through. I understand that on visionOS I need to use Reality Composer Pro to create the shader, but I'm lost as to how to reference the original colour that I'm trying to update in the node graph. All my attempts appear to completely override the textures in the entity (and its sub-entities) that I want to affect. Also, the tutorials and examples I've looked at appear to create materials rather than add an effect on top of existing materials. Any hints or pointers?

Assuming this is possible, I've been trying to load the material in code and apply it to an entity. But do I need to do this for all child entities, or just the topmost one?

```swift
do {
    let entity = MyAssets.createModelEntity(.plane) // Loads from bundle and performs config
    let material = try await ShaderGraphMaterial(named: "/Root/TestMaterial", from: "Test", in: realityKitContentBundle)
    entity.applyToChildren {
        $0.components[ModelComponent.self]?.materials = [material]
    }
    root.addChild(entity)
} catch {
    fatalError(error.localizedDescription)
}
```
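On the child-entity question, a sketch of a hypothetical recursive helper that applies one material to every ModelComponent in the hierarchy; applyMaterial is not a RealityKit API, only an illustration of the "all children" approach:

```swift
import RealityKit

// Hypothetical helper (not a RealityKit API): walks the entity tree and swaps in
// the given material wherever a ModelComponent is found.
func applyMaterial(_ material: any RealityKit.Material, to entity: Entity) {
    if var model = entity.components[ModelComponent.self] {
        model.materials = model.materials.map { _ in material }
        entity.components.set(model)
    }
    for child in entity.children {
        applyMaterial(material, to: child)
    }
}
```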
Replies: 3 · Boosts: 0 · Views: 334 · Activity: Oct ’24
RealityKit Subdivide
In the "Discover RealityKit APIs for iOS, macOS, and visionOS" presentation, there was a slide at the end highlighting new features not covered in the video. One of them was surface subdivision, but I have not been able to find any documentation or APIs that support this feature. Does anyone have any further details on how this works in RealityKit?
Replies: 4 · Boosts: 1 · Views: 614 · Activity: Jul ’24
Make subdivision surfaces work in Reality Composer Pro
Information is light on the new subdivision support for USD models in RealityKit, and so far I have been unable to get one of my models to actually subdivide within Reality Composer Pro or Quick Look (or when viewing on Vision Pro). I've exported a few test models from Houdini and verified that they contain 'uniform token subdivisionScheme = "catmullClark"'. I've started with some very lightweight, basic meshes, but when viewed they simply look like polygonal meshes; there's no subdividing occurring at runtime. Is there a trick to getting them to actually smooth out?
Replies: 5 · Boosts: 0 · Views: 305 · Activity: Oct ’24
Understanding the Distance for Physical Object Visibility in Vision Pro Immersive System
I understand that the system helps maintain user comfort by automatically adjusting the opacity of content in certain situations, like when someone moves too quickly or gets too close to a physical object. The content in front of them dims briefly to allow a clearer view of their surroundings. I'd like to know the specific distance at which the system begins to show the physical object, or what criteria are used for this adjustment.
Replies: 0 · Boosts: 0 · Views: 189 · Activity: Oct ’24
visionOS pushWindow being dismissed on app foreground
We seem to have found an issue when using the pushWindow action on visionOS. The issue occurs if the app is backgrounded and then reopened by selecting the app's icon on the Home Screen: any window that was opened via the pushWindow action is then dismissed. We've been able to replicate the issue in a small sample project.

Replication steps:
1. Open the app.
2. Open a window via the push action.
3. Press the Digital Crown.
4. On the Home Screen, select the app's icon again.
5. The pushed window is now dismissed.

There is a sample project linked here that shows off the issue, including a video of the bug in progress.
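For context, a minimal sketch of how a window might be pushed with the visionOS pushWindow action, assuming window IDs declared elsewhere in the App; the identifiers are placeholders and this is not the linked sample project:

```swift
import SwiftUI

struct MainWindowView: View {
    @Environment(\.pushWindow) private var pushWindow

    var body: some View {
        Button("Push Detail Window") {
            pushWindow(id: "detail")   // placeholder ID of a WindowGroup declared in the App
        }
    }
}
```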
Replies: 0 · Boosts: 1 · Views: 164 · Activity: Oct ’24
How to use CubeMap in Reality Composer Pro?
I followed the documentation at https://developer.apple.com/documentation/shadergraph/realitykit/cube-image-(realitykit). I have a .ktx file and use a Cube Image node to load it, then a Convert node, but it shows black. I checked it on my Vision Pro and it's still black; I don't know why. Is something wrong?

P.S. I also used an Image node to load the .ktx file and it shows one image, so I believe the .ktx file is fine. I also checked that on the Vision Pro.
Replies: 2 · Boosts: 0 · Views: 301 · Activity: Oct ’24