
Post Notification to RCP but Timeline won't fire
I am trying to use an OnNotification trigger in a Behaviors Component to fire my composed timeline, which consists of one TransformTo action, one Hide action, and a final Notification action indicating that the other two actions have finished. With this post, I successfully send a notification to RCP to fire the timeline by its identifier:

```swift
NotificationCenter.default.post(
    name: NSNotification.Name("RealityKit.NotificationTrigger"),
    object: nil,
    userInfo: [
        "RealityKit.NotificationTrigger.Scene": scene,
        "RealityKit.NotificationTrigger.Identifier": "onSomethingStart"
    ]
)
```

On the other side, to subscribe to the timeline's Notification action, I attach an onReceive modifier below my RealityView, and I successfully receive the notification:

```swift
private let notificationTrigger = NotificationCenter.default.publisher(
    for: Notification.Name("RealityKit.NotificationTrigger"))

guard let entity = out.userInfo?["RealityKit.NotificationTrigger.SourceEntity"] as? Entity,
      let notificationName = out.userInfo?["RealityKit.NotificationTrigger.Identifier"] as? String
else { return }
debugPrint("Received notification: \(notificationName), entity name: \(entity.name)")
```

So the timeline is firing, because the Notification action at its end reaches my code. But the other two actions just don't appear to do anything, even though playing the timeline in RCP works fine. Is there anything I missed to make it tick?

Xcode 16.1 beta / visionOS beta 9
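For clarity, here is a minimal sketch of how the publisher above is typically attached to the view; the closure parameter name `out` matches the guard in the snippet, and the placement directly on the RealityView is an assumption based on the description:

```swift
RealityView { content in
    // ... add the RCP scene to the content here ...
}
.onReceive(notificationTrigger) { out in
    // The RCP Notification action delivers its source entity and identifier
    // in userInfo under these keys.
    guard let entity = out.userInfo?["RealityKit.NotificationTrigger.SourceEntity"] as? Entity,
          let notificationName = out.userInfo?["RealityKit.NotificationTrigger.Identifier"] as? String
    else { return }
    debugPrint("Received notification: \(notificationName), entity name: \(entity.name)")
}
```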
2 replies · 0 boosts · 436 views · Sep ’24
Having trouble loading audio file resources from the RCP bundle
RealityKitContent bundle resource issue

Recently I keep running into weird loading bugs with the RealityKitContent bundle. I am trying to load an audio resource as an AudioFileResource or AudioFileGroupResource from a *.usda scene in the RealityKitContent bundle. My code is nothing complicated, simply:

```swift
let primPath: String = "/SampleAudios/SE_bounce_audio"
guard let resource = try? AudioFileGroupResource.load(
    named: primPath, from: "MyScene.usda", in: realityKitContentBundle
) else { return }
```

At runtime this "sometimes" fails (whenever I change something in RCP it sometimes works again, but the behavior is unpredictable) with: "Cannot find MyScene.usda:/SampleAudios/SE_bounce_audio in RealityKitContent.bundle". I put MyScene.usda in the root folder of the RealityKitContent package because I found that RealityKit cannot find any *.usda scene that isn't at the root level (which could be a bug in the way it indexes its files). I even double-checked my .usda file with usdview, and the primPath is definitely correct. I suspect there are some unknown issues with how the RealityKitContent package copies resources when building. I tried playing with Package.swift to manually copy all my resources into the package, but that didn't work, so for now I keep the file untouched (I only upgraded swift-tools-version to 6.0, since that is required for .visionOS(.v2)):

```swift
// swift-tools-version:6.0
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS(.v2)
    ],
    products: [
        // Products define the executables and libraries a package produces, and make them visible to other packages.
        .library(
            name: "RealityKitContent",
            targets: ["RealityKitContent"]),
    ],
    dependencies: [
        // Dependencies declare other packages that this package depends on.
        // .package(url: /* package url */, from: "1.0.0"),
    ],
    targets: [
        // Targets are the basic building blocks of a package. A target can define a module or a test suite.
        // Targets can depend on other targets in this package, and on products in packages this package depends on.
        .target(
            name: "RealityKitContent"
        ),
    ]
)
```

That is issue one, the RealityKitContent package build issue.

Audio file format issue

The other issue is about which audio file formats RCP supports. I remember a session (WWDC?) saying .wav and .mp4 are supported audio sources, yet when I set up Spatial Audio I find that *.wav or *.mp3 files can sometimes also be imported as an AudioSourceFile, and the behavior is unpredictable. With two *.wav files, SE_ball_hit_01.wav and SE_ball_hit_02.wav, only SE_ball_hit_01.wav is accepted; 02 is rejected as an unsupported format. Check my screenshots for the details of the two files; they have almost the same format (same sample rate and channel count). I understand there might be different requirements for a file to be used as Spatial versus Ambient audio, but I haven't figured them out, and I can't find anything helpful in the Apple documentation. So what are the rules?

Thanks for reading, and any thoughts are welcome.
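As a small diagnostic step (not a fix): replacing try? with do/catch makes the underlying error visible, which helps tell the bundle-indexing failure apart from a format problem. A minimal sketch, reusing the same call and names as the snippet above:

```swift
import RealityKit
import RealityKitContent

// Same load call as above, just with do/catch instead of try? so the
// underlying error message (e.g. the "Cannot find ..." failure) is printed.
func loadBounceAudio() -> AudioFileGroupResource? {
    let primPath = "/SampleAudios/SE_bounce_audio"
    do {
        return try AudioFileGroupResource.load(
            named: primPath, from: "MyScene.usda", in: realityKitContentBundle)
    } catch {
        print("Audio load failed: \(error)")
        return nil
    }
}
```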
1 reply · 0 boosts · 425 views · Aug ’24
SwiftUI's alert window doesn't automatically get focus in visionOS
I have three basic elements on this UI page: a View, an Alert, and a Toolbar. The Toolbar and Alert are attached to the View, and when I click a button on the Toolbar, my alert window shows up. Below is a simplified version of my code:

```swift
@State private var showAlert = false

HStack {
    // ...
}
.alert(Text("Quit the game?"), isPresented: $showAlert) {
    MyAlertWindow()
} message: {
    Text("Description text about this alert")
}
.toolbar {
    ToolbarItem(placement: .bottomOrnament) {
        MyToolBarButton(showAlert: $showAlert)
    }
}
```

In MyToolBarButton I just toggle the bound showAlert variable to open/close the alert window. When running on either the simulator or the device, the behavior is quite strange: after tapping MyToolBarButton, the alert window takes 2-3 seconds to show up, and all of its elements are grayed out, as if the whole window has lost focus. I have to drag the window's control bar below it to bring the window back into focus. And that is not the only issue: MyToolBarButton also cannot be pressed to close the alert window (even though the button on the alert window itself does close it). By the way, I open this window while my immersive view is open, although I tested that this doesn't affect anything here.

Any idea what's going on?

Xcode 16.1 / visionOS 2 beta 6
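For reference, a minimal sketch of what MyToolBarButton might look like, assuming it does nothing beyond toggling the bound flag (the button label is hypothetical):

```swift
import SwiftUI

struct MyToolBarButton: View {
    @Binding var showAlert: Bool

    var body: some View {
        // Toggles the flag that drives the .alert modifier on the parent view.
        Button("Quit") {
            showAlert.toggle()
        }
    }
}
```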
1 reply · 0 boosts · 466 views · Aug ’24
Entity.applyTapForBehaviors() only works on Simulator, not device
I created a simple Timeline animation in RCP with only a "Play Audio" action, plus a Behaviors Component with an "OnTap" trigger that fires this timeline. In my code I simply call Entity.applyTapForBehaviors() when something happens. The audio plays normally on the simulator but not on the device. Is there a potential bug causing this behavior?

Environment:
Simulator Version: visionOS 2.0 (22N5286g)
Xcode Version: Version 16.0 beta 4 (16A5211f)
Device Version: visionOS 2.0 beta (latest)
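For context, a minimal sketch of the call described above; the entity name "AudioEntity" and the loaded scene root sceneRoot are hypothetical placeholders:

```swift
// Forward a synthetic tap in code to the entity carrying the Behaviors
// Component, which should run its OnTap timeline and play the audio.
if let audioEntity = sceneRoot.findEntity(named: "AudioEntity") {
    audioEntity.applyTapForBehaviors()
}
```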
1 reply · 0 boosts · 404 views · Aug ’24
How to trigger actions with OnCollision in a Behaviors Component
This is all about using notifications to trigger actions in RCP's new Timeline system, as described in "Compose interactive 3D content in Reality Composer Pro".

I'm actually starting to wonder why Entity.applyTapForBehaviors() needs to be called in code at all to trigger content in a Behaviors Component, given that in the Behaviors Component we have already chosen OnTap to let a "Tap Notification" trigger our action (on a selected target object). By that logic, after selecting the OnCollision trigger I should be writing something like CollisionEvent.entityA.applyCollisionForBehaviors, which doesn't exist. And of course a collision on my object doesn't trigger the action (because I only set things up in RCP, not in code).

Setting aside for now that this post points out we could use the Behaviors Component's OnNotification trigger instead, I found that I can keep the OnTap trigger and simply call Entity.applyTapForBehaviors() inside my subscribed collision-began event handler, as sketched below. That actually works better than OnCollision.

So what is the design principle here? And how can I send a collision notification so that the Behaviors Component's OnCollision trigger actually works?
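A sketch of the workaround mentioned above, assuming content is the RealityViewContent and target is the entity whose Behaviors Component carries the OnTap trigger:

```swift
import RealityKit

// The collision-began event simply forwards a synthetic tap to the entity so
// its OnTap timeline runs. Keep the subscription alive somewhere (e.g. a
// stored property), otherwise it is cancelled immediately.
let subscription = content.subscribe(to: CollisionEvents.Began.self, on: target) { event in
    event.entityA.applyTapForBehaviors()
}
```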
0 replies · 0 boosts · 334 views · Aug ’24
Implementing a bouncing surface
I am trying to simulate a pinball game, and I want to use PhysicsBodyComponent and PhysicsMotionComponent to achieve that. I have tuned the parameters in PhysicsBodyComponent, but the result is not quite ideal so far. Imagine a fully inflated basketball bouncing high off the ground (ground vs. basketball). I assign a PhysicsBodyComponent and a CollisionComponent to both the basketball and the ground.

For the basketball, I set:
dynamic mode
mass 1, inertia .one
material restitution 1
angular damping and linear damping 0
addForce to make the basketball move and hit the ground

For the ground, I set:
static mode
mass 1, inertia .zero
material restitution 1
angular damping and linear damping 0

However, when the basketball hits the ground it isn't very bouncy; it behaves as if it were hitting cotton, and the linear speed dies off quickly. How can I achieve a bouncing effect like a real basketball hitting the ground?
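For reference, a minimal sketch of the configuration described above, using hypothetical ball and ground entities (shape sizes and the force value are assumptions):

```swift
import RealityKit

// Restitution 1 on both bodies, no damping, dynamic ball vs. static ground.
let bouncy = PhysicsMaterialResource.generate(friction: 0.5, restitution: 1.0)

var ballBody = PhysicsBodyComponent(
    massProperties: PhysicsMassProperties(shape: .generateSphere(radius: 0.12), mass: 1.0),
    material: bouncy,
    mode: .dynamic)
ballBody.linearDamping = 0
ballBody.angularDamping = 0
ball.components.set(ballBody)
ball.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.12)]))

var groundBody = PhysicsBodyComponent(
    massProperties: .default,
    material: bouncy,
    mode: .static)
ground.components.set(groundBody)
ground.components.set(CollisionComponent(shapes: [.generateBox(size: [2, 0.02, 2])]))

// Example impulse-style push toward the ground.
ball.addForce([0, -5, 0], relativeTo: nil)
```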
4 replies · 0 boosts · 923 views · Jul ’24
Failed loading .usda/.usdz from RealityKitContent package
I was trying to load an Entity with Entity(named: sceneName, in: realityKitContentBundle), which works for many of my .usda files. But this time I got an error:

```
Error loading asset from scene PinballTable.usda: failed to load '7058602595919186152 Scene (RealityFileAsset)Bundle/RealityKitContent-RealityKitContent-resources/RealityKitContent.reality/Scene_14.compiledscene' (Asset provider load failed: type 'RealityFileAsset' -- Failed to load compiled data for asset path 'Scene_14.compiledscene', due to error: Failed to deserialize asset data.)
```

Any ideas why this doesn't work? I checked the size of my .usda file; it's around 42 KB, so I don't think file size is the problem. Because this scene references many other .usda files, I suspect the bundle cannot locate those references. I also exported the whole scene as a .usdz file, which comes to 118 KB, wondering whether the references could be the issue affecting the loading result, but that is all I have tried so far.

visionOS: visionOS beta 2 / simulator 1.1 (neither works)
Xcode: 15.4 / 16.0 beta (neither works)
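As a small diagnostic (not a fix), loading with do/catch and the async Entity initializer surfaces the full error at the call site instead of only in the console; the scene name is taken from the error message above:

```swift
import RealityKit
import RealityKitContent

func loadPinballTable() async -> Entity? {
    do {
        return try await Entity(named: "PinballTable", in: realityKitContentBundle)
    } catch {
        // Prints the deserialization failure, including the failing asset path.
        print("Failed to load PinballTable: \(error)")
        return nil
    }
}
```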
1 reply · 0 boosts · 595 views · Jul ’24
Coordinate system in SharePlay + spatial Persona experience
I cannot figure out how SharePlay + spatial Personas place the origin of RealityKit's coordinate system.

I have a visionOS app with an immersiveSpace(.mixed) scene. In the scene I use ARKit to track my hand and create a virtual object that follows the movement of my palm. Every frame I query positions from the HandAnchor to update the position of my object, using originFromAnchorTransform to place the object correctly in the scene.

However, when I adopt this in a SharePlay experience with spatial Personas, the virtual object's position updates become a mess. With either template (.sideBySide or .conversational), the origin of my space appears with no discernible pattern; the virtual object never follows my hand and ends up in a seemingly random place. It seems there is a difference/transform between the HandAnchor's origin and the immersive space's origin under spatial Persona + SharePlay, isn't there?

Or is there something I could do with convert(displacement.inverse.rotation, from: .immersiveSpace, to: .scene), as mentioned in https://developer.apple.com/documentation/realitykit/realitycoordinatespaceconverting and https://developer.apple.com/documentation/swiftui/environmentvalues/immersivespacedisplacement, to compute the translation and apply it to my virtual object? I tried but it isn't working yet. Can someone tell me how to do this correctly?
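For context, a rough sketch of the per-frame hand-following described above (handTracking, palmObject, and the right-hand filter are assumptions); this is the part that works outside SharePlay:

```swift
import ARKit
import RealityKit

// Outside SharePlay this keeps the object on the palm; under spatial Personas
// the same transform ends up in the wrong place, which is the question here.
func followPalm(handTracking: HandTrackingProvider, palmObject: Entity) async {
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked, anchor.chirality == .right else { continue }
        // originFromAnchorTransform is relative to the immersive space's world origin.
        palmObject.transform = Transform(matrix: anchor.originFromAnchorTransform)
    }
}
```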
2 replies · 0 boosts · 768 views · May ’24
The behavior of hands interacting with objects in Vision Pro
I am developing an immersive application featuring hands interacting with my virtual objects. When my hand passes through an object, the rendered color of my hand looks like a blend of the hand color and the object's color, with both appearing semi-transparent. I wonder if it is possible to make my hand always "opaque", i.e. the alpha of the rendered hand (since it's video see-through) is always 1, while the object's alpha could vary depending on whether it is interacting with the hand. (I was thinking this kind of feature might be provided by a specific component, just like HoverEffectComponent, but I didn't find one.)
0 replies · 0 boosts · 481 views · May ’24
Default RealityViewContent has multiple anchors?
I am developing a mixed-immersive native app on Vision Pro. In my RealityView, I add my scene with content.add(mainGameScene). Normally the anchored position (the origin coordinate) should be at the device position but projected onto the ground (y == 0 at the floor); at least this is how I understand RealityViewContent to work. So if I place something at position (0, 0, -1.0), the object should appear in front of me on the floor (with the z axis pointing backwards).

However, I recently loaded a different scene and added it with the same code, content.add(mainGameScene), and something has changed: my scene is anchored to the floor or the ceiling seemingly at random, depending on where I stand or sit. When I turn on the anchoring visualizations, I can see that the anchor point being used is on the ceiling, while the correct one (around my feet) is left over there. How can I switch to the correct anchor position? Or is there a setting that changes the behavior of the default RealityViewContent?
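Not a confirmed fix, but one way to take the implicit anchoring out of the equation is to parent the scene to an explicit floor anchor; a minimal sketch reusing mainGameScene and content from above (the minimum bounds are an assumption):

```swift
import RealityKit

// Pin the scene to a detected horizontal floor plane instead of relying on
// the default RealityViewContent origin.
let floorAnchor = AnchorEntity(
    .plane(.horizontal, classification: .floor, minimumBounds: [0.5, 0.5]))
floorAnchor.addChild(mainGameScene)
content.add(floorAnchor)
```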
3 replies · 0 boosts · 628 views · Apr ’24
I know it's sensitive to access the user's eye data, but
Unity's PolySpatial supports a HoverEffect on GameObjects. As I understand it, this basically means that even though I never learn exactly which entity the user is looking at, I can still ask the system to "change this entity's mesh to another color" when it is gazed at, just like hoverEffect on a SwiftUI component. So I wonder whether RealityKit provides a closure the system could fire as my callback?
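For comparison, RealityKit's built-in HoverEffectComponent applies a system-rendered gaze highlight without exposing gaze data or firing a developer callback; a minimal sketch (the entity name and collision shape are hypothetical):

```swift
import RealityKit

// The entity also needs input-target and collision components to be
// eligible for hovering; the highlight itself is applied by the system.
myEntity.components.set(InputTargetComponent())
myEntity.components.set(CollisionComponent(shapes: [.generateBox(size: [0.1, 0.1, 0.1])]))
myEntity.components.set(HoverEffectComponent())
```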
1 reply · 0 boosts · 584 views · Mar ’24
[RealityKit] Cannot OpenImmersiveView
I am developing my project much like HappyBeam: after playing a short game round, a main menu pops up letting the player play again or go back to the main menu.

When I start my first game after installing the app on my Vision Pro, it works fine. Just like HappyBeam, it counts down from 3 to 1 and then calls await openImmersiveSpace(id: "***") to enter my game. After the round finishes, I call await dismissImmersiveSpace() and reset my game, ready for the next round, letting the player choose to play again or return to the menu.

However, the next time the counter counts down from 3 to 1, the immersive view doesn't show up and the visionOS menu appears instead (I guess because the immersive space could not be opened). Some errors from my logger:

```
<FBSWorkspaceScenesClient:0x281820e00 com.apple.frontboard.systemappservices> scene request failed to return scene with error response : <NSError: 0x2839bc270; domain: FBSWorkspaceErrorDomain; code: 1 ("InvalidScene"); "scene invalidated before create completion">
------------------------------------------------
Unable to present an Immersive Space for id 'ImmersiveSpace': Error Domain=FBSWorkspaceErrorDomain Code=1 "scene invalidated before create completion" UserInfo={BSErrorCodeDescription=InvalidScene, NSLocalizedFailureReason=scene invalidated before create completion}
------------------------------------------------
Error: BSLogAddStateCaptureBlockWithTitle(EventDeferringState:com.milanow.mygame:SFBSystemService-C90B0828-4522-4098-9E6A-0D5968CFCEB8) state data format error: <NSError: 0x283947360; domain: BSSharedStateCapturing; code: 1; "Input generated no data"> {
    NSUnderlyingError = <__NSCFError: 0x2839451d0; domain: NSCocoaErrorDomain; code: 3851> {
        NSDebugDescription = Property list invalid for format: 200 (property lists cannot contain NULL);
    };
}
```

I wonder what is happening here, since I couldn't find anything helpful online. (The openImmersiveSpace code is probably not very interesting, but I'll still post it here:)

```swift
var body: some Scene {
    Group {
        WindowGroup(id: "MainUI") {
            MainView()
        }
        .windowStyle(.plain)
        .windowResizability(.contentSize)

        ImmersiveSpace(id: "ImmersiveSpace") {
            GameView()
        }
        .immersionStyle(selection: .constant(.mixed), in: .mixed)
    }
    .onChange(of: gameModel.state) { oldValue, newValue in
        guard oldValue != newValue else { return }
        if case let .dismissingImmersiveSpace(finish) = newValue {
            Task {
                await dismissImmersiveSpace()
                openWindow(id: "MainUI")
                finish()
            }
        } else if case let .openingImmersiveSpace(startPlaying) = newValue {
            Task {
                await openImmersiveSpace(id: "ImmersiveSpace")
                dismissWindow(id: "MainUI")
                startPlaying()
            }
        } else if case .playing = oldValue {
            openWindow(id: "MainUI")
        } else if case .playing = newValue {
            dismissWindow(id: "MainUI")
        }
    }
}
```

As you can see, nothing magical here; it just follows what HappyBeam does. (I wonder what actually makes a scene "invalid".)
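One diagnostic worth noting (an assumption on my part, not something taken from HappyBeam): openImmersiveSpace returns a result, so the opening branch can check whether the system actually created the space before dismissing the main window:

```swift
Task {
    let result = await openImmersiveSpace(id: "ImmersiveSpace")
    switch result {
    case .opened:
        dismissWindow(id: "MainUI")
        startPlaying()
    case .userCancelled, .error:
        // Logs whether the system refused to create the scene on the second attempt.
        print("Immersive space did not open: \(result)")
    @unknown default:
        break
    }
}
```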
1 reply · 0 boosts · 680 views · Feb ’24
[visionOS] Entity spawned at runtime won't respond to my gesture
I'm creating an immersive experience with RealityView (think of it as a Fruit Ninja-like experience). Say I have some fruits that are randomly generated, by certain criteria, in a System.update function, and I want to interact with these generated fruits using some hand gesture. It simply doesn't work: the gesture's onChanged closure doesn't fire as I expect. I have added both an InputTargetComponent and a CollisionComponent to make them detectable in an immersive view. Everything works fine if I place these fruits in the scene with Reality Composer Pro before the app runs.

Here is what I did. First, I load the fruit template:

```swift
let tempScene = try await Entity(named: fruitPrefab.usda, in: realityKitContentBundle)
fruitTemplate = tempScene.findEntity(named: "fruitPrefab")
```

Then I clone it in the System.update(context:) function; parent is an invisible object placed at .zero in my loaded immersive scene:

```swift
let fruitClone = fruitTemplate!.clone(recursive: true)
fruitClone.position = pos
fruitClone.scale = scale
parent.addChild(fruitClone)
```

I attach my gesture to the RealityView with:

```swift
.gesture(DragGesture(minimumDistance: 0.0)
    .targetedToAnyEntity()
    .onChanged { value in
        print("dragging")
    }
    .onEnded { tapEnd in
        print("dragging ends")
    }
)
```

I was wondering whether a runtime-generated entity isn't tracked by RealityView, but since I add it as a child of a placeholder entity already in the scene, it should be fine... right? Or do I just need to add a new AnchorEntity there? Thanks for any advice in advance; I've been trying this out for the whole day.
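Not a confirmed fix, but one thing worth checking is whether the runtime clone actually carries its own input-target and collision setup; a minimal sketch (the sphere shape and radius are assumptions):

```swift
// Without an InputTargetComponent and real collision shapes on the clone,
// targetedToAnyEntity() will never hit it.
fruitClone.components.set(InputTargetComponent())
fruitClone.components.set(
    CollisionComponent(shapes: [.generateSphere(radius: 0.05)]))
```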
2 replies · 1 boost · 973 views · Oct ’23