RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Posts under RealityKit tag

200 Posts


Can't Tap
I am attempting to execute actions after tapping an entity in a RealityView using the Behaviors component. I have added the InputTargetComponent and a tap gesture as follows:

```swift
TapGesture()
    .targetedToAnyEntity()
    .onEnded { value in
        _ = value.entity.applyTapForBehaviors()
    }
```

However, during testing I have observed that the entity does not appear to recognize the tap gesture. Could you kindly provide any relevant documentation or guidance on this matter?
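For reference, a common cause: RealityKit gestures hit-test against collision shapes, so the entity needs a CollisionComponent in addition to InputTargetComponent. A minimal sketch of that setup (the helper function is hypothetical):

```swift
import RealityKit

// Hypothetical helper: give an entity everything gesture hit-testing needs.
func makeTappable(_ entity: Entity) {
    // A collision shape sized to the entity's visual bounds.
    let bounds = entity.visualBounds(relativeTo: nil)
    entity.components.set(CollisionComponent(shapes: [.generateBox(size: bounds.extents)]))
    // Marks the entity as a target for SwiftUI gestures.
    entity.components.set(InputTargetComponent())
}
```

Without a collision shape, targetedToAnyEntity() gestures silently never fire.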
Replies: 1 · Boosts: 1 · Views: 393 · Activity: Aug ’24
Reality Composer Pro 2.0 shader graphs can't be loaded on visionOS 1
Using Reality Composer Pro 2.0, I created a simple shader graph that displays a texture on an unlit surface.

On visionOS 2 beta, I can successfully use ShaderGraphMaterial(named:from:in:) to load that shader graph material and assign it to a model entity. However, on visionOS 1.2 and earlier, either in Simulator or on the device, ShaderGraphMaterial(named:from:in:) fails and I see an error logged to the console.

If, using Reality Composer Pro 1.0, I experimentally open the same project and delete and recreate exactly the same nodes, then ShaderGraphMaterial(named:from:in:) works as expected on visionOS 1.2.

Is it a known issue that Reality Composer Pro 2 can't be used with visionOS 1? Is this intentional behavior? I've submitted feedback as FB14828873, including a sample project and repro steps. If possible, I would appreciate guidance from an Apple engineer, like "This is a known issue for [list of node types]", "Reality Composer Pro 2 is not supported for visionOS 1 development, please refer to [documentation]", or "We recommend [workaround]". Thank you.
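Until this is clarified, one hedged workaround is to gate the load on the OS version and fall back to a plain material on visionOS 1.x. A sketch, where the material path and file name are hypothetical:

```swift
import RealityKit
import RealityKitContent

// Sketch: prefer the RCP 2.0 shader graph on visionOS 2, fall back otherwise.
func loadSurfaceMaterial() async -> any Material {
    if #available(visionOS 2.0, *) {
        if let material = try? await ShaderGraphMaterial(
            named: "/Root/UnlitTextureMaterial",   // hypothetical node path
            from: "Materials/Scene.usda",          // hypothetical file in the bundle
            in: realityKitContentBundle
        ) {
            return material
        }
    }
    // visionOS 1.x fallback, since the RCP 2.0 graph fails to load there.
    return UnlitMaterial(color: .white)
}
```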
Replies: 6 · Boosts: 0 · Views: 851 · Activity: Oct ’24
Drag Gesture on Entity with PhysicsBodyComponent is not behaving properly (glitching)
Hi everyone, I'm new to Swift and visionOS development in general, so please go easy on me. Currently, I'm looking at a sample project from a WWDC23 session that uses RealityKit and ARKit to add a cube entity to a scene via a tap gesture. The link to the sample project is here. Instead of adding a cube, I changed the code to add a USDZ model instead. Here is my code:

```swift
func add3DModel(tapLocation: SIMD3<Float>) {
    let placementLocation = tapLocation + SIMD3<Float>(0, 0.1, 0)
    guard let entity = try? Entity.load(named: "cake-usdz", in: realityKitContentBundle) else {
        logger.error("failed to load 3D model")
        return
    }

    // calculate the collision box (the boundaries)
    let entitySize = entity.visualBounds(relativeTo: nil)
    let width = entitySize.max.x - entitySize.min.x
    let height = entitySize.max.y - entitySize.min.y
    let depth = entitySize.max.z - entitySize.min.z

    // set collision shape
    let collisionShape = ShapeResource.generateBox(size: SIMD3<Float>(width, height, depth))
    entity.components.set(CollisionComponent(shapes: [collisionShape]))

    // set the position and input types to indirect
    entity.setPosition(placementLocation, relativeTo: nil)
    entity.components.set(InputTargetComponent(allowedInputTypes: .indirect))

    let material = PhysicsMaterialResource.generate(friction: 0.8, restitution: 0.0)
    entity.components.set(PhysicsBodyComponent(
        shapes: [collisionShape],
        mass: 1.0,
        material: material,
        mode: .dynamic
    ))
    contentEntity.addChild(entity)
}
```

This works fine so far. But when I tried to add a drag gesture to move the added entity around, there were weird glitches with the model: it jumped up and down, and even rotated around itself sometimes. Below is my code for the drag gesture; I placed it directly below the code for the spatial tap gesture in the sample project.

```swift
.gesture(DragGesture().targetedToAnyEntity().onChanged({ value in
    let targetedEntity = value.entity
    targetedEntity.position = value.convert(value.location3D, from: .local, to: .scene)
}))
```

At first, I thought my code was wrong. But after looking around and removing the PhysicsBodyComponent from the added model, the entity moved as intended while dragging. I can't figure out a solution to this. Could anyone help me? I'm currently on Xcode 16 beta 2 and visionOS 2.0. Because I'm on beta, I'm unsure if this is a bug or if I just missed something. Thank you.
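For reference, a common pattern for dragging a physics-backed entity: switch the body to kinematic while the drag is active, so the simulation stops fighting the manual position updates, then restore dynamic mode on release. A minimal sketch under that assumption:

```swift
.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            let entity = value.entity
            // Suspend simulation while the user drives the position.
            if var body = entity.components[PhysicsBodyComponent.self], body.mode != .kinematic {
                body.mode = .kinematic
                entity.components.set(body)
            }
            entity.position = value.convert(value.location3D, from: .local, to: .scene)
        }
        .onEnded { value in
            // Hand control back to the physics simulation.
            if var body = value.entity.components[PhysicsBodyComponent.self] {
                body.mode = .dynamic
                value.entity.components.set(body)
            }
        }
)
```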
Replies: 1 · Boosts: 0 · Views: 403 · Activity: Aug ’24
Camera control in an immersive environment
Hello, I’m playing around with making a fully immersive multiplayer, air-to-air dogfighting game, but I’m having trouble figuring out how to attach a camera to an entity. I have a plane that’s controlled with a gamepad, and I want the camera’s position to be pinned to that entity as it moves about space, while maintaining the user’s ability to look around. Is this possible?

From my understanding, the current state of SceneKit, ARKit, and RealityKit is a bit confusing with what can and cannot be done.

SceneKit
- Full control of the camera
- Not sure if it can use RealityKit's ECS system
- 2D window; missing full immersion

ARKit
- Full control of the camera, but only for non-Vision Pro devices, since visionOS doesn't have an ARView
- Has RealityKit's ECS system
- 2D window; missing full immersion

RealityKit
- Camera is pinned to the device's position and orientation
- Has RealityKit's ECS system
- Allows full immersion
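For reference, since the visionOS camera always follows the head, one common workaround (an assumption, not an official camera API) is to move the scene root by the inverse of the plane's transform each frame, so the world flies past a stationary user as if the camera were parented to the plane. A sketch with hypothetical entity names:

```swift
import RealityKit

// Hypothetical: `worldRoot` parents the whole environment; `plane` is the
// aircraft, kept OUTSIDE worldRoot so the inverse doesn't feed back into it.
func updateChaseCamera(worldRoot: Entity, plane: Entity) {
    // Moving the world by the inverse of the plane's world transform is
    // equivalent to pinning the camera to the plane, while head motion
    // still provides natural look-around on top.
    let planeTransform = plane.transformMatrix(relativeTo: nil)
    worldRoot.setTransformMatrix(planeTransform.inverse, relativeTo: nil)
}
```

Calling this from a per-frame update (for example, a custom System) keeps the follow smooth.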
Replies: 0 · Boosts: 1 · Views: 402 · Activity: Aug ’24
Program runs fine in Vision Pro simulator but not on the actual device
We are encountering an issue with our app on Vision Pro with OS version 1.3. The app runs perfectly in the visionOS Simulator, but when tested on the actual device, no content is displayed. Here’s the expected behavior: when the app launches, a video should play in a window. Once the video ends, another information window should open. After a series of these information windows, we load into an immersive space to handle 3D elements. We've set the "Preferred Default Scene Session Role" to "Window Application Session Role" in Info.plist, but the issue persists. Below is the code we're using. Any advice or suggestions would be greatly appreciated.

```swift
import SwiftUI

@main
struct myApp: App {
    @StateObject var sharedData = SharedDataModel()
    @State private var isFactoryEnabled = false

    var body: some Scene {
        WindowGroup(id: "LaunchScreen", content: {
            LaunchScreen()
        })
        .windowStyle(.plain)
        .environmentObject(sharedData)

        WindowGroup(id: "LoginView", content: {
            ZStack {
                let _ = UserDefaults.standard.set(false, forKey: "_UIConstraintBasedLayoutLogUnsatisfiable")
                let _ = print(FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!.path)
                LoginView()
            }
        })
        .windowStyle(.plain)
        .environmentObject(sharedData)

        WindowGroup(id: "TrainingSelection", content: {
            if !sharedData.showNavigationHintView {
                NavigationHintView()
                    .glassBackgroundEffect()
                    .cornerRadius(30)
            } else {
                TrainingSelection()
            }
        })
        .windowStyle(.plain)
        .environmentObject(sharedData)

        WindowGroup(id: "Salutations", content: {
            Salutations()
        })
        .windowStyle(.plain)
        .environmentObject(sharedData)

        WindowGroup {
            ContentView()
        }
        .environmentObject(sharedData)

        ImmersiveSpace(id: "myImmersiveSpace") {
            ImmersiveView(viewModel: .init())
        }
        .environmentObject(sharedData)
    }
}
```

```swift
import SwiftUI
import AVFoundation
import RealityKit
import RealityKitContent

struct LaunchScreen: View {
    @State private var player: AVPlayer?
    @State private var navigateToContentView = false
    @EnvironmentObject var audioPlayer: AudioPlayer

    var body: some View {
        ZStack {
            ZStack {
                if navigateToContentView {
                    WarningView()
                        .transition(.opacity)
                        .glassBackgroundEffect()
                        .cornerRadius(15)
                } else {
                    if let player = player {
                        AVPlayerView(player: player)
                            .onAppear {
                                player.play()
                                addObserver()
                            }
                            .cornerRadius(30)
                    } else {
                        Text("Unable to Load the Video")
                            .foregroundColor(.white)
                            .onAppear {
                                loadVideo()
                            }
                    }
                }
            }
            .edgesIgnoringSafeArea(.all)
            .animation(.easeIn, value: 1)
        }
        .glassBackgroundEffect()
    }

    private func loadVideo() {
        if let videoUrl = Bundle.main.url(forResource: "launchScreen", withExtension: "mp4") {
            player = AVPlayer(url: videoUrl)
        } else {
            print("Unable to Load the Video")
        }
    }

    private func addObserver() {
        NotificationCenter.default.addObserver(
            forName: .AVPlayerItemDidPlayToEndTime,
            object: player?.currentItem,
            queue: .main
        ) { _ in
            self.navigateToContentView = true
        }
    }
}
```
Replies: 1 · Boosts: 0 · Views: 341 · Activity: Aug ’24
RealityView in macOS, Skybox, and lighting issue
I am testing RealityView on a Mac, and I am having trouble controlling the lighting. I initially add a red cube, and everything is fine (see figure 1). I then activate a skybox with a star field; the star field appears, and the red cube is then lit only by the star field. When I deactivate the skybox, expecting the original lighting to return, the cube continues to be lit by the skybox. The background no longer shows the skybox, but the cube is never lit like it originally was.

Is there a way to return the lighting of the model to the original lighting I had before adding the skybox? I seem to recall ARView's environment property had both a lighting.resource and a background, but I don't see both of those properties in RealityViewCameraContent's environment.

Sample code for macOS 15.1 Beta (24B5024e), Xcode 16.0 beta (16A5171c):

```swift
struct MyRealityView: View {
    @Binding var isSwitchOn: Bool
    @State private var blueNebulaSkyboxResource: EnvironmentResource?

    var body: some View {
        RealityView { content in
            // Create a red cube 10 cm on a side
            let mesh = MeshResource.generateBox(size: 0.1)
            let simpleMaterial = SimpleMaterial(color: .red, isMetallic: false)
            let model = ModelComponent(mesh: mesh, materials: [simpleMaterial])
            let redBoxEntity = Entity()
            redBoxEntity.components.set(model)
            content.add(redBoxEntity)

            // Load skybox
            let blueNeb2Name = "BlueNeb2"
            blueNebulaSkyboxResource = try? await EnvironmentResource(named: blueNeb2Name)
        } update: { content in
            if let skybox = blueNebulaSkyboxResource, isSwitchOn {
                content.environment = .skybox(skybox)
            } else {
                content.environment = .default
            }
        }
        .realityViewCameraControls(CameraControls.orbit)
    }
}
```

Figure 1 (default lighting before adding the skybox)
Figure 2 (after activating the skybox with the star field; the cube is lit by / reflects the skybox)
Figure 3 (after removing the skybox by setting content.environment to .default; the cube still reflects the skybox and is hard to see)
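For reference, one hedged workaround (assuming your deployment target has RealityKit's image-based-light components; "NeutralStudio" is a hypothetical resource name): attach an explicit IBL to the cube so its lighting no longer tracks the view's environment at all.

```swift
// Hypothetical: give the cube its own image-based light, decoupled from
// content.environment, so toggling the skybox can't change its look.
if let ibl = try? await EnvironmentResource(named: "NeutralStudio") {
    redBoxEntity.components.set(ImageBasedLightComponent(source: .single(ibl)))
    redBoxEntity.components.set(
        ImageBasedLightReceiverComponent(imageBasedLight: redBoxEntity)
    )
}
```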
Replies: 1 · Boosts: 0 · Views: 388 · Activity: Aug ’24
Generating MeshResource.Skeleton for IKComponent in Reality Composer Pro/Xcode
I'm building a visionOS 2.0 app where the AVP user can change the position of the end effector of a robot model, which was generated in Reality Composer Pro (RCP) from primitive shapes. I'd like to use an IKComponent to achieve this functionality, following the example code here. I am able to load my entity and access its MeshResource following the IKComponent example code, but on the line

```swift
let modelSkeleton = meshResource.contents.skeletons[0]
```

I get an error, since my MeshResource does not include a skeleton. Is there some way to directly generate the skeleton with my entities in RCP, or is there a way to add a skeleton generated in Xcode to an existing MeshResource that corresponds to my entities generated in RCP? I have tried using MeshSkeletonCollection.insert() with a skeleton I generated in Xcode, but I cannot figure out how to assign this skeleton collection to the MeshResource of the entity.
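For the second question, a hedged sketch of one possible approach: a MeshResource is immutable, but its contents value can be copied, edited, and used to generate a new mesh, which is then assigned back to the entity's ModelComponent. (The insert call follows the MeshSkeletonCollection.insert() mentioned in the post; generate(from:) is the standard way to rebuild a mesh from edited contents.)

```swift
import RealityKit

// Sketch: attach a code-generated skeleton to an existing mesh.
func attach(_ skeleton: MeshResource.Skeleton,
            to modelEntity: ModelEntity) throws {
    guard var model = modelEntity.model else { return }

    // Copy the mesh contents, add the skeleton, and rebuild the mesh.
    var contents = model.mesh.contents
    contents.skeletons.insert(skeleton)
    let newMesh = try MeshResource.generate(from: contents)

    // Swap the rebuilt mesh back into the entity's model component.
    model.mesh = newMesh
    modelEntity.model = model
}
```

Note that for IK to actually deform the mesh, the mesh parts also need joint influences mapped to the skeleton's joints, which primitives exported from RCP presumably won't have by default.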
Replies: 5 · Boosts: 0 · Views: 344 · Activity: Aug ’24
Building a custom render pipeline with RealityKit
Hello experts and question seekers, I have been trying to get Gaussian splats working with RealityKit; however, it has not worked out for me. The library I use for Gaussian splatting: https://github.com/scier/MetalSplatter

My idea was to use the renderer provided by RealityKit (RealityRenderer, https://developer.apple.com/documentation/realitykit/realityrenderer) and the renderer provided by MetalSplatter (SplatRenderer, https://github.com/scier/MetalSplatter/blob/main/MetalSplatter/Sources/SplatRenderer.swift). Then, with a custom render pipeline, I would be able to compose the outputs of the renderers. This would enable, for example, building immersive scenery with realistic environment scans as Gaussian splats, with RealityKit providing the features to build extra scenery around the splats, e.g. dynamic 3D models inside Gaussian splats.

However, the problem is that, as of now, I am not able to do that with the current implementation of RealityRenderer. First, RealityRenderer seems to be an API just for rendering color information onto a texture, which at first glance might be useful, but it misses important information such as depth and stencil. Second, even with that in mind, I am currently not able to execute RealityRenderer.updateAndRender, due to the following error messages:

Could not resolve material name 'engine:BuiltinRenderGraphResources/Common/realityRendererBackground.rematerial' in bundle at '/Users//Library/Developer/CoreSimulator/Devices//data/Containers/Bundle/Application//.app'. Loading via asset path.
exiting spatial tracking service update thread because wait returned 37

I was able to build a custom Metal view with UIViewRepresentable, MTKView, and MTKViewDelegate, enabling me to build a custom rendering pipeline by utilising some of the Metal developer workflows (reference: https://developer.apple.com/documentation/xcode/metal-developer-workflows/). Inside draw(in view: MTKView), in a class conforming to MTKViewDelegate:

```swift
guard let currentDrawable = view.currentDrawable else { return }
let realityRenderer = try! RealityRenderer()
try! realityRenderer.updateAndRender(
    deltaTime: 0.0,
    cameraOutput: .init(.singleProjection(colorTexture: currentDrawable.texture)),
    whenScheduled: { realityRenderer in
        print("Rendering scheduled")
    },
    onComplete: { realityRenderer in
        print("Rendering completed")
    }
)
```

Can you please tell me what I am doing wrong? Is there any solution that enables me to use RealityKit with, for example, Gaussian splats? Any help is greatly appreciated.

All the best,
Ethem Kurt
Replies: 0 · Boosts: 1 · Views: 331 · Activity: Aug ’24
Object Capture API crashes frequently when starting to generate the model
I have updated the sample code so that the scan will start generating when 15 photos are captured. I hope I can catch this error so the app won't crash. I really need help on this, thank you in advance!

```
Hardware Model:      iPhone14,2
OS Version:          iPhone OS 17.6.1 (21G93)
Exception Type:      EXC_BREAKPOINT (SIGTRAP)
Exception Codes:     0x0000000000000001, 0x000000023363518c
Termination Reason:  SIGNAL 5 Trace/BPT trap: 5
Terminating Process: exc handler [525]
Triggered by Thread: 0

Thread 0 name:
Thread 0 Crashed:
0  RealityKit_SwiftUI  0x000000023363518c CoveragePointCloudMiniView.interfaceOrientation.getter + 508 (CoveragePointCloudMiniView.swift:0)
1  RealityKit_SwiftUI  0x0000000233634cdc closure #1 in closure #2 in CoveragePointCloudMiniView.body.getter + 124 (CoveragePointCloudMiniView.swift:75)
2  RealityKit_SwiftUI  0x000000023363db9c partial apply for closure #1 in closure #2 in CoveragePointCloudMiniView.body.getter + 20 (:0)
3  SwiftUI             0x0000000195c4bbac closure #1 in withTransaction(::) + 276 (Transaction.swift:243)
4  SwiftUI             0x0000000195c4ba90 partial apply for closure #1 in withTransaction(::) + 24 (:0)
5  libswiftCore.dylib  0x00000001903f8094 withExtendedLifetime<A, B>(::) + 28 (LifetimeManager.swift:27)
6  SwiftUI             0x0000000195b17d78 withTransaction(::) + 72 (Transaction.swift:228)
7  SwiftUI             0x0000000195b17d04 withAnimation(::) + 116 (Transaction.swift:280)
8  RealityKit_SwiftUI  0x0000000233634bfc closure #2 in CoveragePointCloudMiniView.body.getter + 664 (CoveragePointCloudMiniView.swift:73)
9  SwiftUI             0x0000000195bef134 closure #1 in closure #1 in SubscriptionView.Subscriber.updateValue() + 72 (SubscriptionView.swift:66)
10 SwiftUI             0x0000000195b3f57c thunk for @escaping @callee_guaranteed () -> () + 28 (:0)
11 SwiftUI             0x0000000195b3c864 static Update.dispatchActions() + 1140 (Update.swift:151)
12 SwiftUI             0x0000000195b3bedc static Update.end() + 144 (Update.swift:58)
13 SwiftUI             0x0000000195a691fc closure #1 in SubscriptionView.Subscriber.updateValue() + 700 (SubscriptionView.swift:66)
14 SwiftUI             0x0000000195a68eb0 partial apply for thunk for @escaping @callee_guaranteed (@in_guaranteed A.Publisher.Output) -> () + 28 (:0)
15 SwiftUI             0x0000000195a68e78 closure #1 in ActionDispatcherSubscriber.respond(to:) + 76 (SubscriptionView.swift:98)
16 SwiftUI             0x0000000195a68c80 ActionDispatcherSubscriber.respond(to:) + 816 (SubscriptionView.swift:97)
17 SwiftUI             0x0000000195a68938 ActionDispatcherSubscriber.receive(:) + 16 (SubscriptionView.swift:110)
18 SwiftUI             0x0000000195a6786c SubscriptionLifetime.Connection.receive(:) + 100 (SubscriptionLifetime.swift:195)
19 Combine             0x000000019aed29d4 Publishers.Autoconnect.Inner.receive(:) + 52 (Autoconnect.swift:142)
20 Combine             0x000000019aed2928 Publishers.Multicast.Inner.receive(:) + 244 (Multicast.swift:211)
21 Combine             0x000000019aed2828 protocol witness for Subscriber.receive(_:) in conformance Publishers.Multicast<A, B>.Inner + 24 (:0)
....
   ... (FBSScene.m:812)
46 FrontBoardServices  0x00000001aa892844 __94-[FBSWorkspaceScenesClient _queue_updateScene:withSettings:diff:transitionContext:completion:]_block_invoke_2 + 152 (FBSWorkspaceScenesClient.m:692)
47 FrontBoardServices  0x00000001aa8926cc -[FBSWorkspace _calloutQueue_executeCalloutFromSource:withBlock:] + 168 (FBSWorkspace.m:411)
48 FrontBoardServices  0x00000001aa8977fc __94-[FBSWorkspaceScenesClient _queue_updateScene:withSettings:diff:transitionContext:completion:]_block_invoke + 344 (FBSWorkspaceScenesClient.m:691)
49 libdispatch.dylib   0x00000001999aedd4 _dispatch_client_callout + 20 (object.m:576)
50 libdispatch.dylib   0x00000001999b286c _dispatch_block_invoke_direct + 288 (queue.c:511)
51 FrontBoardServices  0x00000001aa893d58 FBSSERIALQUEUE_IS_CALLING_OUT_TO_A_BLOCK + 52 (FBSSerialQueue.m:285)
52 FrontBoardServices  0x00000001aa893cd8 -[FBSMainRunLoopSerialQueue _targetQueue_performNextIfPossible] + 240 (FBSSerialQueue.m:309)
53 FrontBoardServices  0x00000001aa893bb0 -[FBSMainRunLoopSerialQueue performNextFromRunLoopSource] + 28 (FBSSerialQueue.m:322)
54 CoreFoundation      0x0000000191adb834 CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION + 28 (CFRunLoop.c:1957)
55 CoreFoundation      0x0000000191adb7c8 __CFRunLoopDoSource0 + 176 (CFRunLoop.c:2001)
56 CoreFoundation      0x0000000191ad92f8 __CFRunLoopDoSources0 + 340 (CFRunLoop.c:2046)
57 CoreFoundation      0x0000000191ad8484 __CFRunLoopRun + 828 (CFRunLoop.c:2955)
58 CoreFoundation      0x0000000191ad7cd8 CFRunLoopRunSpecific + 608 (CFRunLoop.c:3420)
59 GraphicsServices    0x00000001d65251a8 GSEventRunModal + 164 (GSEvent.c:2196)
60 UIKitCore           0x0000000194111ae8 -[UIApplication run] + 888 (UIApplication.m:3713)
61 UIKitCore           0x00000001941c5d98 UIApplicationMain + 340 (UIApplication.m:5303)
62 SwiftUI             0x0000000195ccc294 closure #1 in KitRendererCommon(:) + 168 (UIKitApp.swift:51)
63 SwiftUI             0x0000000195c78860 runApp(:) + 152 (UIKitApp.swift:14)
64 SwiftUI             0x0000000195c8461c static App.main() + 132 (App.swift:114)
65 SoleFit             0x0000000103046cd4 static SoleFitApp.$main() + 24 (SoleFitApp.swift:0)
66 SoleFit             0x0000000103046cd4 main + 36
67 dyld                0x00000001b52af154 start + 2356 (dyldMain.cpp:1298)
```
Replies: 1 · Boosts: 0 · Views: 331 · Activity: Aug ’24
Dynamically loading various USDZ files
I have various USDZ files in my visionOS app. Loading the USDZ files works quite well; I only have problems with the positioning of the 3D model. For example, I have a USDZ file that is displayed directly above me. I can't move the model or perform any other actions on it, and if I sit on a chair or stand up again, the 3D model automatically moves with me. This is my source code for loading the USDZ files:

```swift
struct ImmersiveView: View {
    @State var modelName: String
    @State private var loadedModel = Entity()

    var body: some View {
        RealityView { content in
            if let usdModel = try? await Entity(named: modelName) {
                print("====> \(modelName) : \(usdModel) <====")
                let bounds = usdModel.visualBounds(relativeTo: nil).extents
                usdModel.scale = SIMD3<Float>(1.0, 1.0, 1.0)
                usdModel.position = SIMD3<Float>(0.0, 0.0, 0.0)
                usdModel.components.set(CollisionComponent(shapes: [.generateBox(size: bounds)]))
                usdModel.components.set(HoverEffectComponent())
                usdModel.components.set(InputTargetComponent())
                loadedModel = usdModel
                content.add(usdModel)
            }
        }
    }
}
```

I only want the 3D models from the USDZ files to be displayed, and later on, to be able to move them via gestures. Moving the models is step 2; first, I need to make sure the models are displayed correctly. What have I forgotten or done wrong?
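For reference, one thing worth trying (a hedged sketch; the offset values are hypothetical): pin the model to an explicit world-space anchor instead of adding it at the scene origin, so its placement is independent of where the user is when the space opens.

```swift
// Hypothetical placement: 1.5 m in front of and 1 m above the space's origin.
let anchor = AnchorEntity(world: [0, 1.0, -1.5])
anchor.addChild(usdModel)
content.add(anchor)
```

Also worth checking whether the USDZ itself contains an anchoring setup (for example, a head anchor) that would make it track the user.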
Replies: 2 · Boosts: 0 · Views: 572 · Activity: Oct ’24
Loading USDZ asset into Model3D causes visionOS 2.0 beta 5 to crash
We've recently discovered that our app crashes on startup on the latest visionOS 2.0 beta 5 (22N5297g) build. In fact, the entire field of view dims down and visionOS then restarts, showing the Apple logo. Interestingly, no app crash is reported by Xcode during debugging. After investigation, we have isolated the issue to a specific USDZ asset in our app. Loading it in a blank sample project also causes visionOS to reliably crash, or to become extremely unresponsive with rendering artifacts everywhere. This looks like a potentially serious issue: even if the asset is problematic, loading it should not crash the entire OS. We have filed feedback FB14756285, along with a demo project. Hopefully someone can take a look. Thanks!
Replies: 2 · Boosts: 1 · Views: 328 · Activity: Aug ’24
RealityView world tracking without camera feed?
Is it possible with iOS 18 to use RealityView with world tracking but without the camera feed as background? With content.camera = .worldTracking the background is always the camera feed, and with content.camera = .virtual the device's position and orientation don't affect the viewpoint. Is there a way to make a mixture of both? My use case is that my app "Encyclopedia GalacticAR" shows astronomical objects and a skybox (a huge sphere), like a VR view of planets, as you can see in the left image. Now that iOS 18 offers RealityView for iOS and iPadOS, I would like to make use of it, but I haven't found a way to display my skybox as the environment instead of the camera feed. I filed the suggestion FB14734105 but hope that somebody knows a workaround...
Replies: 4 · Boosts: 1 · Views: 658 · Activity: Aug ’24
How can I create a laser shader?
I am trying to make a shader that resembles a laser like this: I've been experimenting with a basic Fresnel shader to start, but the Fresnel shader has a problem at high viewing angles, where the top has a very different color than the rest of the capsule. This might work for a laser shader once inverted and fine-tuned. However, when viewed from the top, it doesn't look so good anymore. Ideally, the purple edge is always ONLY on the edge, and the rest of the surface is the light pink color, no matter the viewing angle. How can I accomplish this to create something that looks like a laser?
Replies: 0 · Boosts: 0 · Views: 410 · Activity: Aug ’24
Game Center breaks RealityView world tracking
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS / iPadOS 18 beta 5? I'm in the process of upgrading my app to make use of the much appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:

```swift
content.camera = .worldTracking
```

However, device position and orientation were ignored (the camera remained static) and there was no camera pass-through. Then I discovered that the issue disappeared when I removed this line:

```swift
GKLocalPlayer.local.authenticateHandler = { viewController, error in
    // ... some more code ...
}
```

So I filed FB14731139 and hope that it will be resolved before the release of iOS / iPadOS 18.
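For reference, one hedged experiment to narrow this down (purely a diagnostic, not a fix): defer Game Center authentication until after the RealityView has appeared and tracking has started, to see whether the timing of the handler installation is what breaks the session.

```swift
.onAppear {
    // Hypothetical delay; only to test whether installation timing matters.
    DispatchQueue.main.asyncAfter(deadline: .now() + 3) {
        GKLocalPlayer.local.authenticateHandler = { viewController, error in
            // ... same authentication handling as before ...
        }
    }
}
```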
Replies: 2 · Boosts: 1 · Views: 484 · Activity: Aug ’24
Collision Detection after Object Tracking
So I am tracking 2 objects in my scene and spawning a tiny arrow on each of the objects (this part is working as intended). Inside my scene I have added Collision Components and Physics Body Components to each of the arrows. I want to detect when a collision occurs between the 2 arrow entities. I have made the collision boxes big enough so they should definitely be overlapping; however, I am not able to detect when the collision occurs. This is the code that I use for the scene:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct DualObjectTrackingTest: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            if let immersiveContentEntity = try? await Entity(named: "SceneFind.usda", in: realityKitContentBundle) {
                content.add(immersiveContentEntity)
                print("Collision check started")
            }
        } update: { content in
            if let arrow = content.entities.first?.findEntity(named: "WhiteArrow") as? ModelEntity {
                let subscription = content.subscribe(to: CollisionEvents.Began.self, on: arrow) { collisionEvent in
                    print("Collision has occurred")
                }
            }
        }
    }
}
```

All I see in my console logs is "Collision check started", and then whenever I move the 2 objects really close to each other so as to overlap the collision boxes, I don't see any updates in the logs. Can anyone give me some further guidance/resources on this? Thanks again!
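One likely culprit, for reference: the subscription is assigned to a local constant inside the update: closure, so the token is discarded as soon as the closure returns, and update: may also run before the arrow exists. A hedged sketch that subscribes once in the make closure and keeps the token alive in the view's state:

```swift
RealityView { content in
    if let scene = try? await Entity(named: "SceneFind.usda", in: realityKitContentBundle) {
        content.add(scene)
        if let arrow = scene.findEntity(named: "WhiteArrow") {
            // Store the token in @State so the subscription survives.
            subscription = content.subscribe(to: CollisionEvents.Began.self, on: arrow) { event in
                print("Collision between \(event.entityA.name) and \(event.entityB.name)")
            }
        }
    }
}
```

It's also worth confirming that the arrows' collision filters allow them to collide with each other.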
Replies: 5 · Boosts: 0 · Views: 551 · Activity: Aug ’24
Button in the attachment not clickable after adding BillboardComponent
I created some attachments by following the Diorama Apple example, and things have been working fine. I then wanted to add a BillboardComponent to my attachments, so I added it this way:

```swift
guard let attachmentEntity = attachments.entity(for: component.attachmentTag) else { return }
guard attachmentEntity.parent == nil else { return }
var billBoard = BillboardComponent()
billBoard.rotationAxis = [0, 1, 0]
attachmentEntity.components.set(billBoard)
content.add(attachmentEntity)
attachmentEntity.setPosition([0.0, 0.5, 0.0], relativeTo: entity)
```

My attachment view is like this:

```swift
Text(name)
    .matchedGeometryEffect(id: "Name", in: animation)
    .font(titleFont)
Text(description)
    .font(descriptionFont)
Button("Done") {
    viewModel.arrows.remove(at: 0)
}
```

If I remove the BillboardComponent, the button click works fine, but with the BillboardComponent the button click doesn't work (it doesn't even highlight when I look at it) in certain directions. How can I resolve this issue?
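For reference, one hedged experiment (an assumption, not a documented fix): put the BillboardComponent on a separate parent entity and add the attachment as its child, so the attachment's own transform, which the hit-testing of its SwiftUI content presumably depends on, is not rewritten by the billboard every frame.

```swift
// Hypothetical pivot entity that carries the billboard behavior.
let billboardPivot = Entity()
billboardPivot.components.set(BillboardComponent())
billboardPivot.addChild(attachmentEntity)
content.add(billboardPivot)
billboardPivot.setPosition([0.0, 0.5, 0.0], relativeTo: entity)
```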
Replies: 1 · Boosts: 0 · Views: 309 · Activity: Aug ’24
Entity Coordinates in Object Tracking
A second post on the same topic, as I feel I may have overcomplicated the earlier one. Essentially, I am performing object tracking inside Reality Composer Pro and adding a digital entity to the tracked object. I now want to get the coordinates of this digital entity inside Xcode. Secondly, can I track more than one object inside the same scene? For example, if I want to find a spanner and a screwdriver amongst a bunch of tools laid out on a table, spawn an arrow on top of the spanner and the screwdriver, and then get the coordinates of the arrows that I spawn, how can I go about this?
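For the coordinates part, a minimal sketch, assuming arrowEntity is a reference to the spawned arrow: world-space values can be read straight off the entity.

```swift
// Position in world (scene) space; pass nil for world-relative values.
let worldPosition = arrowEntity.position(relativeTo: nil)
// Full 4x4 transform, if orientation and scale are needed too.
let worldTransform = arrowEntity.transformMatrix(relativeTo: nil)
print("Arrow at \(worldPosition)")
```

As for multiple objects: ARKit's object tracking on visionOS accepts an array of reference objects, so tracking a spanner and a screwdriver in the same session should be a matter of supplying both, with one anchor delivered per detected object.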
Replies: 3 · Boosts: 0 · Views: 421 · Activity: Aug ’24