RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Posts under the RealityKit tag

200 Posts

Tracked object coordinates in code
Hey, as a follow-up to my earlier posts about object tracking on visionOS 2: I'm doing some experimentation, and my use case requires me to track the coordinates of a digital entity that I attach to my reference object (positioned relative to that object). Can something like this be done? Right now, all I'm doing is putting my reference object in my scene and positioning the 3D content I want to show at the corresponding locations on the reference object; I then load the scene in a RealityView block in my SwiftUI code. What I want to know is whether I can also extract and use the coordinates of the digital entity I have placed (post object-tracking) and then drive some logic from them in code. For example: if the physical coordinates of the digital entity fall within a certain x, y, z range, trigger a function or bring up an alert message in a tile. Is something like this possible? If so, can you help me understand the different aspects of the problem, ideally with some sample or reference code? So far I've done most of the object-tracking work in Reality Composer Pro, but this task will require quite a bit of programming as well, and I'm a bit lost as to how to start. Thanks for any help that y'all can give me!
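A minimal sketch of one way to approach this, assuming hypothetical names "Scene" and "TrackedContent" for the Reality Composer Pro scene and the attached digital entity: subscribe to per-frame scene updates, read the entity's world-space position, and branch on a coordinate range.

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct TrackedObjectView: View {
    @State private var subscription: EventSubscription?
    @State private var showAlert = false

    var body: some View {
        RealityView { content in
            // Load the scene authored in Reality Composer Pro.
            guard let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
            content.add(scene)

            // "TrackedContent" is a placeholder for the digital entity
            // attached to the reference object.
            let tracked = scene.findEntity(named: "TrackedContent")

            // Check the entity's world-space position once per frame.
            subscription = content.subscribe(to: SceneEvents.Update.self) { _ in
                guard let tracked else { return }
                let p = tracked.position(relativeTo: nil)  // world-space coordinates
                // Example trigger: fire when the entity enters a region of interest.
                if (0.2...0.4).contains(p.x) && p.y > 1.0 {
                    showAlert = true
                }
            }
        }
        .alert("Entity entered the target region", isPresented: $showAlert) {
            Button("OK", role: .cancel) {}
        }
    }
}
```

The range check and alert are illustrative; position(relativeTo: nil) returns coordinates in scene space, so the thresholds need to match however the reference object is anchored.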
1 reply · 0 boosts · 305 views · Aug ’24
RealityKit ShaderGraphMaterial parameters in Reality Composer Pro
I have a custom material built with Shader Graph in Reality Composer Pro, and I am trying to rig up sliders to control the shader's parameter values. I can read the values from the Shader Graph without a problem, and I can even update them from the LLDB command line and read the new values back. But the changes are not reflected in the rendered graphics. Is there some sort of update() method that is required for changed parameter values to take effect? On a related note, I am trying to understand what the MaterialParameters.Handle property is, and why one would access a MaterialParameter via the handle rather than just the name.
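For what it's worth, ShaderGraphMaterial is a value type, so mutating a copy (for example from LLDB) has no visible effect until the material is written back into the entity's ModelComponent. A minimal sketch, assuming a hypothetical promoted parameter named "sliderValue":

```swift
import RealityKit

// ShaderGraphMaterial is a value type: changes to a copy are invisible
// until the copy is written back into the entity's ModelComponent.
func setSliderValue(_ value: Float, on entity: Entity) {
    guard var model = entity.components[ModelComponent.self],
          var material = model.materials.first as? ShaderGraphMaterial else { return }

    // "sliderValue" is a placeholder for the parameter name promoted in the graph.
    try? material.setParameter(name: "sliderValue", value: .float(value))

    // Write the mutated material back so the renderer picks up the change.
    model.materials = [material]
    entity.components.set(model)
}
```

As for MaterialParameters.Handle: a handle is a pre-resolved identifier for a parameter, so resolving it once and reusing it avoids a by-name string lookup on every update, which matters when a slider drives the parameter every frame.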
1 reply · 0 boosts · 497 views · Aug ’24
Timeline stops looping after enlarging the window
I have written code that triggers the Timeline in my Reality Composer Pro scene every 12.93 seconds:

    RealityView { … }
        .onAppear { startTimer() }
        .onDisappear { stopTimer() }

    func startTimer() {
        timer = Timer.scheduledTimer(withTimeInterval: 12.93, repeats: true) { _ in
            action()
        }
    }

    func stopTimer() {
        timer?.invalidate()
    }

    func action() {
        print("SunUpDown")
        NotificationCenter.default.post(
            name: NSNotification.Name("RealityKit.NotificationTrigger"),
            object: nil,
            userInfo: [
                "RealityKit.NotificationTrigger.Scene": scene as Any,
                "RealityKit.NotificationTrigger.Identifier": "SunUpDown"
            ]
        )
    }

On receiving the "SunUpDown" notification, the Timeline runs. Everything works while the scene is running, and the loop continues, until I enlarge the window, at which point it stops looping. Could you please explain this behavior? Note: the window is volumetric, and the defaultWorldScaling modifier is set to dynamic.
1 reply · 0 boosts · 410 views · Aug ’24
HoverState in RealityView
In a Reality Composer Pro environment, I want one of the entities to display a blue material when the user looks at it. To achieve this, I have added the relevant Shader Graphs to the materials associated with this entity, and I have added a HoverEffectComponent to the entity in the RealityView code:

    RealityView { content in
        if let model = try? await Entity(named: "WorldScene", in: realityKitContentBundle) {
            let hoverEffect = HoverEffectComponent(.shader(.default))
            model.components.set(hoverEffect)
            content.add(model)
        }
    }

However, when I hover over the entity, I see no visual reaction. Could you please provide guidance on how to resolve this issue?
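One thing worth checking, as a hedged sketch: hover effects only fire on entities the input system can target, which requires an InputTargetComponent and a CollisionComponent whose shape covers the model. The shape size below is a placeholder to fit to the actual model.

```swift
// Hover effects require the entity to be an input target with a collider.
let hoverTarget = model  // or a specific child, e.g. model.findEntity(named: "Highlight")
hoverTarget.components.set(InputTargetComponent())
hoverTarget.components.set(
    CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])])
)
hoverTarget.components.set(HoverEffectComponent(.shader(.default)))
```

Also note that setting the component on the root of a loaded scene applies hover to that root entity; attaching it to the entity that actually owns the material is usually more reliable.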
1 reply · 0 boosts · 471 views · Aug ’24
visionOS Move Bug
I was making a gesture to let the goose (a character) walk, but I ran into two problems.

1: I added collision and physics body components to the goose and the entities it collides with, but the collision shapes cannot completely block the goose's way. For example, with a tree in front of it, the goose is blocked at first, but if it moves a little faster it will pass through the tree or end up on top of it.

2: My knowledge here is still incomplete, and I'm struggling with the goose's movement on the z-axis. I want the user's back-and-forth drag (z-axis) to drive the movement, but so far I have only managed to map the up-and-down drag (y-axis). I hope you can give me some guidance on this line in particular:

    gooseOriginalPosition.z + Float(translation.height / 10000)

This is the complete code:

    @State var goose: Entity?
    @State var isDraggingGoose = false
    @State var gooseOriginalPosition = SIMD3<Float>(repeating: 0)

    RealityView { content in
        if let model = try? await Entity(named: "WorldScene", in: realityKitContentBundle) {
            content.add(model)
        }
        if let gooseEntity = try? await Entity(named: "Goose", in: realityKitContentBundle) {
            gooseEntity.scale = SIMD3<Float>(repeating: 0.3)
            content.add(gooseEntity)
            goose = gooseEntity
        }
    }
    .simultaneousGesture(DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            handleDrag(value)
        }
        .onEnded { _ in
            isDraggingGoose = false
            gooseTimer?.invalidate()
        })

    func handleDrag(_ value: EntityTargetValue<DragGesture.Value>) {
        guard let goose = goose else { return }
        if !isDraggingGoose {
            isDraggingGoose = true
            gooseOriginalPosition = goose.position(relativeTo: nil)
        }
        let translation = value.gestureValue.translation
        let newPosition = SIMD3<Float>(
            gooseOriginalPosition.x + Float(translation.width / 10000),
            gooseOriginalPosition.y,
            gooseOriginalPosition.z + Float(translation.height / 10000) // I want this to respond to a z-axis drag.
        )
        goose.setPosition(newPosition, relativeTo: nil)
    }
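On the second question, a hedged sketch of one approach: on visionOS, a DragGesture targeted to an entity exposes translation3D, which carries depth, and EntityTargetValue can convert it into scene space, giving an offset along all three axes instead of only the 2D screen plane. (On the first question: setting the position directly can step the goose past a thin collider in a single frame, so this kind of tunneling is expected; smaller per-frame steps or thicker colliders usually help.)

```swift
// A z-axis-aware drag handler, assuming the same @State variables as above.
func handleDrag3D(_ value: EntityTargetValue<DragGesture.Value>) {
    guard let goose else { return }
    if !isDraggingGoose {
        isDraggingGoose = true
        gooseOriginalPosition = goose.position(relativeTo: nil)
    }
    // Convert the gesture's 3D translation from SwiftUI space into scene space.
    let offset = value.convert(value.translation3D, from: .local, to: .scene)
    goose.setPosition(gooseOriginalPosition + offset, relativeTo: nil)
}
```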
0 replies · 0 boosts · 306 views · Aug ’24
USD animation support in RealityKit vs. macOS Preview
This question is about USD animations playing correctly in macOS Preview but not with RealityKit on visionOS. I have a USD file created with 3D Studio Max that contains a mesh-based smoke animation: https://drive.google.com/file/d/1L7Jophgvw0u0USSv-_0fPGuCuJtapmzo/view (5.6 MB). Apple's macOS 14.5 Preview app plays the animation correctly. However, when a visionOS app uses RealityKit to load that same USD file on visionOS 2.0 beta 4, built with Xcode 16.0 beta 3 (16A5202i), and Entity/playAnimation is called, the animation does not play as expected. The same app successfully plays an animation of a hierarchy of solid objects read from a different USD file. When I inspect the RealityKit entities loaded from the USD file, the ground-plane entity is a ModelEntity, as expected, but the smoke entity's type is Entity, with no associated geometry. Why can macOS Preview play the animation in this file when RealityKit cannot? Thank you for considering this question.
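Not an answer, but a diagnostic sketch that may help narrow this down: walking the loaded hierarchy and printing each entity's concrete type, geometry, and animation count shows what RealityKit actually imported from the USD (for example, whether the smoke node arrived with no ModelComponent and no animations).

```swift
import RealityKit

// Print each entity's type, whether it has model geometry, and how many
// animations RealityKit imported for it.
func dumpHierarchy(_ entity: Entity, indent: String = "") {
    let hasModel = entity.components[ModelComponent.self] != nil
    print("\(indent)\(entity.name) [\(type(of: entity))] model: \(hasModel), animations: \(entity.availableAnimations.count)")
    for child in entity.children {
        dumpHierarchy(child, indent: indent + "  ")
    }
}
```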
1 reply · 0 boosts · 688 views · Jul ’24
Physics body components within a hierarchy behave strangely
OK, I am loading an object from a Reality Composer Pro scene whose hierarchy contains two entities that each have a Physics Body and a Collision component:

    Root
      Outer Box Mesh
      Hinge: physics (static/kinematic) + collision
      Door: physics (dynamic) + collision

I tried to keep the physics/collision components only on the hinge and the door while moving the root or the outer box around in code. What I see is that it either moves the hinge and the door relative to the top level (despite the movement locking I enabled), or it starts rotating the root or outer box even though I only set its position. What is the correct setup in this case? What I want is to move the whole object around, settle it somewhere, and still have the door pinned at a fixed relative position with one degree of freedom around the hinge axis. I know how to do this in code, but I really want to use the built-in Reality Composer Pro settings/components. I am using the latest beta 4.
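A hedged workaround sketch: dynamic bodies are simulated in world space, so moving an ancestor while the simulation owns a descendant's pose tends to fight the solver. One option, with "Door" as a placeholder entity name, is to flip the door to kinematic while the parent is in motion and hand it back to the simulation once settled:

```swift
import RealityKit

// Make the door follow the hierarchy while the root is being moved,
// then return it to the physics simulation afterwards.
func setDoorFollowsHierarchy(_ follows: Bool, root: Entity) {
    guard let door = root.findEntity(named: "Door"),
          var body = door.components[PhysicsBodyComponent.self] else { return }
    body.mode = follows ? .kinematic : .dynamic
    door.components.set(body)  // PhysicsBodyComponent is a value type; write it back
}
```

Call setDoorFollowsHierarchy(true, root: root) before moving the object and setDoorFollowsHierarchy(false, root: root) once it has settled.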
0 replies · 0 boosts · 289 views · Jul ’24
DragGesture for RealityView
I used this gesture on a RealityView:

    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            print("DragGesture")
            self.dragOffset = value.translation
            self.startTimer()
        }
        .onEnded { _ in
            self.dragOffset = .zero
            self.direction = "None"
            self.stopTimer()
        }

However, due to the special nature of RealityView, the gesture is not detected normally. I think some modifiers should be added after value.translation, but I don't know which ones. Can you suggest some? Thank you.
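A hedged checklist sketch: gestures built with targetedToAnyEntity() hit-test against collision shapes, and only entities carrying an InputTargetComponent receive input, so the usual first fix is to attach both components to the entity (or an ancestor) when loading it. The entity name and shape size below are placeholders.

```swift
// Targeted gestures need a collider to hit and an input target to deliver to.
if let model = try? await Entity(named: "WorldScene", in: realityKitContentBundle) {
    model.components.set(InputTargetComponent())
    model.components.set(
        CollisionComponent(shapes: [.generateBox(size: [0.5, 0.5, 0.5])])
    )
    content.add(model)
}
```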
1 reply · 0 boosts · 419 views · Jul ’24
visionOS animation on USDZ
Hello all, I'm developing an application for visionOS and I'm trying to implement two different animations:

First animation: initially, I have a map that should not be visible. I would like an effect where a drop of water appears to fall at the center of the map and the expanding ripples gradually reveal the entire map. Is there a way to do this directly in SwiftUI, or do I need an animation in my USDZ?

Second animation: I want an effect similar to a cinema screen opening from the center, gradually revealing a video that was initially hidden. Is there a way to do this directly in SwiftUI?

Can someone help me with this topic? Thanks ;)
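For the second effect, a minimal SwiftUI-only sketch: mask the hidden content with a rectangle that scales out from the center. The view name and duration are illustrative; the ripple effect would more likely need a circular mask or a custom shader.

```swift
import SwiftUI

// Reveals its content like a screen opening from the center,
// by animating a mask from near-zero width to full width.
struct CurtainReveal<Content: View>: View {
    @ViewBuilder var content: Content
    @State private var revealed = false

    var body: some View {
        content
            .mask {
                Rectangle()
                    .scaleEffect(x: revealed ? 1.0 : 0.001, anchor: .center)
            }
            .onAppear {
                withAnimation(.easeInOut(duration: 2.0)) {
                    revealed = true
                }
            }
    }
}

// Usage, e.g. with AVKit: CurtainReveal { VideoPlayer(player: player) }
```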
2 replies · 1 boost · 371 views · Jul ’24
Objects do not behave properly with indirect pinch in a Mixed Reality Environment
In Mixed Reality mode there are strange issues with indirect pinches on objects. If a user selects an object with an indirect pinch and then walks around, or moves and re-orients their body while maintaining the pinch, the object moves as if some scalar is being applied to it, and it behaves in ways that are extremely counter-intuitive compared to other MR devices. If a user indirect-pinches an object and then walks forward, the object flies away from the user, faster than they are walking. If a user indirect-pinches an object and then walks backward, the object flies toward and eventually past the user, faster than they are walking. If a user indirect-pinches an object and then turns around, the object rotates around some unknown position with some added scalar, resulting in very strange behavior. Here are some examples of the issue in action. The first video uses Unity's PolySpatial SDK. The second uses an entirely native stack of SwiftUI and RealityKit, with no Unity at all. For some reason I am not allowed to embed videos here from Drive or Gyazo, so I am including the links in plain text for now. If someone could tell me how to upload video examples directly to these forums, I would appreciate it.

First video, showing the issue in Unity with the PolySpatial SDK: https://i.gyazo.com/95788cf9d4587c167b544db031fbf412.mp4

Second video, showing the issue in a native-only stack with RealityKit and SwiftUI: https://drive.google.com/file/d/1mgt8TXJiopbm6qdJw2rFG0geam0irnMn/view?usp=sharing

Unity forum discussion which, after investigation, confirmed this issue is on the native platform: https://discussions.unity.com/t/objects-do-not-behave-properly-when-manipulated-in-an-mr-space/1482439

For a mixed reality environment, where a user may want to move around their space while using indirect pinches to manipulate and "carry" objects with them, this is a big issue. Thank you.
4 replies · 0 boosts · 540 views · Jul ’24
VideoMaterial to display SBS Stereoscopic 3D video? [VisionOS]
Hi, I love the VideoMaterial API and the power it gives to play video on any mesh. But I am trying to play a side-by-side 3D video using VideoMaterial:

    RealityView { content in
        // Generate a plane mesh for the video.
        let mesh = MeshResource.generatePlane(width: 300.0, height: 300.0, cornerRadius: 0)
        let vidMaterial = VideoMaterial(avPlayer: AVPlayer(url: URL(string: "https://someurl/test/master.m3u8")!))
        vidMaterial.controller.preferredViewingMode = .stereo // <-- no idea why it doesn't work for SBS video in the simulator
        vidMaterial.avPlayer?.play()
        let planeEntity = Entity()
        planeEntity.components.set(ModelComponent(mesh: mesh, materials: [vidMaterial]))
        content.add(planeEntity)
    }

This code works well for plain 2D video playback, but how do I display a side-by-side or top-bottom 3D video? I found GeometrySwitchCameraIndex for a custom ShaderGraphMaterial, but if I use an image texture as the input node, how do I pass the video frames as a texture into my custom shader to achieve the 3D effect? Or maybe there is an even better way to deal with this. There is also the .preferredViewingMode property on VideoMaterial's controller that can be set to .stereo, but it doesn't give any stereo effect; perhaps it's only for MV-HEVC media playback.
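A partial, hedged sketch of the shader-graph route (not a complete SBS solution): a custom material authored in Reality Composer Pro can be loaded by its node path and applied to the plane in place of the VideoMaterial. The material path and file name below are hypothetical, and routing live video frames into the graph still needs a texture source (for example a TextureResource.DrawableQueue fed from the player's output), which this sketch does not cover.

```swift
// Load a custom shader graph material authored in Reality Composer Pro
// and apply it to the plane. Paths and names are placeholders.
if let sbsMaterial = try? await ShaderGraphMaterial(
    named: "/Root/SBSMaterial",
    from: "Materials/SBSMaterial.usda",
    in: realityKitContentBundle
) {
    planeEntity.components.set(ModelComponent(mesh: mesh, materials: [sbsMaterial]))
}
```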
1 reply · 0 boosts · 667 views · Jul ’24
How to Use RealityKitContent Package for App Targets Lower Than iOS 18.0
I am trying to display a 3D model in an iOS app using RealityView. The same 3D model displays successfully in the visionOS app. Everything works perfectly only when I set my project's minimum deployment target to iOS 18.0. However, my app's minimum deployment target is iOS 15.0. When I use the RealityKitContent package to load the 3D model, it fails to compile with the following error:

    Compiling for iOS 15.0, but module 'RealityKitContent' has a minimum deployment target of iOS 18.0: /Users/Library/Developer/Xcode/DerivedData/RealityViewForiOS-cbfkgimsqngtuegqwvezusvscllf/Index.noindex/Build/Products/Debug-iphonesimulator/RealityKitContent.swiftmodule/arm64-apple-ios-simulator.swiftmodule

I have made the RealityKitContent package optional and tried importing it conditionally:

    #if canImport(RealityKitContent)
    import RealityKitContent
    #endif

Despite this, it still fails to compile with the same error. I have not found a workaround for using the RealityKitContent package with app targets lower than iOS 18.0. Here is my package definition:

    let package = Package(
        name: "RealityKitContent",
        platforms: [
            .visionOS(.v1),
            .macOS(.v15),
            .iOS(.v18)
        ],
        products: [
            .library(
                name: "RealityKitContent",
                targets: ["RealityKitContent"]),
        ],
        dependencies: [],
        targets: [
            .target(
                name: "RealityKitContent",
                dependencies: []),
        ]
    )

Here is the code I am using to load the 3D model with RealityView using the RealityKitContent package:

    import SwiftUI
    import RealityKit
    #if canImport(RealityKitContent)
    import RealityKitContent
    #endif

    struct ContentView: View {
        var body: some View {
            VStack {
                if #available(iOS 18.0, *) {
                    RealityView { content in
                        if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                            content.add(scene)
                        }
                    } update: { content in
                        if let scene = content.entities.first {
                            let uniformScale: Float = 3.0
                            scene.transform.scale = [uniformScale, uniformScale, uniformScale]
                        }
                    }
                } else {
                    // Fallback for earlier versions
                }
            }
        }
    }

    #Preview {
        ContentView()
    }

Any help or guidance on how to use the RealityKitContent package for app targets lower than iOS 18.0 would be greatly appreciated.
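One thing that may be worth trying, offered as a hedged sketch: the error comes from the package's own platforms declaration, not from the app code, so canImport and #available cannot fix it at link time. Lowering the package's iOS platform (assuming the generated RealityKitContent sources compile for that target) typically clears the deployment-target mismatch; iOS 18-only APIs such as RealityView then still need #available guards in app code, as above.

```swift
// swift-tools-version:5.9
import PackageDescription

// Lowered iOS platform; the rest mirrors the original manifest.
let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS(.v1),
        .macOS(.v15),
        .iOS(.v15)  // lowered from .v18
    ],
    products: [
        .library(
            name: "RealityKitContent",
            targets: ["RealityKitContent"]),
    ],
    targets: [
        .target(
            name: "RealityKitContent"),
    ]
)
```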
0 replies · 2 boosts · 426 views · Jul ’24
How to add gestures to objects inside other objects
I have a scene with multiple RealityKit entities. There is a blue cube that I want to rotate along with all of its children (it's partly transparent). Inside the cube are a number of child entities (red) that I want to tap. The cube and the red objects all have collision components, as required for gestures to work. But if I want to rotate the blue cube and also tap the red objects, I can't: the blue cube's collision component intercepts the taps. Is there a way of accomplishing what I want? I'm targeting visionOS 2, and my scene is in a volume.
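A hedged workaround sketch: input hit-testing stops at the first collider along the ray, so one option is a mode toggle that removes the cube's collision component while tapping the children and restores it for rotation. Combined with a marker component on the red entities, taps can then be scoped to the children only. All names below are placeholders.

```swift
import RealityKit

// Marker for the tappable (red) child entities.
// Remember to call RedMarkerComponent.registerComponent() once at app startup.
struct RedMarkerComponent: Component {}

// Temporarily strip the cube's collider so taps can reach its children,
// then restore it when the user wants to rotate the cube again.
struct CubeInputMode {
    private var savedCollision: CollisionComponent?

    mutating func enableTapMode(_ tapMode: Bool, cube: Entity) {
        if tapMode {
            savedCollision = cube.components[CollisionComponent.self]
            cube.components.remove(CollisionComponent.self)
        } else if let saved = savedCollision {
            cube.components.set(saved)
        }
    }
}
```

The tap gesture can then be scoped with something like SpatialTapGesture().targetedToEntity(where: .has(RedMarkerComponent.self)) so it never resolves on the cube itself.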
1 reply · 0 boosts · 442 views · Aug ’24