RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Post · Replies · Boosts · Views · Activity

USDZ in Keynote
Apple's own USDZ files from their website do not display properly in either Preview or Keynote when imported. They display properly, however, in Reality Converter, albeit with this error: "Invalid USD shader node in USD file. Shader nodes must have “id” as the implementationSource, with id values that begin with “Usd”. Also, shader inputs with connections must each have a single, valid connection source." I tried importing other models from external sources and they work without any issue at all. Is there any potential fix or workaround for this? Thanks in advance.
1 · 0 · 315 · Sep ’24
zNear for RealityView to prevent an entity from hiding an attachment SwiftUI button in visionOS
I have an app with an Immersive Space view, and it needs a button at the bottom with a fixed place in front of the user's head, like a dashboard in a game. But when the user gets too close to any 3D object in the view, the object can cover the button and make it inaccessible, which would likely prevent the app from being approved in App Store Connect. I was working before with SceneKit, where the camera has zNear and zFar values that decide when to hide a 3D model if it comes too close or gets too far, and I wonder if there is something like that in RealityView / RealityKit 4. Here is my code; the screenshots follow.

import SwiftUI
import RealityKit

struct ContentView: View {
    @State var myHead: Entity = {
        let headAnchor = AnchorEntity(.head)
        headAnchor.position = [-0.02, -0.023, -0.24]
        return headAnchor
    }()
    @State var clicked = false

    var body: some View {
        RealityView { content, attachments in
            // create a 3d box
            let mainBox = ModelEntity(mesh: .generateBox(size: [0.1, 0.1, 0.1]))
            mainBox.position = [0, 1.6, -0.3]
            content.add(mainBox)
            content.add(myHead)
            guard let attachmentEntity = attachments.entity(for: "Dashboard") else { return }
            myHead.addChild(attachmentEntity)
        } attachments: {
            // SwiftUI inside the immersive view
            Attachment(id: "Dashboard") {
                VStack {
                    Spacer()
                        .frame(height: 300)
                    Button(action: {
                        goClicked()
                    }) {
                        Text(clicked ? "⏸️" : "▶️")
                            .frame(maxWidth: 48, maxHeight: 48, alignment: .center)
                            .font(.extraLargeTitle)
                    }
                    .buttonStyle(.plain)
                }
            }
        }
    }

    func goClicked() {
        clicked.toggle()
    }
}
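For reference (not a RealityKit answer), a minimal sketch of the SceneKit near/far clipping the poster describes; geometry closer to the camera than zNear is simply not rendered:

import SceneKit

let cameraNode = SCNNode()
cameraNode.camera = SCNCamera()
cameraNode.camera?.zNear = 0.3   // metres; anything nearer than this is clipped
cameraNode.camera?.zFar = 100.0  // anything farther than this is clipped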
2 · 0 · 574 · Sep ’24
Exception while creating a PhotogrammetrySession object with PhotogrammetrySamples
Hi, I am trying to use a PhotogrammetrySample input sequence according to the WWDC video. I modified only the following code in the sample:

var images = try FileManager.default.contentsOfDirectory(
    at: captureFolderManager.imagesFolder,
    includingPropertiesForKeys: nil,
    options: .skipsHiddenFiles)

let inputSequence = images.lazy.compactMap { file in
    return self.loadSampleAndMask(file: file)
}

// Next line causes the exception
photogrammetrySession = try PhotogrammetrySession(
    input: inputSequence,
    configuration: configuration)

private func loadSampleAndMask(file: URL) -> PhotogrammetrySample? {
    do {
        var sample = try PhotogrammetrySample(contentsOf: file)
        return sample
    } catch {
        return nil
    }
}

I am getting the following runtime error: Thread 1: EXC_BREAKPOINT (code=1, subcode=0x240d88904)
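As a point of comparison (not an answer from the thread), the folder-based initializer from the Object Capture documentation skips the hand-built sample sequence entirely; a minimal sketch with a placeholder path, useful to check whether the exception is specific to the PhotogrammetrySample path:

import Foundation
import RealityKit

// Hypothetical sanity check: point the session at the image folder directly,
// bypassing the lazy PhotogrammetrySample sequence.
let imagesFolder = URL(fileURLWithPath: "/path/to/Images", isDirectory: true)  // placeholder
let configuration = PhotogrammetrySession.Configuration()
let session = try PhotogrammetrySession(input: imagesFolder, configuration: configuration)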
0 · 0 · 283 · Sep ’24
sceneUnderstanding occlusion does not work with blendMode = alpha in iOS 18
If I import a USDZ model with blendMode set to alpha, occlusion does not work on iPhone with iOS 18. How should transparent materials and occlusion be properly used in the new RealityKit? Additionally, new artifacts have appeared when transparent objects overlap each other: the transparency results do not blend; instead, parts of the model simply do not render.
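For context, a minimal sketch of how scene-understanding occlusion is usually enabled in an ARView on a LiDAR device (this is the standard setup, not a fix for the iOS 18 alpha-blending issue described above):

import ARKit
import RealityKit

// Assumed setup on a LiDAR-equipped iPhone or iPad.
let arView = ARView(frame: .zero)

let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
}
arView.session.run(config)

// Let the reconstructed scene mesh occlude virtual content.
arView.environment.sceneUnderstanding.options.insert(.occlusion)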
6 · 0 · 656 · Sep ’24
Getting vector3 or vector4 per-vertex data into Shader Graph Editor
Hi, I'm trying to understand how I can get 3- or 4-channel per-vertex data into the Shader Graph Editor. From my tests, it seems that:
• the "Geometric Property" node does not give access to 4-channel data,
• "Geometric Property (vector3)" does not give me access to custom properties besides the ones defined in the MaterialX core,
• the "Texture Coordinates" node has a vector4f mode (yay!), but according to the MaterialX spec, texcoords must have 2 or 3 channels, and I can't get 4-channel data to show up there either.
My assumption so far is that I must be missing some "magic" – for example, do the primvars in a file have to be in a specific order, independent of their names? Or do their names matter? (E.g., the convention would be primvars:st, primvars:st1, and so on.) Unfortunately the forum doesn't allow me to attach any USDZ or ZIP files or GDrive links; if there's a way to share a test file I'm happy to do so!
0 · 0 · 329 · Sep ’24
visionOS crashes loading entities from disk bundles with EXC_BREAKPOINT
Summary: I’m working on a visionOS project where I need to dynamically load a .bundle file containing RealityKit content from the app’s Application Support directory. The .bundle is saved to disk after being downloaded or retrieved as an On-Demand Resource (ODR). Sample project with the issue: GitHub repo. Run the target test-odr to use the local bundle and reproduce the crash.
Overall problem:
• Setup: add a .bundle named RealityKitContent_RealityKitContent.bundle to the app’s resources. This bundle contains a Reality file with two USDA scenes: “Immersive” and “Scene”.
• Save to disk: save the bundle to the Application Support directory, ensuring that the file is correctly copied and saved.
• Load the bundle: load the bundle from the saved URL using Bundle(url: bundleURL) to initialize the Bundle object.
• Load entity from bundle: load a specific entity (“Scene”) from the bundle. When trying to load the entity using let storedEntity = try await Entity(named: "Scene", in: bundle), the app crashes with an EXC_BREAKPOINT error.
contentsOf method issue: if I use the Entity.load(contentsOf: realityFileURL, withName: entityName) method, it always loads the first root entity found (in this case, “Immersive”) rather than “Scene”, even when specifying the entity name. This is why I want to use the Bundle to load entities by name more precisely.
Issue: the crash consistently occurs on the Entity(named: "Scene", in: bundle) line. I have verified that the bundle exists, is accessible at the specified path, and contains the expected .reality file with multiple entities (“Immersive” and “Scene”). The error code I get is EXC_BREAKPOINT (code=1, subcode=0x1d135d4d0).
What I’ve tried:
• Ensured the bundle is properly saved and accessible.
• Checked that the bundle is initialized correctly from the URL.
• Tested loading the entity using the contentsOf method, which works fine but always loads the “Immersive” entity, ignoring the specified name. Hence, I want to use the Bundle-based approach to load multiple USDA entities selectively.
Question: has anyone faced a similar issue or knows why loading entities using Entity(named:in:) from a disk-based bundle causes this crash? Any advice on how to debug or resolve this, especially for managing multiple root entities in a .reality file, would be greatly appreciated.
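A condensed sketch of the loading flow described above; the bundle name and entity name come from the post, and this reproduces the setup rather than resolving the crash:

import Foundation
import RealityKit

func loadSceneFromDiskBundle() async throws -> Entity {
    // Assumed: the .bundle was previously copied into Application Support.
    let supportDir = try FileManager.default.url(
        for: .applicationSupportDirectory,
        in: .userDomainMask,
        appropriateFor: nil,
        create: true)
    let bundleURL = supportDir.appendingPathComponent("RealityKitContent_RealityKitContent.bundle")

    guard let bundle = Bundle(url: bundleURL) else {
        throw CocoaError(.fileNoSuchFile)
    }
    // Reported above to crash with EXC_BREAKPOINT on this line.
    return try await Entity(named: "Scene", in: bundle)
}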
1 · 0 · 425 · Sep ’24
Can an SDF be rendered using RealityKit?
I'm trying to ray-march an SDF inside a RealityKit surface shader. For the SDF primitive to correctly render with other primitives, the depth of the fragment needs to be set according to the ray-surface intersection point. Is there a way to do that within a RealityKit surface shader? It seems the only values I can set are within surface::surface_properties. If not, can an SDF still be rendered in RealityKit using ray-marching?
1 · 1 · 469 · Sep ’24
RealityKit 3D texture
In my Metal-based app, I ray-march a 3D texture. I'd like to use RealityKit instead of my own code. I see there is a LowLevelTexture (beta) where I could specify a 3D texture. However, on the Metal side, there doesn't seem to be any way to access a 3D texture (realitykit::texture::textures::custom returns a texture2d). Any workarounds? Could I even do something icky like cast the texture2d to a texture3d in MSL? (Is that even possible?) Could I encode the 3D texture into an argument buffer and get that in somehow?
1 · 0 · 442 · Aug ’24
Pink Screen with VideoMaterial in ARKit
Hi everyone, I'm developing an ARKit app using RealityKit and encountering an issue where a video displayed on a 3D plane shows up as a pink screen instead of the actual video content. Here's a simplified version of my setup:

func createVideoScreen(video: AVPlayerItem, canvasWidth: Float, canvasHeight: Float, aspectRatio: Float, fitsWidth: Bool = true) -> ModelEntity {
    let width = (fitsWidth) ? canvasWidth : canvasHeight * aspectRatio
    let height = (fitsWidth) ? canvasWidth * (1/aspectRatio) : canvasHeight
    let screenPlane = MeshResource.generatePlane(width: width, depth: height)
    let videoMaterial: Material = createVideoMaterial(videoItem: video)
    let videoScreenModel = ModelEntity(mesh: screenPlane, materials: [videoMaterial])
    return videoScreenModel
}

func createVideoMaterial(videoItem: AVPlayerItem) -> VideoMaterial {
    let player = AVPlayer(playerItem: videoItem)
    let videoMaterial = VideoMaterial(avPlayer: player)
    player.play()
    return videoMaterial
}

Despite following the standard process, the video plane renders pink. Has anyone encountered this before, or does anyone know what might be causing it? Thanks in advance!
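Not an answer from the thread, but a pink frame often indicates the player item never became ready to play; a small diagnostic sketch, assuming the same videoItem and player, that defers play() until the item reports readyToPlay:

import AVFoundation
import Combine

// Hypothetical diagnostic: confirm the AVPlayerItem actually loads before playing.
var cancellables = Set<AnyCancellable>()

func observeAndPlay(_ videoItem: AVPlayerItem, player: AVPlayer) {
    videoItem.publisher(for: \.status)
        .sink { status in
            switch status {
            case .readyToPlay:
                player.play()  // start playback only once the item is ready
            case .failed:
                print("Item failed:", videoItem.error ?? "unknown error")
            default:
                break
            }
        }
        .store(in: &cancellables)
}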
4 · 0 · 463 · Aug ’24
The elements in the attachment cannot add translation.
UI:

Attachment(id: "tooptip") {
    if isRecording {
        TooltipView {
            HStack(spacing: 8) {
                Image(systemName: "waveform")
                    .font(.title)
                    .frame(minWidth: 100)
            }
        }
        .transition(.opacity.combined(with: .scale))
    }
}

Trigger:

Button("Toggle") {
    withAnimation {
        isRecording.toggle()
    }
}

The above code does not show the animation effect when running. When I use isRecording to drive an element in a regular SwiftUI view, there is an animation effect.
0 · 0 · 306 · Aug ’24
Taking snapshots from WKWebView on visionOS
Hi, I'm trying to capture some images from WKWebView on visionOS. I know that there's a function 'takeSnapshot()' that can get an image of the web page. But since 'drawHierarchy()' may not work properly on WKWebView because of GPU-rendered content, is there any other method I can call to capture images correctly? Furthermore, as I put my web view into an immersive space, is there any way I can get the texture of this UIView attachment? Thank you
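For reference, a minimal sketch of the WKWebView snapshot API the poster mentions; unlike drawHierarchy(in:afterScreenUpdates:), it renders the web content itself, including GPU-composited parts:

import WebKit
import UIKit

func snapshot(of webView: WKWebView, completion: @escaping (UIImage?) -> Void) {
    let config = WKSnapshotConfiguration()
    config.rect = webView.bounds  // capture the full visible web view
    webView.takeSnapshot(with: config) { image, error in
        if let error { print("Snapshot failed:", error) }
        completion(image)
    }
}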
0 · 0 · 289 · Aug ’24
ModelEntity with skeletal pose is missing pins
Starting with Xcode beta 4+, any ModelEntity I load from USDZ that contains a skeletal pose has no pins. The pins used to be accessible from a ModelEntity so you could use them for alignment with other pins. Per the documentation, any ModelEntity with a skeletal pose should have pins that are automatically generated and contained in the entity.pins object itself. https://developer.apple.com/documentation/RealityKit/Entity/pins Is this a bug with the later Xcode betas, or is the documentation wrong?
0 · 0 · 268 · Aug ’24
Rounding errors in last row of entity transform
I'm trying to clone an entity that's somewhere deeper in the hierarchy, and I want it together with a transform that takes its parents into account. Initially I made something that would walk back through the parents, get their transforms, and then reduce them to a single one. Then I realized that what I'm doing is the same as .transformMatrix(relativeTo: rootEntity), but to validate that my version gives the same results I started to print both, and I noticed that for some reason the last row, instead of a stable (0, 0, 0, 1), is sometimes (0, 0, 0, 0.9999...). I know that there are rounding errors, but I'd assume that 0 and 1 are "magical" in the floating-point world. The only way I can try to explain it is that .transformMatrix uses some fancy accelerated matrix multiplication that produces somewhat bigger rounding errors. That would explain the slight differences in the other fields between my version and the function call, but still – the 1 seems weird. Here's the function I'm using to compare:

func cloneFlattened(entity: Entity, withChildren recursive: Bool) -> Entity {
    let clone = entity.clone(recursive: recursive)
    var transforms = [entity.transform.matrix]
    var parent: Entity? = entity.parent
    var rootEntity: Entity = entity
    while parent != nil {
        rootEntity = parent!
        transforms.append(parent!.transform.matrix)
        parent = parent!.parent
    }
    if transforms.count > 1 {
        clone.transform.matrix = transforms.reversed().reduce(simd_diagonal_matrix(simd_float4(repeating: 1)), *)
        print("QWE CLONE FLATTENED: \(clone.transform.matrix)")
        print("QWE CLONE RELATIVE : \(entity.transformMatrix(relativeTo: rootEntity))")
    } else {
        print("QWE CLONE SINGLE : \(clone.transform.matrix)")
    }
    return clone
}

Sometimes the last component is not 1:

QWE CLONE FLATTENED: [
  [0.00042261832, 0.0009063079, 0.0, 0.0],
  [-0.0009063079, 0.00042261832, 0.0, 0.0],
  [0.0, 0.0, 0.0010000002, 0.0],
  [-0.0013045187, -0.009559666, -0.04027118, 1.0]
]
QWE CLONE RELATIVE : [
  [0.00042261826, 0.0009063076, -4.681872e-12, 0.0],
  [-0.0009063076, 0.00042261826, 3.580335e-12, 0.0],
  [3.4256328e-12, 1.8047965e-13, 0.0009999998, 0.0],
  [-0.0013045263, -0.009559661, -0.040271178, 0.9999997]
]

Sometimes it is:

QWE CLONE FLATTENED: [
  [0.0009980977, -6.1623556e-05, -1.7382005e-06, 0.0],
  [-6.136851e-05, -0.0009958588, 6.707259e-05, 0.0],
  [-5.8642554e-06, -6.683835e-05, -0.0009977464, 0.0],
  [-1.761913e-06, -0.002, 0.0, 1.0]
]
QWE CLONE RELATIVE : [
  [0.0009980979, -6.1623556e-05, -1.7382023e-06, 0.0],
  [-6.136855e-05, -0.0009958589, 6.707254e-05, 0.0],
  [-5.864262e-06, -6.6838256e-05, -0.0009977465, 0.0],
  [-1.758337e-06, -0.0019999966, -3.7252903e-09, 1.0]
]

The 0s in the last row seem to be stable. It happens both for entities that are a few levels deep and for those that have only an anchor as a parent. So far I've never seen any value that would not be "technically a 1", but my hierarchies are not very deep, and it makes me wonder whether this rounding could get worse. Or is it just me doing something stupid? :)
0 · 0 · 290 · Aug ’24
Loading a lighting resource for a RealityKit view on macOS - image not embedded in app bundle
I'm trying to load up a virtual skybox, different from the built-in default, for a simple macOS rendering of RealityKit content. I was following the detail at https://developer.apple.com/documentation/realitykit/environmentresource, and created a folder called "light.skybox" with a single file in it ("prairie.hdr"), and then I'm trying to load that and set it as the environment on the ARView when it's created:

let ar = ARView(frame: .zero)
do {
    let resource = try EnvironmentResource.load(named: "prairie")
    ar.environment.lighting.resource = resource
} catch {
    print("Unable to load resource: \(error)")
}

The loading always fails when I launch the sample app, reporting "Unable to load resource ...", and when I look in the app bundle, the resource that's included there as Contents/Resources/light.realityenv is an entirely different size – appearing to be the default lighting. I've tried making the folder "light.skybox" explicitly target the app bundle for inclusion, but I don't see it get embedded whether I toggle that setting or leave the default. Is there anything I need to do to get Xcode to process and include the lighting I'm providing? (This is inspired by https://stackoverflow.com/questions/77332150/realitykit-how-to-disable-default-lighting-in-nonar-arview, which shows an example for UIKit.)
4 · 0 · 529 · Aug ’24
Vision Pinch 3D Object
Hello guys, I'm looking for how to look at a 3D object around me, pinch it (from afar, with eyes and two fingers, not by touching it) and display a little piece of information on top of it (like a small text label). If you have the solution you will make me happy (and a bit less stupid too :) ) Cheers, Mathis
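A minimal sketch of the usual visionOS pattern for this (not from the thread): the entity needs an InputTargetComponent plus collision shapes, and a SwiftUI tap gesture targeted to entities then fires on the indirect look-and-pinch; the box entity and the label idea are placeholders:

import SwiftUI
import RealityKit

struct PinchableView: View {
    var body: some View {
        RealityView { content in
            // Hypothetical example entity; any ModelEntity works.
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            box.position = [0, 1.5, -1]
            // Required so the look-and-pinch gesture can target the entity.
            box.components.set(InputTargetComponent())
            box.generateCollisionShapes(recursive: true)
            content.add(box)
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    print("Pinched:", value.entity.name)
                    // Show the info label here, e.g. via a RealityView attachment.
                }
        )
    }
}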
8 · 1 · 575 · Aug ’24
Reality Composer Pro 2.0 shader graphs can't be loaded on visionOS 1
Using Reality Composer Pro 2.0, I created a simple shader graph that displays a texture on an unlit surface: On visionOS 2 beta, I can successfully use ShaderGraphMaterial(named:from:in:) to load that shader graph material and assign it to a model entity. However, on visionOS 1.2 and earlier, either in Simulator or on the device, ShaderGraphMaterial(named:from:in:) fails and I see the following logged to the console: If, using Reality Composer Pro 1.0, I experimentally open the same project and delete and recreate exactly the same nodes above, then ShaderGraphMaterial(named:from:in:) works as expected on visionOS 1.2. Is it a known issue that Reality Composer 2 can't be used with visionOS 1? Is this intentional behavior? I've submitted feedback as FB14828873, including a sample project and repro steps. If possible, I would appreciate guidance from an Apple engineer, like "This is a known issue for [list of node types]" or "Reality Composer Pro 2 is not supported for visionOS 1 development, please refer to [documentation]" or "We recommend [workaround]." Thank you.
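For reference, a minimal sketch of the loading call in question; the material path, file name, package import, and realityKitContentBundle constant are placeholders for whatever the Reality Composer Pro package actually exports:

import RealityKit
import RealityKitContent  // assumed Swift package produced by Reality Composer Pro

func applyGraphMaterial(to model: ModelEntity) async {
    do {
        // Path and scene name are hypothetical; they must match the .usda in the package.
        let material = try await ShaderGraphMaterial(
            named: "/Root/UnlitTextureMaterial",
            from: "Scene.usda",
            in: realityKitContentBundle)
        model.model?.materials = [material]
    } catch {
        print("ShaderGraphMaterial load failed:", error)
    }
}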
6 · 0 · 822 · Aug ’24
Camera control in an immersive environment
Hello, I’m playing around with making a fully immersive multiplayer, air-to-air dogfighting game, but I’m having trouble figuring out how to attach a camera to an entity. I have a plane that’s controlled with a gamepad, and I want the camera’s position to be pinned to that entity as it moves about space, while maintaining the user's ability to look around. Is this possible?
From my understanding, the current state of SceneKit, ARKit, and RealityKit is a bit confusing with respect to what can and cannot be done.
SceneKit
• Full control of the camera
• Not sure if it can use RealityKit's ECS system
• 2D window – missing full immersion
ARKit
• Full control of the camera* – but only for non-Vision Pro devices, since visionOS doesn't have an ARView
• Has RealityKit's ECS system
• 2D window – missing full immersion
RealityKit
• Camera is pinned to the device's position and orientation
• Has RealityKit's ECS system
• Allows full immersion
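On iOS and macOS (not visionOS), RealityKit does let you drive the camera from an entity by parenting a PerspectiveCamera to it; a minimal sketch, with the ARView setup and plane entity standing in for the poster's gamepad-driven aircraft:

import RealityKit

// Assumed: a non-AR ARView (iOS shown) and a gamepad-driven plane entity.
let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)
let planeEntity = Entity()  // placeholder for the poster's aircraft entity

let anchor = AnchorEntity(world: .zero)
anchor.addChild(planeEntity)
arView.scene.addAnchor(anchor)

// A PerspectiveCamera entity overrides the default camera and follows its parent.
let camera = PerspectiveCamera()
camera.position = [0, 0.5, 1.5]  // sit slightly behind and above the plane
planeEntity.addChild(camera)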
0 · 1 · 391 · Aug ’24
Allow RealityView to render at 120 fps?
I'm building an iOS/iPadOS app for iOS 18+ using the new RealityView in SwiftUI. (I may add visionOS, but I'm not focusing on it right now.) The 3D scene I'm rendering is fairly simple (just a few dozen vertices and a couple of textures), and I'd like to render it at 120fps on ProMotion devices if possible. I tried setting CADisableMinimumFrameDurationOnPhone to true in the info plist, but it had no effect. The frame rate in the GPU Report in Xcode stays capped at 60fps, and the gauge even tops out at 60. My question is kind of the opposite of this post, which asks how to limit the frame rate of a RealityView. I'm on Xcode 16 beta 5 on macOS Sonoma and iOS 18.0 beta 6 on my iPhone 15 Pro.
1 · 0 · 454 · Aug ’24
AirPlay Streaming Metal-Processed Output in Real Time
Hey everyone, I'm working on an iOS app where I use AVPlayer to play videos, then process them through Metal to apply effects. The app has controls that let users tweak these effects in real time, and I want the final processed video to be streamed via AirPlay. I use a custom rendering layer with a Metal texture to display the processed video on the screen, and that works as intended. The problem is, when I try to AirPlay the video after feeding it the processed Metal frames, it’s just streaming the original video from AVPlayer, not the version with all the Metal effects. The final processed output is a Metal texture that gets rendered in an MTKView. I even tried capturing that texture and sending it through a new AVPlayer setup, but AirPlay still grabs the original, unprocessed video instead of the final, fully rendered output. It's also clear that the AirPlayed video has the full length of the original built in, so it's not even that it's 'live streaming' the wrong feed. I need help figuring out how to make AirPlay stream the live, processed video with all the effects, not just the raw video. Any ideas? Happy to share my code if that helps, but I'm not sure I have the right underlying approach yet. Thanks!
0 · 0 · 272 · Aug ’24
Dynamic loading various USDZ files
I have various USDZ files in my visionOS app. Loading the USDZ files works quite well; I only have problems with the positioning of the 3D model. For example, I have a USDZ file that is displayed directly above me. I can't move the model or perform any other actions on it, and if I sit down on a chair or stand up again, the 3D model automatically moves with me. This is my source code for loading the USDZ files:

struct ImmersiveView: View {
    @State var modelName: String
    @State private var loadedModel = Entity()

    var body: some View {
        RealityView { content in
            if let usdModel = try? await Entity(named: modelName) {
                print("====> \(modelName) : \(usdModel) <====")
                let bounds = usdModel.visualBounds(relativeTo: nil).extents
                usdModel.scale = SIMD3<Float>(1.0, 1.0, 1.0)
                usdModel.position = SIMD3<Float>(0.0, 0.0, 0.0)
                usdModel.components.set(CollisionComponent(shapes: [.generateBox(size: bounds)]))
                usdModel.components.set(HoverEffectComponent())
                usdModel.components.set(InputTargetComponent())
                loadedModel = usdModel
                content.add(usdModel)
            }
        }
    }
}

For now I only want the 3D models from the USDZ files to be displayed correctly; later on, I want to be able to move them via gestures. Moving the models is step 2. First, I need to make sure the models are displayed correctly. What have I forgotten or done wrong?
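For the later "step 2" (moving models with gestures), the usual visionOS pattern is a drag gesture targeted to entities; a minimal sketch, assuming the entity already carries the InputTargetComponent and CollisionComponent set in the code above:

import SwiftUI
import RealityKit

// Hypothetical helper: attach to the RealityView, e.g. RealityView { ... }.gesture(moveGesture)
var moveGesture: some Gesture {
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            guard let parent = value.entity.parent else { return }
            // Convert the gesture's 3D location into the entity's parent space.
            value.entity.position = value.convert(value.location3D, from: .local, to: parent)
        }
}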
2 · 0 · 554 · Aug ’24