Posts

Post not yet marked as solved
4 Replies
504 Views
Hello, I've been tinkering a bit with PortalComponent on visionOS and noticed that the content of the WorldComponent is always clipped to the mesh geometry of whatever entities have the PortalComponent applied. Now I'm wondering if there is any way or trick to let the portal's contents peek out – similar to the Encounter Dinosaurs experience on Vision Pro (I assume it also uses PortalComponent?). I saw that PortalComponent has a clippingPlane property (https://developer.apple.com/documentation/realitykit/portalcomponent/clippingplane-swift.property), but so far I haven't been able to achieve a perceptible visual difference with it. If possible, I would like to avoid hacky tricks like duplicate meshes to achieve this. Thanks for any hints!
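For reference, a minimal sketch of this kind of portal setup (assuming a RealityView on visionOS; the clippingPlane assignment is left commented out because its exact semantics are the open question here):

import RealityKit

let world = Entity()
world.components.set(WorldComponent())
// ... add the portal's content as children of `world` ...

let portalPlane = Entity()
portalPlane.components.set(ModelComponent(
    mesh: .generatePlane(width: 1, height: 1),
    materials: [PortalMaterial()]
))

var portal = PortalComponent(target: world)
// Assumption: a plane in the portal's local space that contents would be
// clipped against instead of the portal mesh itself.
// portal.clippingPlane = ...
portalPlane.components.set(portal)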
Post not yet marked as solved
2 Replies
509 Views
Hello, is there a recommended way to render Menu items, e.g. in a SwiftUI ContextMenu, with an icon (SF Symbols)? Let's say I have a setup like the one sketched below. Both buttons render fine on native macOS (e.g. Sonoma), but Catalyst refuses to render the symbol at all. I tried every possible combination I could think of. The only way I found was to copy a symbol directly from the SF Symbols app and inline it in the label string as Unicode. Unfortunately I have a couple of custom SF symbols, so this isn't really an option for me. I feel like this is a perfectly valid use case, as it makes the menu visually much easier to scan. With UIKit and Ventura this at least worked for menu bar items, but now it also seems broken on Sonoma. I would greatly appreciate any hints. Thanks!
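A representative version of the setup (reconstructed sketch; titles and symbol names are illustrative):

import SwiftUI

struct ContentView: View {
    var body: some View {
        Text("Right-click me")
            .contextMenu {
                // Icons render on native macOS, but not under Catalyst:
                Button {
                    // duplicate action
                } label: {
                    Label("Duplicate", systemImage: "plus.square.on.square")
                }
                Button(role: .destructive) {
                    // delete action
                } label: {
                    Label("Delete", systemImage: "trash")
                }
            }
    }
}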
Post not yet marked as solved
6 Replies
2.2k Views
Hi, is there a way in visionOS to anchor an entity to the POV via RealityKit? I need an entity that is always fixed to the 'camera'. I'm aware that this is discouraged from a design perspective, as it can be visually distracting. In my case, though, I want to use it to attach a fixed collider entity, so that the camera can collide with objects in the scene. Edit: ARView on iOS has a lot of very useful helper properties and functions like cameraTransform (https://developer.apple.com/documentation/realitykit/arview/cameratransform). How would I get this information on visionOS? RealityView's content does not seem to offer anything comparable. An example use case: I would like to add an entity to the scene at my user's eye level, basically depending on their height. I found https://developer.apple.com/documentation/realitykit/realityrenderer, which has an activeCamera property, but so far it's unclear to me in which context RealityRenderer is used and how I could access it. Appreciate any hints, thanks!
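One possible direction, sketched under the assumption that head anchoring is enough for the collider use case (note that in the shared space the system keeps the anchor's transform opaque to the app, so this pins content without exposing the camera transform):

// Inside a RealityView make closure; `content` is the RealityViewContent.
let headAnchor = AnchorEntity(.head)
let collider = Entity()
collider.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.2)]))
headAnchor.addChild(collider)
content.add(headAnchor)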
Post not yet marked as solved
0 Replies
1k Views
Hello, I've done a couple of tests and noticed that when I run an ARKit session with, for example, a world- or geo-tracking configuration that has environment texturing set to either .manual or .automatic, and I walk around with the device for an extended distance, there is a pretty noticeable increase in memory usage. I'm not implying that it's a leak, but it seems like the system creates lots and lots of environment probes and does not remove older ones. An example: a barebones Xcode RealityKit starter project that spawns a couple of cubes with an ARGeoTracking configuration. After ~7 minutes of walking around (roughly a distance of 100–200 meters) it uses an additional 300 MB of RAM. I've seen cases where users walked around for a while and the app eventually crashed because of this. When environment texturing is disabled, RAM usage pretty much stays the same no matter how far I walk. Is there a recommended way to handle this? Should I remove probes manually after a while, or eventually disable environment texturing altogether at some point (and will that preserve the current cube map)? I would appreciate any guidance. With the Xcode example project you can easily recreate the issue by modifying it just a little and then walking around for a while with statistics enabled.

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.automaticallyConfigureSession = false
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let config = ARWorldTrackingConfiguration()
        config.environmentTexturing = .manual
        arView.session.run(config)

        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
    }
}
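For the .manual case, a sketch of the probe bookkeeping the question alludes to (with manual texturing the app owns probe placement; maxProbes and the probe extent are illustrative values):

import ARKit

var probes: [AREnvironmentProbeAnchor] = []
let maxProbes = 10  // illustrative cap – tune against your memory budget

func addProbe(at transform: simd_float4x4, to session: ARSession) {
    let probe = AREnvironmentProbeAnchor(transform: transform,
                                         extent: simd_float3(repeating: 5))
    session.add(anchor: probe)
    probes.append(probe)

    // Evict the oldest probe once the cap is exceeded.
    if probes.count > maxProbes {
        session.remove(anchor: probes.removeFirst())
    }
}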
Post not yet marked as solved
0 Replies
657 Views
Hello, I recently converted from manual dictionary-based JSON serialisation to Codable and noticed that this resulted in a pretty significant growth in binary size (I looked that up via the App Thinning Size Report). The difference between the Codable and non-Codable versions is ~800 KB. As our app also supports App Clips, I now can't meet the 10 MB universal bundle size limit anymore. Is there any way I can make this leaner? Would it help to manually implement all Codable methods instead of relying on compiler synthesis? Thanks for any hints!
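For what it's worth, a sketch of what "manually implement" means here – hand-writing the conformance instead of letting the compiler synthesize it (the type and keys are illustrative):

import Foundation

struct User: Codable {
    let id: Int
    let name: String

    private enum CodingKeys: String, CodingKey {
        case id, name
    }

    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        id = try container.decode(Int.self, forKey: .id)
        name = try container.decode(String.self, forKey: .name)
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(id, forKey: .id)
        try container.encode(name, forKey: .name)
    }
}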
Post not yet marked as solved
0 Replies
844 Views
Hello, after a recent Apple Maps update last week, the satellite imagery now looks a lot worse than before. What previously looked lifelike and lush now looks very sad and wintry, and the contrast seems way too extreme. Attached is a sample image. FB: FB11716831. Any chance that this could be reverted to the old version?
Post not yet marked as solved
3 Replies
822 Views
Hi, please let me know if I should rather file feedback for this, but I figured it's worth flagging one way or another. Tested with Xcode version 14.0 beta 6 (14A5294g).

1. Project »Altering RealityKit Rendering with Shader Functions«: This project crashes right away when running it on a device (iOS 15 and 16). Screenshot:

2. Project »Using object capture assets in RealityKit«: Suffers from pretty bad performance when run on a device – barely scratching 20–25 fps on an iPhone 12 Pro, and even less on an iPhone XS. Screenshot:

As these are official sample projects, I feel they should work flawlessly out of the box. Best, Arthur
Post not yet marked as solved
4 Replies
883 Views
Hello, on iOS 16, when I retrieve an existing material from a model entity and update its blending property to .transparent(opacity: …), the color or baseColor texture gets removed after reassigning the updated material. My use case is that I want to fade in a ModelEntity through a custom System and therefore need to repeatedly reassign the opacity value. I've tested this with UnlitMaterial and PhysicallyBasedMaterial – both suffer from this issue. On iOS 15 this works as expected. Please let me know if there is any workaround, as this seems like a major regression to me, and ideally I need this to work once iOS 16 is released to the public. The radar number, including a sample project, is FB11420976. Thank you!
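A minimal sketch of the fade pattern in question (FadeComponent and the timing logic are illustrative; the material reassignment in update is the step that triggers the regression described above):

import RealityKit

// Register once at startup:
// FadeComponent.registerComponent(); FadeSystem.registerSystem()
struct FadeComponent: Component {
    var opacity: Float = 0
}

class FadeSystem: System {
    private static let query = EntityQuery(where: .has(FadeComponent.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            guard let model = entity as? ModelEntity,
                  var fade = entity.components[FadeComponent.self],
                  var material = model.model?.materials.first as? PhysicallyBasedMaterial
            else { continue }

            // Advance the fade and write the new opacity into the material.
            fade.opacity = min(fade.opacity + Float(context.deltaTime), 1)
            material.blending = .transparent(opacity: .init(floatLiteral: fade.opacity))

            // Reassigning the updated material is what drops the
            // color/baseColor texture on iOS 16.
            model.model?.materials = [material]
            entity.components.set(fade)
        }
    }
}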
Post not yet marked as solved
8 Replies
2.9k Views
Hello, in our app we are downloading some user-generated content (.reality files and USDZs) and displaying it within the app. This worked without issues on iOS 14, but with iOS 15 (release version) there have been a lot of issues with certain .reality files. As far as I can see, USDZ files still work. I've created a little test project, and the error message log is not really helpful:

2021-10-01 19:42:30.207645+0100 RealityKitAssetTest-iOS15[3239:827718] [Assets] Failed to load asset of type 'RealityFileAsset', error: Could not find archive entry named assets/Scéna17_9dfa3d0.compiledscene.
2021-10-01 19:42:30.208097+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] Failed to load asset path '#18094855536753608259'
2021-10-01 19:42:30.208117+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] AssetLoadRequest failed because asset failed to load '#18094855536753608259'
2021-10-01 19:42:30.307040+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
(the "throwing -10878" line repeats nine times in total)

▿ Failed to load loadRequest.
  - generic: "Failed to load loadRequest."

Basic code structure that is used for loading:

cancellable = Entity.loadAsync(named: entityName, in: .main)
    .sink { completion in
        switch completion {
        case .failure(let error):
            dump(error)
            print("Done")
        case .finished:
            print("Finished loading")
        }
    } receiveValue: { entity in
        print("Entity: \(entity)")
    }

Is there any way to force it to load in a mode that enforces compatibility? As mentioned, this only happens on iOS 15. Even ARQuickLook can't display the files anymore (no issues on iOS 14). Thanks for any help!
Post not yet marked as solved
3 Replies
2.4k Views
Hi, since iOS 15 I've repeatedly noticed the console warning »ARSessionDelegate is retaining X ARFrames. This can lead to future camera frames being dropped« even for rather simple projects using RealityKit and ARKit. Could someone from the ARKit team please elaborate on what causes this warning and what can be done to avoid it? If I remember correctly, I didn't even assign an ARSessionDelegate. Thank you!
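For context, a sketch of the usual cause and remedy when a delegate is involved (the delegate method is real ARKit API; the surrounding handling is illustrative):

import ARKit

extension MySessionDelegate: ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Retaining `frame` (e.g. storing it in a property or capturing it in a
        // long-running async task) keeps its capturedImage pixel buffer alive.
        // ARKit recycles those buffers from a small fixed pool, so retained
        // frames eventually force new camera frames to be dropped.
        let cameraTransform = frame.camera.transform  // copy out what you need
        print(cameraTransform)                        // ...and let `frame` go
    }
}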
Post not yet marked as solved
0 Replies
730 Views
Hello, I've been looking all over the place, but so far I haven't found a trivial way to grab ARView's current timestamp – basically the elapsed time since the scene started rendering. I can access that in the surface and geometry shaders, but I would like to pass a timestamp as a parameter in order to drive shader animations. I feel that's more efficient than injecting animation progress manually on every frame, especially if there are lots of objects with that shader. So far what I've done is subscribe to the scene's Update event and use the delta time to calculate the elapsed time myself. But this is quite error-prone and tends to break when I present the scene a second time (e.g. closing and reopening the AR experience). The only other option I found was using a render callback and grabbing the time property from the PostProcessContext. That works well, but do I really have to go that route? It would be great if there were an easy way to achieve this. Pretty much an equivalent to this: https://developer.apple.com/documentation/scenekit/scnscenerenderer/1522680-scenetime NOTE: I'm not looking for the timestamp of the ARSession's current frame. Thank you!
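For reference, a sketch of the accumulation workaround described above (the explicit reset and the single driven entity are illustrative):

import Combine
import RealityKit

var elapsedTime: TimeInterval = 0
var updateSubscription: Cancellable?

func startSceneClock(for arView: ARView, driving modelEntity: ModelEntity) {
    elapsedTime = 0  // reset explicitly whenever the scene is (re)presented
    updateSubscription = arView.scene.subscribe(to: SceneEvents.Update.self) { event in
        elapsedTime += event.deltaTime
        // Push the running time into the custom material as a shader parameter.
        guard var material = modelEntity.model?.materials.first as? CustomMaterial else { return }
        material.custom.value = SIMD4<Float>(Float(elapsedTime), 0, 0, 0)
        modelEntity.model?.materials = [material]
    }
}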
Post marked as solved
1 Reply
3.3k Views
I've just implemented some necessary fixes in our app to ensure compatibility with iOS 15 and RealityKit 2. Now when I deploy the app via Xcode 13 RC (Version 13.0 (13A233)) I get the following crash right at launch:

dyld: Library not loaded: /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
  Referenced from: /private/var/containers/Bundle/Application/…
  Reason: image not found
dyld: launch, loading dependent libraries
DYLD_LIBRARY_PATH=/usr/lib/system/introspection
DYLD_INSERT_LIBRARIES=/Developer/usr/lib/libBacktraceRecording.dylib:/Developer/usr/lib/libMainThreadChecker.dylib:/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib

Is this a known issue? I've already tried deleting Derived Data and cleaning the project, but the problem persists. The minimum deployment target is iOS 13 for the main app and iOS 14 for the App Clip. All iOS 15-related fixes are wrapped in if #available(iOS 15.0, *) { … }. This is a pretty major problem for us, as we now can't send out TestFlight builds or upload to App Store Connect by Monday. Thanks!
Post not yet marked as solved
1 Reply
909 Views
Hi, is there a way to force RealityKit to compile/prewarm and cache all shaders that will be used within a scene in advance – ideally in the background? This would be useful for adding complex models to the scene, which can sometimes cause quite a few dropped frames even on the newest devices (at least I assume the initial delay when displaying them is caused by shader compilation), but also for CustomMaterials. Note that this also happens with models that are loaded asynchronously. Thanks!
Post not yet marked as solved
1 Reply
1.3k Views
Hello, I have a use case where I need to download and compile Metal shaders on demand, as strings or .metal files. These should then be used for CustomMaterials and/or post-processing within RealityKit. Essentially this boils down to having raw source code that needs to be compiled at runtime. My plan was to use the method makeLibrary(source:options:completionHandler:) to accomplish this. The problem is that I get the following error during compilation:

RealityKitARExperienceAssetProvider: An error occured while trying to compile shader library »testShaderLibrary« - Error Domain=MTLLibraryErrorDomain Code=3 "program_source:2:10: fatal error: 'RealityKit/RealityKit.h' file not found #include <RealityKit/RealityKit.h>

My code for creating the library looks like this (simplified example):

let librarySourceString: String = """
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void mySurfaceShader(realitykit::surface_parameters params)
{
    params.surface().set_base_color(half3(1, 1, 1));
}
"""

mtlDevice.makeLibrary(source: librarySourceString, options: nil) { library, error in
    if let error = error {
        dump(error)
        return
    }
    // do something with library
}

So I'm wondering if there's a way to tell the Metal compiler how to resolve this reference to the RealityKit header file? Would I need to replace that part of the source string, maybe with an absolute path to the RealityKit framework (and if so, how would I get that at runtime)? Appreciate any hints – thanks!
Post not yet marked as solved
5 Replies
1.4k Views
Hello, I have a cylinder with an UnlitMaterial and a base color set. Now I want to apply a gradient as the alpha mask so I get this kind of halo, GTA-like checkpoint look. The code:

var baseMaterial = UnlitMaterial(color: UIColor.red)
// maskTextureResource is the gradient mask
baseMaterial.blending = .transparent(opacity: .init(scale: 100, texture: .init(maskTextureResource)))
baseMaterial.opacityThreshold = 0

This works but unfortunately leads to some ugly, visible gradient banding. I've also tried playing with the scale of the blending texture, but that did not help. As an alternative approach I tried to solve this via a custom surface shader. Code below:

[[visible]]
void gradientShader(realitykit::surface_parameters params)
{
    auto surface = params.surface();
    float2 uv = params.geometry().uv0();

    float h = 0.5; // adjust position of middleColor
    half startAlpha = 0.001;
    half middleAlpha = 1;
    half endAlpha = 0.001;

    half alpha = mix(mix(startAlpha, middleAlpha, half(uv.y / h)),
                     mix(middleAlpha, endAlpha, half((uv.y - h) / (1.0 - h))),
                     half(step(h, uv.y)));

    surface.set_emissive_color(half3(params.material_constants().emissive_color()));
    surface.set_base_color(half3(params.material_constants().base_color_tint()));
    surface.set_opacity(alpha);
}

The result looks really nice and smooth, but unfortunately this now also culls the inner part of the cylinder, even on the semitransparent parts. What I want is to have the effect visible on both the outer and the inner part of the cylinder, so the transparent part of the outside lets you see through to the inside. I got this working by using a PhysicallyBasedMaterial instead of an UnlitMaterial (which does not support blending out of the box), but again ran into the banding issue. On my CustomMaterial, faceCulling is set to .none. Here is how it currently looks – as you can see, in the left one the alpha mask is not smooth and has banding artifacts: Thank you for any help!