Post · Replies · Boosts · Views · Activity

Reduce Memory footprint of RealityKit App
I'm noticing about 450 MB of memory footprint when loading a simple 2 MB USDZ model. To rule out any misuse of the frameworks on my part, I built a basic RealityKit app from Xcode's Augmented Reality App template, with no code changes at all, and I'm still seeing 450 MB in Xcode's memory gauge (so in debug mode). Looking at the memgraph, the IOAccelerator and IOSurface regions have 194 MB and 131 MB of dirty memory respectively. Is this all camera-related memory?

In the hope of reducing compute and memory, I tried disabling various rendering options on ARView as follows:

arView.renderOptions = [
    .disableHDR,
    .disableDepthOfField,
    .disableMotionBlur,
    .disableFaceMesh,
    .disablePersonOcclusion,
    .disableCameraGrain,
    .disableAREnvironmentLighting
]

This brought it down to 300 MB, which is still quite a lot. Even when I set ARView.cameraMode to .nonAR it is still 113 MB. I'm running this on an iPhone 13 Pro Max, which could explain some of the large allocations, but I would still like to find opportunities to reduce the footprint. When I load the same model (~2 MB) in QLPreviewController, it takes only 27 MB in the Xcode gauge. Any ideas on reducing this memory footprint while using ARKit? (A stripped-down configuration sketch follows below.)
6 replies · 2 boosts · 2.1k views · May ’22
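For reference, here is a minimal sketch (not from the post itself) of the stripped-down setup described above: a non-AR ARView with the same render options disabled and a small USDZ loaded asynchronously. The model name "chair", the plain UIKit view controller, and the camera placement are assumptions made for illustration only.

import UIKit
import Combine
import RealityKit

// Minimal sketch of the configuration described in the post above.
// Assumptions: a plain UIKit app and a bundled model named "chair.usdz".
final class MinimalRealityViewController: UIViewController {
    private var arView: ARView!
    private var cancellables = Set<AnyCancellable>()

    override func viewDidLoad() {
        super.viewDidLoad()

        // .nonAR skips the camera feed entirely, which avoids the
        // camera-related IOSurface allocations of the AR camera mode.
        arView = ARView(frame: view.bounds,
                        cameraMode: .nonAR,
                        automaticallyConfigureSession: false)

        // Same render options the post disables, to trim GPU buffers.
        arView.renderOptions = [
            .disableHDR,
            .disableDepthOfField,
            .disableMotionBlur,
            .disableFaceMesh,
            .disablePersonOcclusion,
            .disableCameraGrain,
            .disableAREnvironmentLighting
        ]
        view.addSubview(arView)

        // In .nonAR mode a camera entity is needed to see the content.
        let cameraAnchor = AnchorEntity(world: [0, 0, 1])
        cameraAnchor.addChild(PerspectiveCamera())
        arView.scene.addAnchor(cameraAnchor)

        // Load the USDZ asynchronously and anchor it at the world origin.
        ModelEntity.loadModelAsync(named: "chair")
            .sink(receiveCompletion: { _ in },
                  receiveValue: { [weak self] model in
                      let anchor = AnchorEntity(world: .zero)
                      anchor.addChild(model)
                      self?.arView.scene.addAnchor(anchor)
                  })
            .store(in: &cancellables)
    }
}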
Registering library that already exists in shader manager
I'm trying to apply a Metal surface shader to my ModelEntity's CustomMaterial. While everything seems to work as expected and I'm getting the desired result, I'm seeing the following error/warning continuously spewed in the console:

[AssetTypes] Registering library (memorylib_10793780016) that already exists in shader manager. Library will be overwritten.

Any ideas what I'm doing wrong? (A sketch of the setup is included below.)
1 reply · 0 boosts · 1k views · Mar ’22
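Not from the post, but for context: one possible (unconfirmed) cause of this warning is constructing a fresh MTLLibrary or CustomMaterial.SurfaceShader for each entity or update, so the same in-memory library keeps getting re-registered. The sketch below builds the library and shader once and reuses them; the function name "mySurfaceShader" is a hypothetical placeholder for whatever surface shader the app defines in its .metal sources.

import Metal
import RealityKit

// Sketch: build the Metal library and surface shader once, then reuse them.
// "mySurfaceShader" is a hypothetical function name in the app's default
// Metal library.
enum ShaderCache {
    static let device = MTLCreateSystemDefaultDevice()!
    static let library = device.makeDefaultLibrary()!
    static let surfaceShader = CustomMaterial.SurfaceShader(named: "mySurfaceShader",
                                                            in: library)
}

func applyCustomMaterial(to modelEntity: ModelEntity) throws {
    // Reusing the cached shader avoids creating a new library every time
    // a material is (re)built.
    let material = try CustomMaterial(surfaceShader: ShaderCache.surfaceShader,
                                      lightingModel: .lit)
    modelEntity.model?.materials = [material]
}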
Prevent Virtual Objects translating into Physical Walls in response to TranslationGesture
Hi, after placing a virtual object (a chair) from a USDZ onto a horizontal plane, I set up gestures something like this:

modelEntity.generateCollisionShapes(recursive: true)
arView?.installGestures([.translation, .rotation], for: modelEntity)

Translation and rotation seem to work fine, but when I drag the virtual object towards a wall, it "goes into" the wall instead of stopping at it. Is there a way to achieve this in RealityKit? Do I need to implement my own gesture recognizers and perform a raycast in each pan-gesture callback? (A sketch of that approach is below.) Thank you
0 replies · 0 boosts · 454 views · Jan ’22
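Not part of the original post: a rough sketch of the raycast-per-pan idea the question asks about. It replaces the built-in translation gesture with a custom pan handler that raycasts the touch point against horizontal plane geometry and skips moves that land too close to a detected vertical plane. The 0.3 m margin and the distance-to-anchor-center check are crude assumptions, not a real point-to-plane distance test.

import UIKit
import simd
import ARKit
import RealityKit

// Sketch: drag an entity by raycasting the touch location onto horizontal
// planes, and reject target positions near detected vertical planes (walls).
final class WallAwareDragHandler: NSObject {
    private weak var arView: ARView?
    private weak var modelEntity: ModelEntity?

    init(arView: ARView, modelEntity: ModelEntity) {
        self.arView = arView
        self.modelEntity = modelEntity
        super.init()
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
        arView.addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard let arView = arView, let model = modelEntity else { return }
        let point = gesture.location(in: arView)

        // Raycast against existing horizontal plane geometry only, so the
        // drag target never resolves to a point on a wall itself.
        guard let hit = arView.raycast(from: point,
                                       allowing: .existingPlaneGeometry,
                                       alignment: .horizontal).first else { return }

        let target = SIMD3<Float>(hit.worldTransform.columns.3.x,
                                  hit.worldTransform.columns.3.y,
                                  hit.worldTransform.columns.3.z)

        // Crude heuristic: reject targets within 0.3 m of any detected
        // vertical plane anchor's center (an assumed margin, not a true
        // plane-distance computation).
        let wallAnchors = arView.session.currentFrame?.anchors
            .compactMap { $0 as? ARPlaneAnchor }
            .filter { $0.alignment == .vertical } ?? []
        let tooCloseToWall = wallAnchors.contains { wall in
            let wallPosition = SIMD3<Float>(wall.transform.columns.3.x,
                                            wall.transform.columns.3.y,
                                            wall.transform.columns.3.z)
            return simd_distance(target, wallPosition) < 0.3
        }
        if !tooCloseToWall {
            model.setPosition(target, relativeTo: nil)
        }
    }
}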