Reply to Custom 3D Window Using RealityView
Hi, thanks for the reply; I think I wasn't clear about what's going on. I have a window with a RealityView in it. Currently that RealityView presents a Reality Composer scene. When I look at that window in the compiled app, the contents sit physically in front of the actual window, and moving them back in the scene has no effect at all. Since posting this, I have experimented with calling findEntity on the scene and pulling out a Transform that parents a ModelEntity. Doing that lets me manipulate the depth of the ModelEntity relative to that Transform. But it is surprising that I can't do the same thing with the scene itself; I have to extract individual scene elements to adjust their depth.
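For anyone who finds this later, here is a rough sketch of the workaround, assuming the standard RealityKitContent package from the visionOS template; the entity names "DepthPivot" and "Scene" are placeholders for whatever is in your Reality Composer scene:

import SwiftUI
import RealityKit
import RealityKitContent

struct ModelWindow: View {
    var body: some View {
        RealityView { content in
            // Load the Reality Composer scene ("Scene" is a placeholder name).
            guard let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) else { return }
            content.add(scene)

            // Pull out the Transform entity that parents the model entity...
            if let pivot = scene.findEntity(named: "DepthPivot"),
               let model = pivot.children.first {
                // ...and adjust the model's depth relative to that Transform.
                // Negative z moves it farther from the viewer.
                model.position.z = -0.25
            }
        }
    }
}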
2w
Reply to SwiftUI and @FetchRequest - modify predicate or sort dynamically?
Extending what has already been suggested, you can modify the predicate of the wrappedValue to dynamically trigger an update to a SwiftUI list. In the example below, toggling the isFiltered @State causes the List to update and reload the FetchedResults with the filtered or unfiltered predicate.

@FetchRequest private var items: FetchedResults<Item>
@State var isFiltered = false

let element: Element
let filteredPredicate: NSPredicate
let unfilteredPredicate: NSPredicate

init(element: Element) {
    self.element = element
    filteredPredicate = NSPredicate(format: "element == %@ && score > 0.85", element)
    unfilteredPredicate = NSPredicate(format: "element == %@", element)
    self._items = FetchRequest<Item>(
        entity: Item.entity(),
        sortDescriptors: [NSSortDescriptor(keyPath: \Item.name, ascending: true)],
        predicate: unfilteredPredicate,
        animation: .default)
}

var listItems: FetchedResults<Item> {
    _items.wrappedValue.nsPredicate = isFiltered ? filteredPredicate : unfilteredPredicate
    return items
}

var body: some View {
    List {
        ForEach(Array(listItems.enumerated()), id: \.element) { index, item in
            Text(item.name)
        }
    }
    .toolbar {
        ToolbarItem {
            Button {
                withAnimation {
                    isFiltered.toggle()
                }
            } label: {
                Label("Filter Items", systemImage: isFiltered ? "star.circle.fill" : "star.circle")
            }
        }
    }
}
Jun ’22
Reply to ARKit 5 Motion Capture Enabled?
I cannot speak to your difficulties. I have a pre-existing app that successfully uses motion capture on iOS 13 and 14. In my test I ran the same app simultaneously on a device running the iOS 15 beta and another running iOS 14.6, recorded a video of each, and compared them. Motion capture works on 13.5, 14, and 15, but the new precision of ARKit 5 is supposed to be limited to devices with the A14 chip.
Jul ’21
Reply to iOS 14 change in performance cost of SCNNode creation
I ended up removing my SCNNode instantiations and finding another way to handle my situation, but I have also found that I have problems modifying a node's transform while a UIKit animation is running. It seems to cause a roughly one-second hitch in my frame rate every second or so, even when the node in question is not visible. I tried the SceneKit profiling you suggested, but at least with my current setup I am not seeing any compile events, and no other clear culprits in the event durations.
Oct ’20
Reply to ARKit Behaviors Notification
When you work with a Reality Composer scene in RealityKit, Xcode generates code with properties and methods specific to your scene. If you have a Reality Composer file called MyTestApp and a scene called MyScene, and you create a notification trigger with an identifier of HideObject, then the generated code exposes that notification on your scene object in the app. For example:

MyTestApp.loadMySceneAsync { result in
    switch result {
    case .success(let scene):
        scene.notifications.hideObject.post()
    default:
        break
    }
}
Oct ’20
Reply to Height above ground
I store the y position of ARPlaneAnchors as they come in from session(_:didUpdate:), and then I create an ARAnchor for my ground plane with a transform using that y position. If my ground plane changes, I remove that ARAnchor and replace it with a new one. Then, when placing elements on the ground, I use that ARAnchor as a reference (in my case I've added a RealityKit AnchorEntity anchored to that ground ARAnchor).
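Roughly, it looks something like the sketch below. How the ground y is chosen (the lowest stored plane here) and the names GroundTracker and groundAnchor are just illustrative:

import ARKit
import RealityKit

final class GroundTracker: NSObject, ARSessionDelegate {
    weak var arView: ARView?
    private var planeHeights: [UUID: Float] = [:]
    private var groundAnchor: ARAnchor?
    private var groundAnchorEntity: AnchorEntity?

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let arView = arView else { return }

        // Store the y position of each plane anchor as it comes in.
        for case let plane as ARPlaneAnchor in anchors {
            planeHeights[plane.identifier] = plane.transform.columns.3.y
        }

        // Treat the lowest stored plane as the current ground estimate.
        guard let groundY = planeHeights.values.min() else { return }

        // If the estimate hasn't moved meaningfully, keep the existing anchor.
        if let current = groundAnchor, abs(current.transform.columns.3.y - groundY) < 0.01 {
            return
        }

        // Otherwise remove the old ground anchor and replace it with a new one.
        if let old = groundAnchor {
            session.remove(anchor: old)
            groundAnchorEntity?.removeFromParent()
        }
        var transform = matrix_identity_float4x4
        transform.columns.3.y = groundY
        let newAnchor = ARAnchor(name: "ground", transform: transform)
        session.add(anchor: newAnchor)
        groundAnchor = newAnchor

        // Anchor a RealityKit AnchorEntity to the ground ARAnchor;
        // ground-level content gets parented to this entity.
        let anchorEntity = AnchorEntity(anchor: newAnchor)
        arView.scene.addAnchor(anchorEntity)
        groundAnchorEntity = anchorEntity
    }
}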
Oct ’20
Reply to App suddenly runs out of memory while using ARKit, CoreMotion, and SceneKit
The standard first line of attack on retain cycles is making sure you've got weak references for any delegates. The next is ensuring you're using [weak self] in closures that a long-lived object holds onto, like NSNotification observer blocks or sink closures if you use Combine. I was having lots of retain cycle difficulties with my ARKit / SceneKit app, and addressing all of those items took care of the problem. Good luck!
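For concreteness, a small sketch of both fixes; the SessionController type and the notification used here are made up:

import Combine
import UIKit

protocol SessionControllerDelegate: AnyObject {
    func sessionDidFinish()
}

final class SessionController {
    // Delegates are declared weak so the controller and its delegate
    // don't retain each other.
    weak var delegate: SessionControllerDelegate?

    private var cancellables = Set<AnyCancellable>()

    init() {
        // Use [weak self] in long-lived closures (NotificationCenter
        // observers, Combine sinks, etc.) so they don't keep self alive.
        NotificationCenter.default.publisher(for: UIApplication.didEnterBackgroundNotification)
            .sink { [weak self] _ in
                self?.pause()
            }
            .store(in: &cancellables)
    }

    func pause() {
        // Pause the AR session, animations, etc.
    }
}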
Oct ’20