ARKit Remote Rendering Examples
Hi, I am developing an AR application that uses an ARSession as the camera source and a Metal view as the rendering target on iPhone (12 Pro Max as the test device, Xcode 13.3 (13E113)). Since I am trying to render a 3D model with a high polygon count, I want to set up a remote server to handle the actual rendering and reduce the load on the iPhone. Are there any examples of this approach for ARKit? I am looking for something like Microsoft's Holographic Remoting (documented here: https://docs.microsoft.com/en-us/windows/mixed-reality/develop/native/holographic-remoting-player). Any similar example for iOS, ideally using re-projection the way HoloLens 2 does (if the API allows it), would be great.
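
Not part of the question, but for illustration: a minimal sketch of the client side of such a remoting loop, assuming a hypothetical sendPoseToRenderServer(_:) transport (everything except the ARKit calls is an assumption, including the viewport size and clip planes). Each ARFrame's camera pose and projection matrix are forwarded so a server could render the heavy model from the device's viewpoint and stream frames back; late-stage re-projection against the newest pose would then happen on the device before display:

    import ARKit
    import UIKit

    struct CameraPose {
        var transform: simd_float4x4   // world-from-camera transform for this frame
        var projection: simd_float4x4  // projection matrix matching the viewport
        var timestamp: TimeInterval
    }

    final class RemoteRenderClient: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            let camera = frame.camera
            var pose = CameraPose(
                transform: camera.transform,
                projection: camera.projectionMatrix(for: .landscapeRight,
                                                    viewportSize: CGSize(width: 1920, height: 1440),
                                                    zNear: 0.01, zFar: 100),
                timestamp: frame.timestamp)
            // Forward the pose to the render server on every frame.
            sendPoseToRenderServer(Data(bytes: &pose, count: MemoryLayout<CameraPose>.size))
        }
    }

    // Hypothetical transport; in practice a low-latency channel such as UDP or WebRTC.
    func sendPoseToRenderServer(_ data: Data) { }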
0 replies · 0 boosts · 747 views · Apr ’22
Cannot remove ARPlaneAnchors properly [iOS 15.* & 16.*]
I am developing an application with an active ARSession. The session runs an ARWorldTrackingConfiguration whose planeDetection I set to one of the modes, or to both as [.horizontal, .vertical]. This is controlled by a button and a dropdown menu: the dropdown chooses between horizontal, vertical, or both, and the button toggles plane tracking entirely (sets planeDetection = []). When a mode change or toggle happens, I want to print a correct list of anchors to the console, because I will use that list later on.

When a mode change occurs, I perform the following steps inside a switch:

    switch planeTrackingMode {
    case .horizontal:
        configuration.planeDetection = [.horizontal]
        for anchor in (planeAnchors ?? []) {
            if (anchor as? ARPlaneAnchor)?.alignment == .vertical {
                session?.remove(anchor: anchor)
            }
        }
    case .vertical:
        configuration.planeDetection = [.vertical]
        for anchor in (planeAnchors ?? []) {
            if (anchor as? ARPlaneAnchor)?.alignment == .horizontal {
                session?.remove(anchor: anchor)
            }
        }
    case .horizontalAndVertical:
        configuration.planeDetection = [.horizontal, .vertical]
    default:
        configuration.planeDetection = []
        for anchor in (planeAnchors ?? []) {
            if anchor is ARPlaneAnchor {
                session?.remove(anchor: anchor)
            }
        }
    }
    session?.run(configuration, options: [])

The button toggle performs the same steps as the default case.

The issue is that after disabling tracking or removing vertical anchors, the previous anchors remain in memory with the same UUIDs. Even if I disable tracking via the button, turn the phone so the camera cannot see the vertical plane, and then turn tracking back on, the vertical plane is still there. When I log the anchor list right after deletion, it looks fine. But when I print inside the session(_ session: ARSession, didUpdate anchors: [ARAnchor]) delegate function, I see that none of the previous vertical plane anchors were deleted; some of them start spinning and slowly drifting away from their original positions, and eventually their positions turn into NaN values.

I tried getting the list of existing anchors both from currentFrame.anchors and via the session.getCurrentWorldMap function; the results are the same. There is no ARSCNView or anything like that involved, just an ARSession, its camera, and a Metal view to render things. Even session?.run(configuration, options: [.removeExistingAnchors]), which should remove all anchors, shows the same behaviour. If ARSession is doing some cache magic, it seems to be doing it wrong.

If anyone can replicate this, please let me know whether it is a bug. The behaviour is the same on iPhone 12 Pro Max and iPad Pro (6th generation), on both iOS 15.* and 16.* versions.
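
Not part of the original post, but a minimal delegate sketch for auditing what the session actually removes; it assumes this object is assigned as the ARSession's delegate, and the class name is illustrative. If an identifier logged in didRemove(_:) keeps reappearing in didUpdate(_:), the stale-anchor behaviour described above is reproduced:

    import ARKit

    final class AnchorAuditor: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {
            for anchor in anchors where anchor is ARPlaneAnchor {
                print("removed plane anchor:", anchor.identifier)
            }
        }

        func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
            for case let plane as ARPlaneAnchor in anchors {
                // columns.3 holds the anchor's translation; NaNs would show up here first.
                print("still updating:", plane.identifier, plane.transform.columns.3)
            }
        }
    }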
0 replies · 0 boosts · 788 views · May ’23
Cursor data on visionOS
I am working on an application where we plan to use Metal to render custom content directly. When the user looks at something in the rendered image, I want to get the cursor position or ray (the point the user is currently looking at) so I can render something else, like a crosshair. Is it possible to get this cursor information on visionOS? How can I know whether something is being hovered over by the eyes?
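
Not from the post, but for context: as far as I know, visionOS deliberately does not expose per-frame gaze data to apps for privacy reasons; the system applies hover feedback out of process. For RealityKit content, the supported route is HoverEffectComponent, which highlights an entity while the user looks at it without ever revealing gaze data to the app; I am not aware of an equivalent for fully custom Metal rendering. A minimal sketch (the function name and collision shape are illustrative):

    import RealityKit

    func addHoverHighlight(to entity: Entity) {
        // The entity needs collision and input-target components to be hoverable.
        entity.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.1)]))
        entity.components.set(InputTargetComponent())
        // The system renders the highlight itself; the app never sees where the user looks.
        entity.components.set(HoverEffectComponent())
    }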
4 replies · 1 boost · 1k views · Mar ’24