Discuss augmented reality and virtual reality app capabilities.

Posts under AR / VR tag

106 Posts

WorldTrackingProvider's queryDeviceAnchor is not giving a correct deviceAnchor
I'm constructing a RealityView where I'd like to display content in front of the user's face. When testing, I found that the deviceAnchor I initially get is wrong, so I implemented the following code to wait until the deviceAnchor I get from worldTrackingProvider has a correct value:

```swift
private let arkitSession = ARKitSession()
private let worldTrackingProvider = WorldTrackingProvider()

var body: some View {
    RealityView { content, attachments in
        Task {
            do {
                // Start the world-tracking provider.
                try await arkitSession.run([worldTrackingProvider])
                // Continuously query deviceAnchor until it is valid.
                var deviceAnchor: DeviceAnchor?
                while deviceAnchor == nil ||
                      !checkDeviceAnchorValid(Transform(matrix: deviceAnchor!.originFromAnchorTransform).translation) {
                    deviceAnchor = worldTrackingProvider.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
                }
                let cameraTransform = Transform(matrix: deviceAnchor!.originFromAnchorTransform)
                // ...code that updates my entity's translation
            } catch {
                print("Error: \(error)")
            }
        }
    }
}

private func checkDeviceAnchorValid(_ translation: SIMD3<Float>) -> Bool {
    // Checks whether the deviceAnchor has a valid translation.
}
```

However, I found that sometimes I can't get out of the while loop above. Not because the rules inside checkDeviceAnchorValid are too strict, but because the translation I get from deviceAnchor is always invalid (it stays [0, 0, 0] and never changes). Why is this happening? Is this a known issue? I'd also like to know whether I can get a callback once worldTrackingProvider returns a correct deviceAnchor.
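One pattern that may help (a sketch, not a confirmed fix; it assumes the visionOS ARKit API): check the provider's state and the anchor's isTracked flag, and yield between polls instead of spinning in a tight loop, which can starve the session:

```swift
import ARKit
import RealityKit
import QuartzCore

// A minimal sketch: poll until the provider is running and the anchor
// reports valid tracking, yielding between attempts.
func waitForValidDeviceAnchor(_ provider: WorldTrackingProvider) async -> DeviceAnchor? {
    while true {
        if provider.state == .running,
           let anchor = provider.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()),
           anchor.isTracked,
           Transform(matrix: anchor.originFromAnchorTransform).translation != .zero {
            return anchor
        }
        // Yield instead of busy-waiting so the session can make progress.
        do { try await Task.sleep(for: .milliseconds(50)) } catch { return nil }
    }
}
```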
2 replies · 0 boosts · 577 views · Jan ’24
CFD simulation on Vision Pro
Hi, I am required to upload my CFD simulation results to the new Vision Pro glasses. The simulation should be visible as a soft VR/AR object in the room. I am very new to the developer world. Could someone give me a hint as to which IDE, tools, etc. to use for this task? SwiftUI, Swift, visionOS, Xcode...? Once I know which IDE/tool/language to use, I will start taking courses on it. Thanks a lot!!
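For orientation, the usual stack here is Xcode with Swift, SwiftUI, and RealityKit targeting visionOS. A rough sketch of displaying an exported 3D result in a window (the USDZ name "cfd_result" is a placeholder) is only a few lines:

```swift
import SwiftUI
import RealityKit

// A rough sketch: load a USDZ model exported from the CFD pipeline and
// show it in a visionOS RealityView. "cfd_result" is a placeholder name.
struct CFDResultView: View {
    var body: some View {
        RealityView { content in
            if let model = try? await Entity(named: "cfd_result") {
                content.add(model)
            }
        }
    }
}
```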
2 replies · 0 boosts · 388 views · Jan ’24
Unreal Engine 4.27 VR Game on visionOS
Hello everyone! I'm completely new to Apple. My situation is the following: the company I work for has a virtual reality educational game made in Unreal Engine 4.27. With the release of the Apple Vision Pro, we plan to make our game available on visionOS, but like I said, we're kind of new to the Apple environment, and I've already encountered some problems building Unreal Engine 4.27 with Xcode 15.2, which is a requirement for visionOS if I understand correctly. So what I wanted to ask is: if anyone already has experience porting an Unreal Engine game to visionOS, what's the best guideline for getting everything to work correctly? Maybe a tutorial or guide, etc.

Our progress so far: I'm using a MacBook Pro with Sonoma 14.2.1 and Xcode 15.2, and I followed this guide to set up Unreal Engine on the Mac: https://docs.unrealengine.com/5.3/en-US/downloading-unreal-engine-source-code/ The first problem I encountered was some odd errors when building the ShaderCompileWorker, such as "variable 'x' set but not used [-Werror,-Wunused-but-set-variable]" and "use of bitwise '&' with boolean operands [-Werror,-Wbitwise-instead-of-logical]". I know why those errors happen, but I didn't want to rework the whole Unreal Engine codebase, so I looked for a general solution; the only thing I found was people reverting to Xcode 13.4.1, which I believe is not possible if I want to target visionOS.

So now I'm wondering whether porting an Unreal Engine 4.27 game to visionOS is a reasonable thing to do, or whether it's hardly possible. I would like more insight on this topic before putting a lot of work and resources into the task, only to realize at the end that it won't work. I'd appreciate any kind of advice or help, just to get a better view of the whole issue. Like I said, we're new to Apple :) Thanks a lot in advance!
4 replies · 0 boosts · 1.3k views · Jan ’24
Place Entity in the Middle of a Table in visionOS
Hello guys, I am currently stuck on understanding how to place a 3D Entity from a USDZ file or a Reality Composer Pro project in the middle of a table in a mixed ImmersiveSpace. When I use AnchorEntity(.plane(.horizontal, classification: .table, minimumBounds: SIMD2<Float>(0.2, 0.2))), it just places the entity somewhere on the table, not in the middle, and not in the orientation of the table, so the edges are not aligned. Has anybody got a clue on how to do this? I would be very thankful for a response. Thanks!
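One possible approach (a sketch based on the visionOS ARKit plane APIs, not a verified recipe): run a PlaneDetectionProvider and use the plane anchor's extent transform, which should give both the center and the orientation of the detected table:

```swift
import ARKit
import RealityKit

// A hedged sketch: place an entity at the center of a detected table by
// combining the plane anchor's world transform with its extent transform.
let session = ARKitSession()
let planes = PlaneDetectionProvider(alignments: [.horizontal])

func placeOnTableCenter(_ entity: Entity, under root: Entity) async throws {
    try await session.run([planes])
    for await update in planes.anchorUpdates {
        let plane = update.anchor
        guard plane.classification == .table else { continue }
        // originFromAnchorTransform positions the anchor in world space;
        // anchorFromExtentTransform centers and orients the plane's extent.
        let worldFromExtent = plane.originFromAnchorTransform
            * plane.geometry.extent.anchorFromExtentTransform
        entity.setTransformMatrix(worldFromExtent, relativeTo: nil)
        root.addChild(entity)
        break
    }
}
```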
0 replies · 2 boosts · 388 views · Jan ’24
After adding gestures to the EntityModel inside ARView, there seems to be a memory leak when attempting to remove the EntityModel.
After adding gestures to the EntityModel, when it is time to remove it, the instance cannot be freed from memory unless uiView.gestureRecognizers?.removeAll() is executed. However, executing that method breaks the gestures of every other EntityModel in the ARView. Does anyone have a better way to achieve this?

Example code:

```swift
struct ContentView: View {
    @State private var isRemoveEntityModel = false

    var body: some View {
        ZStack(alignment: .bottom) {
            ARViewContainer(isRemoveEntityModel: $isRemoveEntityModel)
                .edgesIgnoringSafeArea(.all)
            Button {
                isRemoveEntityModel = true
            } label: {
                Image(systemName: "trash")
                    .font(.system(size: 35))
                    .foregroundStyle(.orange)
            }
        }
    }
}
```

ARViewContainer:

```swift
struct ARViewContainer: UIViewRepresentable {
    @Binding var isRemoveEntityModel: Bool
    let arView = ARView(frame: .zero)

    func makeUIView(context: Context) -> ARView {
        let model = CustomEntityModel()
        model.transform.translation.y = 0.05
        model.generateCollisionShapes(recursive: true)
        // After this line runs, the custom EntityModel can be deleted from
        // ARView.scene, but its deinit {} is never executed.
        arView.installGestures(.all, for: model)
        let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))
        anchor.children.append(model)
        arView.scene.anchors.append(anchor)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        if isRemoveEntityModel {
            let customEntityModel = uiView.scene.findEntity(named: "Box_EntityModel")
            // After this line runs, ARView.scene deletes the CustomEntityModel
            // correctly and its deinit {} executes. However, every other
            // CustomEntityModel in ARView.scene loses its gestures too.
            uiView.gestureRecognizers?.removeAll()
            customEntityModel?.removeFromParent()
        }
    }
}
```

CustomEntityModel:

```swift
class CustomEntityModel: Entity, HasModel, HasAnchoring, HasCollision {
    required init() {
        super.init()
        let mesh = MeshResource.generateBox(size: 0.1)
        let material = SimpleMaterial(color: .gray, isMetallic: true)
        self.model = ModelComponent(mesh: mesh, materials: [material])
        self.name = "Box_EntityModel"
    }

    deinit {
        print("CustomEntityModel_remove")
    }
}
```
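One workaround to consider (an assumption, not a documented fix): installGestures(_:for:) returns the recognizers it creates, so keeping them per entity lets you remove only that entity's recognizers instead of calling removeAll():

```swift
import UIKit
import RealityKit

// A sketch: track the gesture recognizers created for each entity, and on
// removal detach only those, leaving other entities' gestures intact.
final class GestureBookkeeper {
    private var recognizers: [ObjectIdentifier: [UIGestureRecognizer]] = [:]

    func install(on arView: ARView, for model: Entity & HasCollision) {
        let installed = arView.installGestures(.all, for: model)
        recognizers[ObjectIdentifier(model)] = installed.compactMap { $0 as? UIGestureRecognizer }
    }

    func remove(_ model: Entity, from arView: ARView) {
        for recognizer in recognizers[ObjectIdentifier(model)] ?? [] {
            arView.removeGestureRecognizer(recognizer)
        }
        recognizers[ObjectIdentifier(model)] = nil
        model.removeFromParent()
    }
}
```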
0 replies · 0 boosts · 380 views · Jan ’24
Did RealityView syntax change?
All of a sudden (as of when Xcode 15.2 left beta yesterday?) I can't build attachments into my RealityView:

```swift
var body: some View {
    RealityView { content, attachments in
        // stuff
    } attachments: {
        // stuff
    }
}
```

This produces "No exact matches in call to initializer" on the declaration line (RealityView { content, attachments in). As far as I can tell, this is identical to the sample code provided at the WWDC session, but I've been fussing with various syntaxes for an hour now and I can't figure out what the heck it wants.
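In the released visionOS SDK, the attachments builder expects Attachment(id:) values rather than the tagged-view scheme earlier betas used, which may be what changed underneath you. A sketch of the release syntax ("label" is an illustrative id):

```swift
import SwiftUI
import RealityKit

// A sketch of the release-SDK attachment syntax: declare with Attachment(id:)
// and look the entity up by the same id in the make closure.
struct AttachmentExample: View {
    var body: some View {
        RealityView { content, attachments in
            if let label = attachments.entity(for: "label") {
                content.add(label)
            }
        } attachments: {
            Attachment(id: "label") {
                Text("Hello")
            }
        }
    }
}
```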
2 replies · 0 boosts · 556 views · Jan ’24
ARView.debugOptions = .showStatistics Error: 5775
```swift
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.debugOptions = .showStatistics // Error: see below
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
```

Error output:

```
-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5775: failed assertion
`Draw Errors Validation
Vertex Function(vsSdfFont): the offset into the buffer viewConstants that is bound at buffer index 4 must be a multiple of 256 but was set to 61840.'
```
1 reply · 0 boosts · 441 views · Jan ’24
Multiple root level objects Error [USDZ/Reality Composer Pro]
Hey everyone, I'm running into an issue where my USDZ model does not show up in Reality Composer Pro. It was exported from Blender as a USD and converted in Reality Converter (see attached image). It's strange, because the USDZ model appears fine in Preview. But once it is brought into RCP, I receive this "multiple root level objects" pop-up and the model does not appear. I'm not sure how to resolve the multiple-root-level issue. If anyone can point me in the right direction or offer any feedback, it's much appreciated. Thank you!
1 reply · 0 boosts · 825 views · Dec ’23
RealityView Attachments
I have a RealityView and I want to add an Entity with an Attachment. Assume I have a viewModel managing my entities, and addEntityGesture() adds a new Entity under the rootEntity.

```swift
RealityView { content, attachments in
    // Load initial content
    content.add(viewModel.rootEntity)
} update: { updateContent, updateAttachments in
    //
} attachments: {
    //
}
.gesture(addEntityGesture())
```

I know that we can create attachments in the attachments closure and add those attachments as entities in the make closure. But what if I want to add an entity with an attachment on the fly?
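One approach that may work (a sketch, with illustrative identifiers): declare the attachment up front, drive the addition from state, and do the attach in the update closure, which re-runs when that state changes:

```swift
import SwiftUI
import RealityKit

// A sketch: attachments must be declared in the attachments builder, but
// they can be parented to entities later, from the update closure.
struct DynamicAttachmentView: View {
    @State private var labelTarget: Entity?
    private let rootEntity = Entity()

    var body: some View {
        RealityView { content, _ in
            content.add(rootEntity)
        } update: { _, attachments in
            // Runs when @State changes: attach the pre-declared attachment
            // to an entity created after the make closure ran.
            if let target = labelTarget, let label = attachments.entity(for: "label") {
                target.addChild(label)
            }
        } attachments: {
            Attachment(id: "label") { Text("New entity") }
        }
        .onTapGesture {
            let entity = Entity()
            rootEntity.addChild(entity)
            labelTarget = entity // triggers the update closure
        }
    }
}
```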
0 replies · 0 boosts · 556 views · Dec ’23
RealityView fit in volumetric window
Hey guys, how can I fit RealityView content inside a volumetric window? I have the simple example below:

```swift
WindowGroup(id: "preview") {
    RealityView { content in
        if let entity = try? await Entity(named: "name") {
            content.add(entity)
            entity.setPosition(.zero, relativeTo: entity.parent)
        }
    }
}
.defaultSize(width: 0.6, height: 0.6, depth: 0.6, in: .meters)
.windowStyle(.volumetric)
```

I understand that we can resize a Model3D view automatically using .resizable() and .scaledToFit() after the model has loaded. Can we achieve the same result using a RealityView? Cheers
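As far as I know, RealityView has no .scaledToFit() equivalent, but manual fitting is possible (a sketch; the 0.6 m target matches the defaultSize above): measure the entity's visual bounds and scale it to the volume:

```swift
import SwiftUI
import RealityKit

// A sketch of manual fit-to-volume: scale so the largest extent matches the
// volumetric window's edge length (assumed 0.6 m, per .defaultSize above).
struct FittedPreview: View {
    var body: some View {
        RealityView { content in
            if let entity = try? await Entity(named: "name") {
                let bounds = entity.visualBounds(relativeTo: nil)
                let maxExtent = max(bounds.extents.x, bounds.extents.y, bounds.extents.z)
                if maxExtent > 0 {
                    entity.scale *= SIMD3<Float>(repeating: 0.6 / maxExtent)
                }
                entity.setPosition(.zero, relativeTo: nil)
                content.add(entity)
            }
        }
    }
}
```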
1 reply · 1 boost · 626 views · Dec ’23
Experience.rcproject
Hello, I am very new here in the forum (and in iOS dev as well). I am trying to build an app that uses 3D face filters, and I want to use Reality Composer. I knew Xcode 15 did not include it, so I downloaded the beta 8 version (as suggested in another post). That one actually has Reality Composer Pro (Xcode -> Developer Tools -> Reality Composer Pro), but the Experience.rcproject still does not appear. Is there a way to create one? Reality Composer Pro seems only able to create standalone projects and does not seem to be bundled into Xcode in any way. Thanks for your time, people!
2 replies · 0 boosts · 563 views · Dec ’23
Fixed background in visionOS
I successfully set a picture as the background of a full-immersion ImmersiveSpace with the following code:

```swift
import SwiftUI
import RealityKit

struct MainBackground: View {
    var body: some View {
        RealityView { content in
            guard let resource = try? await TextureResource(named: "Image_Name") else {
                fatalError("Error.")
            }
            var material = UnlitMaterial()
            material.color = .init(texture: .init(resource))
            let entity = Entity()
            entity.components.set(ModelComponent(
                mesh: .generateSphere(radius: 1000),
                materials: [material]
            ))
            // Flip the sphere inside-out so the texture faces the user.
            entity.scale *= .init(x: -1, y: 1, z: 1)
            content.add(entity)
        }
    }
}
```

However, when running it I found that when the user moves, the background is not fixed but follows the user's movement, which feels unrealistic. How can I fix the background at the place where it first appears, and give the user a sense of movement like really walking in the real world, instead of letting the background follow them?
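A giant sphere shows almost no parallax when you walk, so it can read as "following" the user even when it is world-fixed. One thing to try (a hedged sketch that replaces the end of the make closure above): pin the sphere to an explicit world anchor and use a smaller radius so walking produces visible parallax:

```swift
// A hedged sketch: a smaller sphere (10 m instead of 1000 m) makes walking
// produce parallax, and an explicit world anchor keeps it fixed where the
// space first appeared.
entity.components.set(ModelComponent(
    mesh: .generateSphere(radius: 10),
    materials: [material]
))
let worldAnchor = AnchorEntity(world: .zero) // fixed at the space's origin
worldAnchor.addChild(entity)
content.add(worldAnchor)
```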
1 reply · 0 boosts · 454 views · Dec ’23
USD particles not supported in RealityKit on iOS (not visionOS)
RealityKit on iOS doesn't appear to support USD particles. After exporting particles from Blender 4.0.1 in standard .usdz format, the particle system renders correctly in Finder and Reality Converter, but when loaded into and anchored in RealityKit, nothing happens. This appears to be a bug in RealityKit. I tried one or more particle instances, and nothing renders.
0 replies · 0 boosts · 516 views · Nov ’23
When "zooming" in on the camera, the "joint information" comes out wrong.
Hello. I am a student who has just started studying Swift. I have succeeded in obtaining body-joint information through 'skeleton.jointLandmarks' on a normal screen, but every time I zoom in, the joint information no longer lands on the human body and drifts sideways and downward. My guess is that the center of the AR image may no longer coincide with the center of the phone screen. I've been searching for information for three days because of this problem, but I couldn't find a similar case and haven't been able to solve it. If anyone has solved a similar problem, I would appreciate it if you could let me know. The link below shows how I zoomed in on the ARView screen: https://stackoverflow.com/questions/64896064/can-i-use-zoom-with-arview Thank you.
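For what it's worth, jointLandmarks are normalized to the captured camera image, not to the (possibly zoomed) view, so the mapping has to go through ARFrame's displayTransform; any additional zoom applied to the view's layer must then be applied to the points as well (an assumption based on how the 2D body APIs work):

```swift
import ARKit
import UIKit

// A sketch: convert a normalized joint landmark (capture-image space) into
// view coordinates. If the ARView layer is scaled for zoom, the same scale
// must also be applied to the resulting point.
func screenPoint(for landmark: simd_float2,
                 frame: ARFrame,
                 viewportSize: CGSize,
                 orientation: UIInterfaceOrientation) -> CGPoint {
    let normalized = CGPoint(x: CGFloat(landmark.x), y: CGFloat(landmark.y))
    let toView = frame.displayTransform(for: orientation, viewportSize: viewportSize)
    let p = normalized.applying(toView)
    return CGPoint(x: p.x * viewportSize.width, y: p.y * viewportSize.height)
}
```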
0 replies · 0 boosts · 371 views · Nov ’23
visionOS's "offset3D"
In visionOS, a lot of SwiftUI code with 3D attributes is adapted from iOS code, e.g. .padding(_:) becomes .padding3D(_:). iOS has .offset(x:y:), which only takes X and Y, but in visionOS a view sits in a 3D scene, so I want an offset that takes Z as well. Since offset can't take Z, I tried offset3D:

```swift
import SwiftUI

// Some View
    .offset3D(x: Number, y: Number, z: Number)
```

Xcode reports: Error: No member 'offset3D'. Does anyone know a modifier like offset that can take Z in visionOS?
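There is no .offset3D, but visionOS SwiftUI does add an offset(z:) modifier that can be chained with the regular 2D offset:

```swift
import SwiftUI

// Chaining the familiar 2D offset with the visionOS-only depth offset.
struct OffsetExample: View {
    var body: some View {
        Text("Hello")
            .offset(x: 10, y: 20) // 2D offset, as on iOS
            .offset(z: 30)        // visionOS-only depth offset, in points
    }
}
```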
1 reply · 0 boosts · 382 views · Nov ’23
visionOS PortalComponent issue
Hi! I'm having an issue creating a PortalComponent on visionOS. I'm trying to anchor a Portal to a wall or floor anchor, and the portal always appears rotated 90° from the anchor: if I use a vertical anchor (wall), the portal appears horizontal in the scene; if I use a horizontal anchor (floor), the portal appears vertical in the scene. I tested on Xcode 15.1.0 beta 3, 15.1.0 beta 2, and 15.0 beta 8. Any ideas? Thank you so much!
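One detail worth checking (a guess, not a confirmed cause): MeshResource.generatePlane(width:height:) builds a plane in XY, while generatePlane(width:depth:) builds it in XZ, so a portal plane created with the variant that doesn't match the anchor's alignment shows up rotated 90°. Alternatively, the entity carrying the portal (portalEntity below is an illustrative name) can be counter-rotated:

```swift
import RealityKit

// A hedged sketch: rotate the portal plane 90° about X to flip between
// wall-style (XY) and floor-style (XZ) orientation.
portalEntity.orientation = simd_quatf(angle: -.pi / 2, axis: [1, 0, 0])
```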
0 replies · 0 boosts · 353 views · Nov ’23
Object Capture: Exporting model as OBJ.
https://developer.apple.com/documentation/realitykit/photogrammetrysession/request/modelfile(url:detail:geometry:) According to the documentation, if the given path is a file path with a .usdz extension, the model will be saved as .usdz; otherwise, if we provide a folder, it will be saved as OBJ. I tried it, but it doesn't work: right before saving it shows the folder that will be written to, but after I click Done and check the folder, it's always empty.
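A sketch of the folder-output path, in case it helps narrow things down (it assumes an already-configured PhotogrammetrySession named session; the directory path is a placeholder): the URL must reference a directory, and the session's output stream should be drained so completion and errors are actually observed:

```swift
import Foundation
import RealityKit

// A sketch: request geometry output into a directory and watch the output
// stream. `session` is assumed to be a configured PhotogrammetrySession.
let outputDir = URL(fileURLWithPath: "/tmp/capture-obj/", isDirectory: true)
try FileManager.default.createDirectory(at: outputDir, withIntermediateDirectories: true)
try session.process(requests: [.modelFile(url: outputDir, detail: .medium)])
for try await output in session.outputs {
    switch output {
    case .requestComplete:
        print("Export finished; check \(outputDir.path)")
    case .requestError(_, let error):
        print("Export failed: \(error)")
    default:
        break
    }
}
```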
1 reply · 1 boost · 651 views · Nov ’23
FPS drop while updating ModelEntity position
Hi there. I have a performance issue when updating a ModelEntity's position. There are two models with the same parent:

```swift
arView.scene.addAnchor(anchorEntity)
anchorEntity.addChild(firstModel)
anchorEntity.addChild(secondModel)
```

firstModel is a very large model. I take the position of the second model and apply it to the first:

```swift
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // ...
    // Here the FPS drops
    firstModel.position = secondModel.position
    // ...
}
```

In other 3D engines, changing the transform matrix does not affect performance: you can change it a hundred times in a single frame, and only the last value is rendered on the next frame. Changing the position by itself should therefore not cause an FPS drop; the renderer always renders whatever is stored in the transform, so writing a new value just means the next frame is rendered with it, and nothing heavy should happen. But in my case, the FPS drops only when the model's position is changed; otherwise it stays at 60. So writing the transform itself causes the drop. Can anyone explain why RealityKit's renderer works this way?
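One experiment that might isolate the cost (a guess, not an explanation of the renderer): move a lightweight pivot parent instead of the heavy model, so the per-frame write touches an entity with no mesh or collision components. The pivot name is illustrative:

```swift
import ARKit
import RealityKit

// A sketch: reparent the heavy model under an empty pivot once, then move
// only the pivot each frame.
let pivot = Entity()
anchorEntity.addChild(pivot)
pivot.addChild(firstModel)

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    pivot.position = secondModel.position
}
```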
0 replies · 0 boosts · 326 views · Nov ’23