
How to present an alert in visionOS immersive space?
My visionOS app uses an immersive view. If the app encounters an error, I want to present an alert. I tried to present such an alert in a demo app, but it is not shown, although nearly the same code presents an alert window on iOS. Here is my demo code, based on Apple's Immersive Environment App template:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct ErrorInfo: LocalizedError, Equatable {
    var errorDescription: String?
    var failureReason: String?
}

struct ImmersiveView: View {
    @State private var presentAlert = false
    let error = ErrorInfo(errorDescription: "My error", failureReason: "No reason")

    var body: some View {
        RealityView { content, attachments in
            let mesh = MeshResource.generateBox(width: 1.0, height: 0.05, depth: 1.0)
            var material = UnlitMaterial()
            material.color.tint = .red
            let boardEntity = ModelEntity(mesh: mesh, materials: [material])
            boardEntity.transform.translation = [0, 0, -3]
            content.add(boardEntity)
        } update: { content, attachments in
            // …
        } attachments: {
            // …
        }
        .onAppear {
            presentAlert = true
        }
        .alert(
            isPresented: $presentAlert,
            error: error,
            actions: { error in
            },
            message: { error in
                Text(error.failureReason!)
            })
    }
}
```

Since I cannot see any alert: is something wrong with my code? How should an alert be presented in immersive space?
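A commonly reported workaround is that alerts attached to views inside an immersive space do not appear, while alerts presented from a regular window do. A minimal sketch of that approach, assuming the app also has a WindowGroup scene: the immersive view only records the error in a shared holder, and a window view observes it and presents the `.alert`. The `ErrorState` type and its `report(_:)`/`dismiss()` methods below are illustrative names, not an Apple API; `ErrorInfo` is repeated from the post so the block is self-contained.

```swift
import Foundation

struct ErrorInfo: LocalizedError, Equatable {
    var errorDescription: String?
    var failureReason: String?
}

// Illustrative shared holder: the immersive view records the error,
// and a view inside a WindowGroup presents the .alert from it.
final class ErrorState {
    private(set) var currentError: ErrorInfo?

    // Drives the alert's isPresented binding in the window view.
    var isPresentingAlert: Bool { currentError != nil }

    func report(_ error: ErrorInfo) { currentError = error }
    func dismiss() { currentError = nil }
}
```

In the window view, bind `.alert(isPresented:error:actions:message:)` to this state (e.g. by making the holder `@Observable` or an `ObservableObject`); the immersive view then only calls `report(_:)` instead of presenting the alert itself.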
1 reply · 0 boosts · 274 views · Nov ’24
How to convert a RealityKit SpatialTapGesture value to an Entity coordinate on iOS?
I have an app with a visionOS target, and I want to add an iOS target. Both are based on RealityKit. I want to use a SpatialTapGesture to get the tap coordinate local to the tapped entity. In visionOS this is easy:

```swift
SpatialTapGesture(coordinateSpace: .local)
    .targetedToAnyEntity()
    .onEnded { tap in
        let entity = tap.entity
        let localPoint3D = tap.convert(tap.location3D, from: .local, to: entity)
        // …
    }
```

However, according to the docs, this convert function seems to exist only in visionOS, not in iOS. So how can I do this conversion in iOS?

PS: This was already posted on StackOverflow without success. There I tried to find a workaround, but failed.
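For orientation, the conversion itself is just applying the inverse of the entity's world transform to a world-space point. Below is a minimal sketch of that math, assuming the entity has only translation and uniform scale (rotation omitted) and that a world-space tap point is already available; the `Vec3` type and `worldToLocal` function are illustrative, not RealityKit API.

```swift
// Hypothetical helper: convert a world-space point into the local space of
// an entity with translation t and uniform scale s. Rotation is omitted;
// RealityKit's own Entity.convert(position:from:) handles the full transform.
struct Vec3: Equatable {
    var x: Float, y: Float, z: Float
}

func worldToLocal(_ p: Vec3, entityTranslation t: Vec3, entityScale s: Float) -> Vec3 {
    // Undo the translation first, then undo the scale.
    Vec3(x: (p.x - t.x) / s,
         y: (p.y - t.y) / s,
         z: (p.z - t.z) / s)
}
```

On iOS, one candidate route to a world-space point is `ARView.hitTest(_:query:mask:)`, whose hits carry a world-space position and the hit entity, followed by `entity.convert(position:from: nil)` to go into local space; treat the exact wiring as an assumption to verify against the RealityKit documentation.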
8 replies · 0 boosts · 351 views · Oct ’24
visionOS RealityKit's physics simulation stops for certain entity.scale values
The setup:
My Vision Pro app loads usdz models created by a 3rd-party app. These models have to be scaled to the right dimensions for RealityKit. I want to use physics for realistic movement of these entities, but this does not work as expected. I therefore wrote a demo test app based on Apple's immersive space app template (code below).
This app has two entities: a board, and above it a box that is a child of the board. Both have a collision component and a physics body. The board, and thus also its child the box, is scaled.

The problem:
If the scale factor is greater than or equal to 0.91, the box falls under gravity towards the board, where it is stopped after some movement. This looks realistic. However, if the scale factor is below 0.91, even 0.9, the movement of the box is immediately stopped on the board without any physics simulation. Unfortunately, I am not able to upload screen recordings, but if the demo app is executed, one sees the effect immediately.

The question:
I cannot imagine that this behavior is correct. Can somebody confirm that this is a bug? If so, I will write a bug report. My demo app uses Xcode Version 16.0 (16A242d), Simulator Version 16.0 (1038), and visionOS 2.0.

The code:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    @Environment(AppModel.self) var appModel

    var body: some View {
        RealityView { content in
            let boardEntity = makeBoard()
            content.add(boardEntity)

            let boxEntity = makeBox()
            boardEntity.addChild(boxEntity)
            moveBox(boxEntity, parentEntity: boardEntity)
        }
    }

    func makeBoard() -> ModelEntity {
        let mesh = MeshResource.generateBox(width: 1.0, height: 0.05, depth: 1.0)
        var material = UnlitMaterial()
        material.color.tint = .red
        let boardEntity = ModelEntity(mesh: mesh, materials: [material])
        let scale: Float = 0.91 // Physics does not run if scale < 0.91
        boardEntity.scale = [scale, scale, scale]
        boardEntity.transform.translation = [0, 0, -3]
        boardEntity.generateCollisionShapes(recursive: false)
        boardEntity.physicsBody = PhysicsBodyComponent(
            massProperties: .default,
            material: PhysicsMaterialResource.generate(friction: .infinity, restitution: 0.8),
            mode: .static)
        return boardEntity
    }

    func makeBox() -> ModelEntity {
        let mesh = MeshResource.generateBox(size: 0.2)
        var material = UnlitMaterial()
        material.color.tint = .green
        let boxEntity = ModelEntity(mesh: mesh, materials: [material])
        boxEntity.generateCollisionShapes(recursive: false)
        boxEntity.physicsBody = PhysicsBodyComponent(
            massProperties: .default,
            material: PhysicsMaterialResource.generate(friction: .infinity, restitution: 0.8),
            mode: .dynamic)
        return boxEntity
    }

    func moveBox(_ boxEntity: Entity, parentEntity: Entity) {
        // Set the position and orientation of the box
        let translation = SIMD3<Float>(0, 0.5, 0)
        // Turn the box by 45 degrees around the y axis
        let rotationY = simd_quatf(angle: Float(45.0 * .pi / 180.0), axis: SIMD3(x: 0, y: 1, z: 0))
        let transform = Transform(rotation: rotationY, translation: translation)
        boxEntity.transform = transform
    }
}
```
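Until the scaling behavior is clarified, one possible workaround is to bake the scale factor into the generated mesh dimensions instead of scaling the parent entity, so that collision shapes and the physics simulation run at scale 1. A sketch of that idea; the `BoxSize` type and `scaled(_:by:)` function are illustrative helpers, not RealityKit API.

```swift
// Illustrative workaround: pre-scale the box dimensions before calling
// MeshResource.generateBox, and leave entity.scale at [1, 1, 1].
struct BoxSize: Equatable {
    var width: Float, height: Float, depth: Float
}

func scaled(_ size: BoxSize, by factor: Float) -> BoxSize {
    BoxSize(width: size.width * factor,
            height: size.height * factor,
            depth: size.depth * factor)
}
```

With this, makeBoard() would pass the pre-scaled values to MeshResource.generateBox(width:height:depth:) and skip the boardEntity.scale assignment; the translation [0, 0, -3] stays unchanged. Whether this avoids the threshold effect would still need to be verified in the demo app.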
2 replies · 0 boosts · 252 views · Sep ’24
How to convert a Point3D value obtained by a SpatialTapGesture to an Entity coordinate?
I have a visionOS app that displays a ModelEntity in a RealityView. This entity can be tapped via a SpatialTapGesture, and this gesture calls

```swift
.onEnded { event in
    let point3D = event.location3D
    // …
}
```

I am unable to convert point3D to the local coordinate of the entity. I wrote a little test project to investigate the situation (below). The RealityView shows a box; I can tap the visible faces, and I get point3D values that don't make much sense to me. These values are presented here. So the main question is: How can I get the coordinate of the point tapped on the shown entity? I tried all three options for the SpatialTapGesture coordinateSpace (.local, .global, and .immersive) without success. Here is my code:

```swift
struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            let mesh = MeshResource.generateBox(width: 1, height: 0.5, depth: 0.25, splitFaces: true)
            var frontMaterial = UnlitMaterial()
            frontMaterial.color.tint = .green
            var topMaterial = UnlitMaterial()
            topMaterial.color.tint = .red
            let boxEntity = ModelEntity(mesh: mesh, materials: [frontMaterial, topMaterial])
            boxEntity.components.set(InputTargetComponent(allowedInputTypes: .all))
            boxEntity.components[CollisionComponent.self] = CollisionComponent(
                shapes: [ShapeResource.generateConvex(from: mesh)])
            boxEntity.transform.translation = [0, 0, -3]
            content.add(boxEntity)
        }
        .gesture(tapGesture)
    }

    var tapGesture: some Gesture {
        SpatialTapGesture(coordinateSpace: .local)
            .targetedToAnyEntity()
            .onEnded { event in
                let point3D = event.location3D
                print(point3D)
            }
    }
}
```
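On visionOS, the conversion goes through the targeted gesture value: with .targetedToAnyEntity(), the value's convert(_:from:to:) maps location3D into the tapped entity's space, as shown in the Oct ’24 post above. Once the point is in local space, a pure helper can classify which box face was hit. The `BoxFace` enum, the face naming, and `face(ofLocalPoint:width:height:depth:)` below are illustrative and assume a box centered at its local origin.

```swift
// Illustrative classifier: given a tap point already in the box's local
// space, pick the face by the dominant normalized axis.
enum BoxFace { case front, back, top, bottom, left, right }

func face(ofLocalPoint p: (x: Float, y: Float, z: Float),
          width: Float, height: Float, depth: Float) -> BoxFace {
    // Normalize by the half-extents so all axes compare on equal footing.
    let nx = p.x / (width / 2)
    let ny = p.y / (height / 2)
    let nz = p.z / (depth / 2)
    if abs(nx) >= abs(ny) && abs(nx) >= abs(nz) { return nx > 0 ? .right : .left }
    if abs(ny) >= abs(nz) { return ny > 0 ? .top : .bottom }
    return nz > 0 ? .front : .back
}
```

For the box in the post (width 1, height 0.5, depth 0.25), a converted tap on the green face should land near local z = 0.125 and classify as .front under this naming.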
3 replies · 0 boosts · 417 views · Aug ’24