Posts

Post marked as solved
2 Replies
I made the code as simple as possible: a single 5 cm target square placed at location (0, 0, -1.5). I then tapped on the target while standing in several locations, and the reported tap location was off along the Z axis by about 0.5 m. Here were several tap locations (the Z value is the last component):

tap location: SIMD3(0.0067811073, 0.019996116, -1.1157947), name: target
tap location: SIMD3(-0.00097223074, 0.019996116, -1.1036792), name: target
tap location: SIMD3(0.0008024718, 0.019995179, -1.1074299), name: target
tap location: SIMD3(-0.009804221, 0.019996116, -1.0694565), name: target
tap location: SIMD3(-0.0037206858, 0.019995492, -1.0778457), name: target
tap location: SIMD3(-0.009298846, 0.019996116, -1.0772702), name: target

Here is the code to set up the RealityView:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    @StateObject var model = MyModel()

    /// Spatial tap gesture that tells the model the tap location.
    var myTapGesture: some Gesture {
        SpatialTapGesture()
            .targetedToAnyEntity()
            .onEnded { event in
                let location3D = event.convert(event.location3D, from: .global, to: .scene)
                let entity = event.entity
                model.handleTap(location: location3D, entity: entity)
            }
    }

    var body: some View {
        RealityView { content in
            model.setupContentEntity(content: content)
        }
        .gesture(myTapGesture)
    }
}
```

Here is the model code:

```swift
import Foundation
import SwiftUI
import RealityKit
import RealityKitContent
import ARKit
import os.log

@MainActor
class MyModel: ObservableObject {
    private var realityViewContent: RealityViewContent?

    /// Capture RealityViewContent and create target
    ///
    /// - Parameter content: container for all RealityView content
    func setupContentEntity(content: RealityViewContent) {
        self.realityViewContent = content
        placeTargetObject()
    }

    /// Place a small red target at position 0, 0, -1.5
    ///
    /// I will look at this position and tap my fingers. The tap location
    /// should be near the same position (0, 0, -1.5)
    func placeTargetObject() {
        guard let realityViewContent else { return }

        let width: Float = 0.05
        let height: Float = 0.02
        let x: Float = 0
        let y: Float = 0
        let z: Float = -1.5

        // Create red target square
        let material = SimpleMaterial(color: .red, isMetallic: false)
        let mesh = MeshResource.generateBox(width: width, height: height, depth: width)
        let target = ModelEntity(mesh: mesh, materials: [material])

        // Add collision and target component to make it tappable
        let shapeBox = ShapeResource.generateBox(width: width, height: height, depth: width)
        let collision = CollisionComponent(shapes: [shapeBox], isStatic: true)
        target.collision = collision
        target.components.set(InputTargetComponent())

        // Set name, position, and add it to scene
        target.name = "target"
        target.setPosition(SIMD3<Float>(x, y + height / 2, z), relativeTo: nil)
        realityViewContent.add(target)
    }

    /// Respond to the user tapping on an object by printing name of entity and tap location
    ///
    /// - Parameters:
    ///   - location: location of tap gesture
    ///   - entity: entity that was tapped
    func handleTap(location: SIMD3<Float>, entity: Entity) {
        os_log("tap location: \(location), name: \(entity.name, privacy: .public)")
    }
}
```

Example of the small red target: (screenshot attached)
Post marked as solved
2 Replies
I made some progress. When creating the SceneReconstructionProvider, specify the classification mode:

```swift
let sceneReconstruction = SceneReconstructionProvider(modes: [.classification])
```

Then the MeshAnchor.Geometry's classifications property is set. Here are some example values:

count: 11342
description: GeometrySource(count: 11342, format: MTLVertexFormat(rawValue: 45))
format: 45
componentsPerVector: 1
offset: 0
stride: 1

So I am guessing the buffer contains a sequence of values that map to the MeshAnchor.MeshClassification raw values. (Now I just need to figure out which MTLVertexFormat case has a raw value of 45. :-)

Edit: uchar is case 45, so the buffer contains a sequence of unsigned bytes.
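Building on that, here is a rough sketch of how those bytes could be read back out, assuming the layout above (.uchar, one byte per face) and assuming each byte maps to a MeshClassification raw value; classificationValues(for:) is just a name I made up:

```swift
import ARKit

// Rough sketch: read the per-face classification bytes from a MeshAnchor.
// Assumes the layout above: format .uchar, componentsPerVector 1, stride 1.
func classificationValues(for geometry: MeshAnchor.Geometry) -> [MeshAnchor.MeshClassification] {
    guard let source = geometry.classifications else { return [] }
    let base = source.buffer.contents().advanced(by: source.offset)
    return (0..<source.count).map { index in
        let rawValue = base.load(fromByteOffset: index * source.stride, as: UInt8.self)
        // Map each unsigned byte back to a MeshClassification case.
        return MeshAnchor.MeshClassification(rawValue: Int(rawValue)) ?? .none
    }
}
```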
Post marked as solved
2 Replies
Oops. It helps if I read the details. I'm rolling back to Beta 8.
Post marked as solved
2 Replies
Note: Xcode > Settings > Platforms shows the visionOS simulator is there. I'm going to delete all my Xcodes and try a fresh install.
Post marked as solved
1 Reply
I have found a workaround, but I don't know whether it is a good design or not. In "Deployment Info" in Xcode, I set iPhone/iPad Orientation to Portrait only. Now when I rotate the device onto its side, the cube doesn't change size. (Note: I am getting the device's eulerAngles, converting them to a quaternion, and applying that to the PerspectiveCamera.)
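For reference, here is a minimal sketch of that camera update, assuming a CMMotionManager that is already delivering device-motion updates. Note that CoreMotion exposes the attitude directly as a quaternion, which avoids converting Euler angles by hand (though you may still need to remap between CoreMotion's and RealityKit's reference frames):

```swift
import RealityKit
import CoreMotion

// Rough sketch: drive a PerspectiveCamera from the device attitude.
// Assumes motionManager.startDeviceMotionUpdates() has been called.
func updateCamera(_ camera: PerspectiveCamera, from motionManager: CMMotionManager) {
    guard let attitude = motionManager.deviceMotion?.attitude else { return }
    // CoreMotion provides the attitude as a quaternion.
    let q = attitude.quaternion
    camera.orientation = simd_quatf(ix: Float(q.x), iy: Float(q.y),
                                    iz: Float(q.z), r: Float(q.w))
}
```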
Post not yet marked as solved
4 Replies
SharePlay with visionOS appears to hide the location data of other people from the app. For example, data about the other Personas (beyond their immersion status) is not exposed via APIs. I am guessing this is for privacy reasons(?). I am not sure how Apple handles (or will handle) people in the same physical room. So far, I haven't seen any examples (or WWDC videos) covering this. I look forward to some clarification and examples. One possible workaround for people in the same physical room is to anchor the virtual content to an image. Print that image on a piece of paper and place it on the floor or a table. The two participants should see the same virtual content in the same location and orientation because it is tied to something physical (the printed paper).
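Here is a minimal sketch of that workaround, using RealityKit's image anchor as it exists on iOS (visionOS may need ARKit's image-tracking APIs instead). The resource group and image names are made up:

```swift
import RealityKit

// Rough sketch: pin shared content to a printed image so everyone in the
// room sees it in the same physical spot. Assumes an AR Resource Group
// named "AR Resources" containing the printed image "FloorMarker".
func anchorSharedContent(_ content: Entity, in arView: ARView) {
    let anchor = AnchorEntity(.image(group: "AR Resources", name: "FloorMarker"))
    anchor.addChild(content)
    arView.scene.addAnchor(anchor)
}
```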
Post not yet marked as solved
2 Replies
Just bumping this up, in part because I ran into this problem again, this time while trying to track down the underlying cause of a <> (unknown type) issue in a SwiftUI coordinator. I started a new project today from the "Augmented Reality App" template, and now I get this ARKit error with iOS 16.4 but not iOS 16.2 (or any other minimum deployment). Sometimes I feel like Apple is gaslighting me. :-)

This issue also prevents Xcode from knowing a variable's type, showing <> instead (see screenshot below), which causes additional problems. (Again, all of the problems go away when I target a different minimum deployment.)

"Minimum Deployments" values where I get the error and where I do not:

iOS 16.4 - error
iOS 16.3 - NO error
iOS 16.2 - NO error

I can replicate the error by creating a new iOS project with the "Augmented Reality App" template and then simply adding "import ARKit" to the group of imports. In this example, with the minimum deployment set to iOS 16.4, Xcode doesn't know what type foo is; it shows <>. But when I change the minimum deployment target to iOS 16.3, Xcode knows foo is of type PerspectiveCamera.
Post not yet marked as solved
1 Reply
I have yet to do this (I am working towards it), but there are a number of videos on YouTube showing how to rig Blender models to work with Apple's ARKit (on YouTube's site, search for "iPhone blendshapes" or "ARKit blendshapes"). I've also seen people willing to rig models for a nominal fee to support Apple's blendshapes. I think Apple's documentation on ARFaceAnchor is a good starting point. I've also played a bit with Apple's Tracking and Visualizing Faces sample app. The code is a little out of date, but it is still a good place to start.
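To give a feel for the API, here is a minimal sketch of reading one blend-shape coefficient in an ARSessionDelegate (jawOpen is one of the roughly 52 keys your model's shape keys need to match):

```swift
import ARKit

// Rough sketch: read blend-shape coefficients from a face anchor.
// Part of an ARSessionDelegate; requires an ARFaceTrackingConfiguration.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for case let faceAnchor as ARFaceAnchor in anchors {
        if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue {
            print("jawOpen:", jawOpen) // 0.0 (closed) through 1.0 (fully open)
        }
    }
}
```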
Post not yet marked as solved
2 Replies
My personal recommendation (as a regular developer, not someone from Apple) would be to start with Apple's WWDC sessions on Spatial Computing.

The iPhone/iPad AR experiences will probably need a different user experience from Apple Vision Pro AR experiences. For example, on iPhone and iPad, because the user's hands are holding the device, they can't do much in the way of interacting with content. Also, Apple recommends that AR experiences on iPhone and iPad last only 1-2 minutes at a time, for various reasons; see Apple's 2022 WWDC session Qualities of great AR experiences.

After that, I recommend starting from Apple's oldest sessions (2019, at the bottom of that web page) and working forward in time.

Finally, while Apple Vision Pro is the coolest platform for AR (I desperately want a dev kit), don't ignore the approximately 1 billion people with iPhones and iPads who could run your AR/VR applications if you target those platforms.
Post not yet marked as solved
1 Reply
Can Apple's Reality Converter convert the .stl file to .usdz? (I don't think the old version could.) On occasion, I've imported models into Blender, exported them as glTF files, and then used Reality Converter to convert them to USDZ. One issue I've run into with this process is the unit size: in Reality Converter, I will change the unit from meters to centimeters and back to meters, and then export. For some reason, this solves the problem of a model appearing at 1/100 of the size you expect.
Post marked as solved
2 Replies
I believe this is the expected behavior, and it is consistent with RealityKit on the iPhone when attaching an anchor to a vertical surface. Attached is an old RealityKit test I did, where I anchored the object to the wall. In this case, the green cube is along the +X axis, the red cube is along the +Y axis, and the blue cube is along the +Z axis. If I recall correctly, when I attached things to the ceiling, the +Y axis pointed down. In general, I believe the +Y axis is the normal of the surface.
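Here is a rough sketch of that axis test (iOS RealityKit), in case it helps to reproduce it:

```swift
import RealityKit
import UIKit

// Rough sketch of the axis test: three small cubes offset along the +X,
// +Y, and +Z axes of an anchor attached to a vertical surface (a wall).
func addAxisMarkers(to arView: ARView) {
    let anchor = AnchorEntity(.plane(.vertical, classification: .wall,
                                     minimumBounds: [0.2, 0.2]))
    let markers: [(offset: SIMD3<Float>, color: UIColor)] = [
        ([0.1, 0, 0], .green), // +X
        ([0, 0.1, 0], .red),   // +Y (the surface normal, in my test)
        ([0, 0, 0.1], .blue),  // +Z
    ]
    for marker in markers {
        let cube = ModelEntity(mesh: .generateBox(size: 0.02),
                               materials: [SimpleMaterial(color: marker.color, isMetallic: false)])
        cube.position = marker.offset
        anchor.addChild(cube)
    }
    arView.scene.addAnchor(anchor)
}
```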
Post not yet marked as solved
3 Replies
In a previous project I needed to create separate USDZ files for each animation, one animation per USDZ file. I loaded the first USDZ model and kept both the model and its animation. For each additional animation, I loaded the USDZ file, kept the animation, and discarded the model part. I could then apply all of the animations to the full model I loaded first.
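As a rough sketch of that approach (the file names are made up):

```swift
import RealityKit

// Rough sketch: keep the model from the first USDZ, then harvest just the
// animations from the remaining single-animation USDZ files.
func loadCharacterWithAnimations() throws -> (model: Entity, animations: [AnimationResource]) {
    // The first file supplies the model (and its animation).
    let model = try Entity.load(named: "character_idle")
    var animations = model.availableAnimations

    // The other files supply one animation each; their model parts are dropped.
    for fileName in ["character_walk", "character_run"] {
        let donor = try Entity.load(named: fileName)
        animations.append(contentsOf: donor.availableAnimations)
    }
    return (model, animations)
}
```

The harvested animations can then be played on the first model with playAnimation(_:), as long as all of the files share the same skeleton.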
Post not yet marked as solved
2 Replies
I figured out one way to do it from Apple's source code. I posted the solution here: https://developer.apple.com/forums/thread/724014
Post marked as solved
1 Reply
Figured it out! I needed to plug in my phone and build for the actual device (as opposed to the simulator). Building for Any iOS Device (arm64) works too.
Post marked as solved
7 Replies
Success! In another thread someone mentioned installing older versions of the Autodesk software, and that did the trick for me too. (Note: this is still with Monterey 12.2, not the 12.3 beta where I initially had problems.) Here is the previous thread where I found the suggestion to try older versions of the Autodesk software: https://developer.apple.com/forums/thread/661927