Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game using ARKit.

VisionOS 2 - Passthrough in screen capture
Hello, I am trying to develop an app that broadcasts what the user sees via Apple Vision Pro. I have applied for and obtained the Enterprise API entitlements, and I can in fact stream via the "Main camera access" API, as shown in https://developer.apple.com/videos/play/wwdc2024/10139/. My problem is that I have not found any reference explaining how to integrate the "Passthrough in screen capture" API into my project. Have any of you been able to do this? Thank you
Replies: 1 · Boosts: 0 · Views: 47 · Last activity: 6h
RealityKit's ARView raycast returns nothing
Hello, I have rendered a USDZ file using SceneKit's .write() method on the displayed scene. Once I load it in another RealityKit ARView using the camera's .nonAR mode, I try to use the view's raycast(from:allowing:alignment:) method to get the coordinates on the model. I applied the collision components when loading the model, using the .generateCollisionShapes() function, to be able to interact with the model entity. However, the raycast result returns nothing. Is there something I am missing for it to work? Thanks!
Replies: 1 · Boosts: 0 · Views: 39 · Last activity: 10h
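One likely culprit, sketched below under assumptions: raycast(from:allowing:alignment:) queries ARKit's real-world scene understanding, which requires a running AR session, so in .nonAR mode it can return nothing by design. A collision-based hit test against the entities' collision shapes may be the intended tool here; arView and point are placeholder names.

    import RealityKit
    import UIKit

    // Minimal sketch: cast against CollisionComponents in the scene instead
    // of against real-world geometry. Assumes the loaded model already has
    // collision shapes (e.g. from generateCollisionShapes(recursive:)).
    func entityHit(in arView: ARView, at point: CGPoint) -> CollisionCastHit? {
        let hits = arView.hitTest(point, query: .nearest, mask: .all)
        return hits.first  // .entity is the hit model, .position is in world space
    }

If this still returns nothing, verifying that the collision shapes were generated recursively on the loaded entity would be a reasonable next check.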
HandTrackingProvider duplicate updates in 2.0
Has anybody tried the hand tracking provider in 2.0? I'm getting updates at the 11 ms interval, as advertised, but they are duplicates. Here's a print of the timestamps. This is problematic for me because I am tracking the last 5 positions for a calculation and expect them to be unique. I can't find docs on this anywhere. I understand it's not truly 90 unique updates a second but predicted poses; however, I expected the updates to include distinct predicted poses.
Replies: 0 · Boosts: 0 · Views: 25 · Last activity: 16h
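Not an official fix, but a hedged workaround sketch: filter out consecutive updates that share a timestamp before feeding the rolling window. It assumes a running HandTrackingProvider named handTracking and that AnchorUpdate's timestamp distinguishes genuine samples.

    import ARKit  // visionOS ARKit

    // Sketch: drop anchor updates whose timestamp repeats the previous one,
    // so a last-5-positions window only ever sees unique samples.
    func consumeUniqueHandPoses(handTracking: HandTrackingProvider) async {
        var lastTimestamp: TimeInterval = -1
        for await update in handTracking.anchorUpdates {
            guard update.timestamp != lastTimestamp else { continue }  // duplicate
            lastTimestamp = update.timestamp
            // update.anchor is the HandAnchor to feed into the calculation.
        }
    }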
Object Tracking with RealityView
When I wanted to load the Reality Composer Pro scene containing Object Tracking, I tried the following code:

    RealityView { content in
        if let model = try? await Entity(named: "Scene", in: realityKitContentBundle) {
            content.add(model)
        }
    }

Obviously, this is wrong. We need to add some configuration that enables Object Tracking in the RealityView. What do we need to add? Note: I have seen https://developer.apple.com/videos/play/wwdc2024/10101/, but I don't know much about it.
Replies: 2 · Boosts: 0 · Views: 81 · Last activity: 1d
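For reference, a hedged sketch of the ARKit-side setup the session describes: run an ARKitSession with an ObjectTrackingProvider built from a .referenceobject file, then follow the tracked object through its anchor updates. "MyObject" and the root entity are placeholder names, not part of the original post.

    import ARKit
    import RealityKit

    // Sketch: object tracking runs as an ARKitSession data provider; the
    // RealityView content is repositioned from each ObjectAnchor update.
    func runObjectTracking(root: Entity) async throws {
        guard let url = Bundle.main.url(forResource: "MyObject",
                                        withExtension: "referenceobject") else { return }
        let referenceObject = try await ReferenceObject(from: url)
        let provider = ObjectTrackingProvider(referenceObjects: [referenceObject])
        let session = ARKitSession()
        try await session.run([provider])

        for await update in provider.anchorUpdates {
            root.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
            root.isEnabled = update.anchor.isTracked  // hide content when tracking is lost
        }
    }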
Can the world tracking function be used in Unity?
The project was developed in Unity, and the requirement is to place a virtual model in the real world. When the user leaves the environment, or the machine is turned off and then on again, the virtual model should still be in its original real-world position. I found that ARKit's world tracking functionality is useful for this, but I don't know how to use it from Unity. Are there any related example projects?
Replies: 0 · Boosts: 0 · Views: 16 · Last activity: 1d
Apple: please unlock "enterprise features" for all visionOS devs!
We are developing apps for visionOS and need the following capabilities for a consumer app:
access to the main camera, to let users shoot photos and videos
reading QR codes, to trigger the download of additional content
So I was really happy when I noticed that visionOS 2.0 has these features. However, I was shocked when I realized that these capabilities are restricted to enterprise customers only: https://developer.apple.com/videos/play/wwdc2024/10139/ I think Apple is shooting itself in the foot with these restrictions. I can understand that privacy is important, but these limitations drastically restrict the potential use cases for this platform, even in the consumer space. IMHO Apple should decide whether it wants to target consumers in the first place, or whether it wants to go the HoloLens / Magic Leap route and mainly satisfy enterprise customers and their respective devs. With the current setup, Apple risks pushing devs away to other platforms where they have more freedom to create great apps.
Replies: 1 · Boosts: 2 · Views: 119 · Last activity: 3d
Can I use Ultra-Wide AVKit/AVPlayer in visionOS 2?
Hello, I watched the WWDC24 session on the Ultra-Wide Mac Virtual Display, and I want my player to offer something like that ultra-wide (curved) display mode for playing an mp4 movie file. Can I build an ultra-wide AVKit/AVPlayer experience in visionOS 2? When I checked the Apple documentation I found AVExperienceController.Experience.expanded. Is this the ultra-wide mode function I'm thinking of? (https://developer.apple.com/documentation/avkit/avexperiencecontroller/experience-swift.enum/expanded#discussion)
Replies: 0 · Boosts: 0 · Views: 78 · Last activity: 4d
How to Detect Gaze & Gesture on Entity
It's a common system interaction to look at an item in SwiftUI and tap to select it. I'm confused about how to do the same with ModelEntities. How do I use gaze to select a ModelEntity for context-based actions? E.g. look at the green sphere and tap to pull up a menu, or look in a direction and clap to **** away virtual objects, etc. If this is not possible, is there a workaround?
Replies: 1 · Boosts: 0 · Views: 110 · Last activity: 6d
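A minimal sketch of the standard gaze-plus-tap path, assuming system gestures are enough (a custom gesture like clapping would need hand tracking instead): give the entity InputTargetComponent and collision shapes, then attach a SwiftUI SpatialTapGesture targeted to entities.

    import SwiftUI
    import RealityKit

    // Sketch: gaze picks the target, tap confirms it, and the tapped entity
    // arrives in value.entity. HoverEffectComponent adds the system highlight
    // while the user looks at the sphere.
    struct TapDemoView: View {
        var body: some View {
            RealityView { content in
                let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1),
                                         materials: [SimpleMaterial(color: .green, isMetallic: false)])
                sphere.components.set(InputTargetComponent())
                sphere.generateCollisionShapes(recursive: true)
                sphere.components.set(HoverEffectComponent())
                content.add(sphere)
            }
            .gesture(
                SpatialTapGesture()
                    .targetedToAnyEntity()
                    .onEnded { value in
                        print("Tapped \(value.entity.name)")  // e.g. present a context menu here
                    }
            )
        }
    }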
Object Tracking Failed to load from object reference
I was following "Explore object tracking for visionOS" to load an object reference, but got this error:

Failed to load reference object from URL: ObjectTrackingProvider.Error(code: referenceObjectLoadingFailed, errorDescription: "The operation couldn’t be completed. (com.apple.arkit error 1101.)", failureReason: "", recoverySuggestion: "")

Here is what I have; I'm not sure if it is a code error or something with the system:

    private func loadReferenceObject() {
        Task {
            // Load the reference object.
            let refObjURL = Bundle.main.url(forResource: "objectTrackerBox", withExtension: ".referenceobject")
            if let refObjURL = refObjURL {
                do {
                    let refObj = try await ReferenceObject(from: refObjURL)
                    logMessage = "Reference object loaded successfully: \(refObj)"
                    print(logMessage)
                } catch {
                    logMessage = "Failed to load reference object from URL: \(error)"
                    print(logMessage)
                }
            } else {
                logMessage = "Failed to find the reference object file."
                print(logMessage)
            }
        }
    }
Replies: 1 · Boosts: 0 · Views: 124 · Last activity: 1w
New Object Capture Sample Code
Hi, Object Capture's original sample code was released last year, and this year there was a talk about adding area mode to it. The talk links to the old Object Capture code. When can I expect the new version with area mode, and is there anything I can help with to have it published faster? Thanks!
Replies: 3 · Boosts: 1 · Views: 141 · Last activity: 1w
Object Tracking with Rotation of Objects
Hey, in the "Explore object tracking for visionOS" session we explore how a globe can be tracked and objects can be anchored to various positions on it. My question is: if the physical globe is rotated, will the anchored objects also respond in real time? I would like to overlay a virtual map on top of a physical globe, so that when the user rotates the physical globe, the virtual map also responds seamlessly. Is this possible using Object Tracking? Thanks
Replies: 3 · Boosts: 1 · Views: 135 · Last activity: 1w
Object Tracking Sample Code
Are you planning on publishing a complete sample code project for the "Explore object tracking for visionOS" session (wwdc2024/10101)? The animation at 12:50, where the globe opens up, was especially impressive. Seeing how that was done while tracking the globe would be very interesting. (I realize that we would have to create our own globe object in order for the code to work.)
Replies: 2 · Boosts: 0 · Views: 162 · Last activity: 1w
Questions about WorldTrackedAnchor Resiliency
Background: The app that I am working on lets the user place things in their surroundings and recovers those placements the next time they enter the immersive scene. From the documentation and discussions I have had, world-tracked anchors are local to the device. My questions are:
What happens to these anchors when the user upgrades their device to the next generation?
What happens to these anchors if the user gets an AppleCare replacement?
Are they backed up and restored via iCloud?
If not, I filed a feedback about it a few months back :D FB13613066
Replies: 1 · Boosts: 0 · Views: 185 · Last activity: 1w
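For context, a hedged sketch of the device-local persistence flow the post describes (no iCloud sync is documented, which is exactly why the migration questions above matter). makeEntity and the storage of the ID-to-content mapping are hypothetical.

    import ARKit
    import RealityKit

    // Sketch: WorldAnchors added to a WorldTrackingProvider are persisted by
    // the system and re-delivered (same UUID) in later sessions on the same device.
    func place(at transform: simd_float4x4,
               worldTracking: WorldTrackingProvider) async throws -> UUID {
        let anchor = WorldAnchor(originFromAnchorTransform: transform)
        try await worldTracking.addAnchor(anchor)
        return anchor.id  // store this ID alongside the placed content's data
    }

    func restorePlacements(worldTracking: WorldTrackingProvider,
                           makeEntity: (UUID) -> Entity?) async {
        for await update in worldTracking.anchorUpdates where update.event == .added {
            // Look up which content belongs to this anchor ID and re-place it.
            makeEntity(update.anchor.id)?.transform =
                Transform(matrix: update.anchor.originFromAnchorTransform)
        }
    }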
How do I use RoomAnchor?
What I want to do: I want to turn only the walls of a room into RealityKit entities that I can collide with, or turn into occlusion surfaces. This requires adding and maintaining RealityKit entities with mesh information from the RoomAnchor, and creating a "collision shape" from that mesh information.

What I've explored: A RoomAnchor can provide me MeshAnchor.Geometry values that match only the "wall" portions of a room. I can use this mesh information to create RealityKit entities and add them to my immersive view. But those meshes don't come with UUIDs, so I'm not sure how I could know which entities' meshes need to be updated as the RoomAnchor is updated; as a result I just keep adding duplicate wall entities. A RoomAnchor also provides me with the UUIDs of its plane anchors, but no way to connect those to the provided meshes that I've discovered so far.

Here is how I add the green walls from the RoomAnchor wall meshes. Note: I don't like that I need to wrap this in a Task to satisfy the async nature of making a shape from a mesh; I could be stuck with it, though. Warning: this code will keep adding walls, even if there are duplicates, and will likely cause performance issues :D.

    func updateRoom(_ anchor: RoomAnchor) async throws {
        print("ROOM ID: \(anchor.id)")
        anchor.geometries(of: .wall).forEach { mesh in
            Task {
                let newEntity = Entity()
                newEntity.components.set(InputTargetComponent())
                realityViewContent?.addEntity(newEntity)
                newEntity.components.set(PlacementUtilities.PlacementSurfaceComponent())

                collisionEntities[anchor.id]?.components.set(OpacityComponent(opacity: 0.2))
                collisionEntities[anchor.id]?.transform = Transform(matrix: anchor.originFromAnchorTransform)

                // Generate a mesh for the plane.
                do {
                    let contents = MeshResource.Contents(planeGeometry: mesh)
                    let meshResource = try MeshResource.generate(from: contents)
                    // Make this plane occlude virtual objects behind it.
                    // entity.components.set(ModelComponent(mesh: meshResource, materials: [OcclusionMaterial()]))
                    collisionEntities[anchor.id]?.components.set(ModelComponent(mesh: meshResource, materials: [SimpleMaterial(color: .green, roughness: 1.0, isMetallic: false)]))
                } catch {
                    print("Failed to create a mesh resource for a plane anchor: \(error).")
                    return
                }

                // Generate a collision shape for the plane (for object placement and physics).
                var shape: ShapeResource? = nil
                do {
                    let vertices = anchor.geometry.vertices.asSIMD3(ofType: Float.self)
                    shape = try await ShapeResource.generateStaticMesh(positions: vertices, faceIndices: anchor.geometry.faces.asUInt16Array())
                } catch {
                    print("Failed to create a static mesh for a plane anchor: \(error).")
                    return
                }

                if let shape {
                    let collisionGroup = PlaneAnchor.verticalCollisionGroup
                    collisionEntities[anchor.id]?.components.set(CollisionComponent(shapes: [shape], isStatic: true, filter: CollisionFilter(group: collisionGroup, mask: .all)))
                    // The plane needs to be a static physics body so that objects come to rest on it.
                    let physicsMaterial = PhysicsMaterialResource.generate()
                    let physics = PhysicsBodyComponent(shapes: [shape], mass: 0.0, material: physicsMaterial, mode: .static)
                    collisionEntities[anchor.id]?.components.set(physics)
                }
                collisionEntities[anchor.id]?.components.set(InputTargetComponent())
            }
        }
    }
Replies: 1 · Boosts: 0 · Views: 177 · Last activity: 1w
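A hedged workaround sketch for the duplicate-wall problem, not an official pattern: since the wall meshes carry no IDs, keep one container entity per room, keyed by the RoomAnchor's UUID, and rebuild its children on every update so stale walls are replaced rather than accumulated. wallContainers is a hypothetical cache, and MeshResource.Contents(planeGeometry:) is the same sample-code helper extension used in the post above.

    import ARKit
    import RealityKit

    var wallContainers: [UUID: Entity] = [:]  // hypothetical: room ID -> wall container

    @MainActor
    func rebuildWalls(for anchor: RoomAnchor, in root: Entity) {
        // Reuse (or create) one container per room, then clear its old walls.
        let container = wallContainers[anchor.id] ?? {
            let entity = Entity()
            wallContainers[anchor.id] = entity
            root.addChild(entity)
            return entity
        }()
        for child in Array(container.children) { child.removeFromParent() }
        container.transform = Transform(matrix: anchor.originFromAnchorTransform)

        for mesh in anchor.geometries(of: .wall) {
            let contents = MeshResource.Contents(planeGeometry: mesh)  // sample-code helper
            guard let resource = try? MeshResource.generate(from: contents) else { continue }
            container.addChild(ModelEntity(mesh: resource,
                                           materials: [SimpleMaterial(color: .green, roughness: 1.0, isMetallic: false)]))
        }
    }

Because every update rebuilds from the container, the entity count stays bounded by the current wall count, at the cost of recreating the meshes each time.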