When I wanted to load the Reality Composer Pro scene that contains Object Tracking, I tried the following code:
RealityView { content in
    if let model = try? await Entity(named: "Scene", in: realityKitContentBundle) {
        content.add(model)
    }
}
Obviously, this is wrong. We need to add some configuration to RealityView to enable Object Tracking. What do we need to add?
Note: I have watched https://developer.apple.com/videos/play/wwdc2024/10101/, but I don't know much about it.
Hi Lijiaxu,
I hope you enjoyed the WWDC session!
There’s no need to configure anything else. Your code is sufficient as long as you’ve set up Object Tracking in Reality Composer Pro.
To configure this in Reality Composer Pro, add the AnchoringComponent to an entity, select the “object” target within that component, and choose your .referenceObject file.
For testing purposes, you can add some content (like a red sphere) as a child of the entity with the AnchoringComponent.
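If you ever want to set the same thing up in code instead of Reality Composer Pro, a rough sketch along these lines should work (the "MyObject.referenceobject" file name and the function name are placeholders for your own asset; the code assumes visionOS 2 and a reference object bundled with the app):

import Foundation
import ARKit
import RealityKit

// Loads a reference object and anchors a test sphere to the tracked physical object.
// Assumes a file named "MyObject.referenceobject" is bundled with the app (placeholder name).
func makeTrackedAnchor() async throws -> Entity {
    guard let url = Bundle.main.url(forResource: "MyObject", withExtension: "referenceobject") else {
        throw URLError(.fileDoesNotExist)
    }
    let referenceObject = try await ReferenceObject(from: url)

    // Anchor an entity to the physical object the reference object describes.
    let anchor = AnchorEntity(.referenceObject(from: referenceObject))

    // Test content: a small red sphere that should appear on the tracked object.
    let sphere = ModelEntity(
        mesh: .generateSphere(radius: 0.05),
        materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    anchor.addChild(sphere)
    return anchor
}

You can add the returned entity to your RealityView content just like the scene entity in your snippet. The Reality Composer Pro route described above is usually simpler, though, since the anchoring setup travels with the scene.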
If you need more details, feel free to refer to the session or ask additional questions in this thread.
Hope that helps!