A second post on the same topic, as I feel I may have overcomplicated the earlier one.
Essentially, I'm performing object tracking inside Reality Composer Pro and adding a digital entity to the tracked object. I now want to get the coordinates of this digital entity inside Xcode.
Secondly, can I track more than one object inside the same scene? For example, if I want to find a spanner and a screwdriver amongst a bunch of tools laid out on a table, spawn an arrow on top of each of them, and then get the coordinates of those arrows, how can I go about this?
Hi @adityach
To protect people's privacy, you need to obtain permission to read an entity's position when it is an AnchorEntity (in this case, one that targets an object) or is contained in an AnchorEntity.
Add an entry for NSWorldSensingUsageDescription to your app’s information property list to provide a usage description that explains how your app uses the position.
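For reference, the raw Info.plist entry might look like the following; the description string here is only a placeholder, so write one that explains your app's actual use:

<key>NSWorldSensingUsageDescription</key>
<string>Tracks real-world objects so the app can place guidance on top of them.</string>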
Start a SpatialTrackingSession configured to track objects:
.task {
    // Configure the session to track objects.
    let configuration = SpatialTrackingSession.Configuration(
        tracking: [.object])
    // `session` is a SpatialTrackingSession property declared on
    // your view (see the sketch below).
    session = SpatialTrackingSession()
    await session.run(configuration)
}
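For context, here's a minimal sketch of how that setup might sit in a full view, assuming a SwiftUI immersive view that loads a Reality Composer Pro scene; the view name and loading details are illustrative, not part of your project:

import SwiftUI
import RealityKit

struct ImmersiveView: View {
    // Store the session in view state so it lives as long as the view.
    @State private var session = SpatialTrackingSession()

    var body: some View {
        RealityView { content in
            // Load and add your Reality Composer Pro scene here.
        }
        .task {
            let configuration = SpatialTrackingSession.Configuration(
                tracking: [.object])
            // run(_:) returns the capabilities it couldn't enable
            // (for example, if permission was denied), or nil.
            if let unavailable = await session.run(configuration) {
                print("Unavailable capabilities: \(unavailable)")
            }
        }
    }
}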
Find the entity in your Reality Composer Pro scene, then ask for its position relative to a given entity (in this case I pass nil, so the coordinates are in world space). Here's code to do that. Note that until the SpatialTrackingSession is running, the AnchorEntity will report its position as [0, 0, 0].
// named: is the name you gave the entity in Reality Composer Pro.
if let objectEntity = immersiveContentEntity.findEntity(named: "MyObject") {
    // Passing nil gives the position in world space.
    let position = objectEntity.position(relativeTo: nil)
    print("Object position: \(position)")
}
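On your second question, the same pattern extends to multiple objects: a Reality Composer Pro scene can contain more than one object anchor, each with its own attached entity, and you read each position the same way. A minimal sketch, assuming you named the attached entities "SpannerArrow" and "ScrewdriverArrow" (both names are illustrative):

// Look up each attached entity by name and read its world-space position.
for name in ["SpannerArrow", "ScrewdriverArrow"] {
    if let arrowEntity = immersiveContentEntity.findEntity(named: name) {
        let position = arrowEntity.position(relativeTo: nil)
        print("\(name) is at \(position)")
    }
}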