The Goal
My goal is to place an item where the user taps on a plane, and have that item's orientation match the outward-facing normal vector of the plane at the tap location.
As of beta 3, a SpatialTapGesture targeted to an entity returns an accurate location3D, so determining the position to place an item works great. I simply do:
let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
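For context, here is a minimal sketch of where that conversion lives. This assumes a RealityView whose tappable plane entities have collision and input-target components (the setup comments are illustrative, not from the original post):

```swift
import SwiftUI
import RealityKit

struct PlacementView: View {
    var body: some View {
        RealityView { content in
            // Scene setup goes here. For the gesture to hit an entity,
            // that entity needs a CollisionComponent and an
            // InputTargetComponent.
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Convert the tap's location3D from the gesture's
                    // local space into RealityKit scene (world) space.
                    let worldPosition: SIMD3<Float> = value.convert(
                        value.location3D, from: .local, to: .scene)
                    // ...place an entity at worldPosition...
                }
        )
    }
}
```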
The Problem
Now, I notice that my entities aren't oriented correctly:
The placed item always 'faces' the camera. So if the camera isn't looking straight on the target plane, then the orientation of the new entity is off.
If I retrieve the transform of my newly placed item, it reports a rotation of (0, 0, 0) relative to nil, which doesn't look correct.
I know I'm juggling several coordinate spaces: the tapped plane's, the world's, and the placed item's, and I'm getting a bit lost in it all. Not to mention my API intuition is still pretty low, so quaternions are still new to me.
So, I'm curious, what rotation information can I use to "correct" the placed entity's orientation?
What I tried:
I've tried investigating the tap-target-entity like so:
let rotationRelativeToWorld = value.entity.convert(transform: value.entity.transform, to: nil).rotation
I believe this returns the rotation of the "plane entity" the user tapped, relative to the world.
While that gets me the following, I'm not sure if it's useful:
rotationRelativeToWorld: simd_quatf(real: 0.7071068, imag: SIMD3<Float>(-0.7071067, 6.600024e-14, 6.600024e-14))
vector: SIMD4<Float>(-0.7071067, 6.600024e-14, 6.600024e-14, 0.7071068)
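For intuition: that quaternion is, within float error, a quarter turn about the negative X axis, so it does encode a real rotation rather than noise. A quick sanity check with plain simd (no RealityKit needed; the tiny 6.6e-14 components are float noise):

```swift
import simd

// The dumped quaternion, reconstructed from the debugger output
// (noise components rounded to zero).
let dumped = simd_quatf(vector: SIMD4<Float>(-0.7071067, 0, 0, 0.7071068))

// A 90-degree rotation about the negative X axis.
let quarterTurn = simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))

// Both should report an angle of about pi/2 around axis (-1, 0, 0).
print(dumped.angle, dumped.axis)
print(quarterTurn.angle, quarterTurn.axis)
```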
If anyone has better intuition than me about the coordinate spaces involved, I would appreciate some help. Thanks!
My Solution
You can get the location of the tap gesture on an entity with:
let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
You can grab the orientation of the tapped plane with:
let rotation = value.entity.orientation(relativeTo: nil)
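With those two values in hand, the simplest fix for the original orientation problem is to apply them to the placed entity directly, no anchor required. A sketch, with placedEntity and content standing in for whatever exists in your scene:

```swift
// Inside the .onEnded handler of the targeted SpatialTapGesture:
let worldPosition: SIMD3<Float> = value.convert(
    value.location3D, from: .local, to: .scene)
let rotation = value.entity.orientation(relativeTo: nil)

// Set position and orientation in world space (relativeTo: nil),
// so the new entity matches the tapped plane instead of the camera.
placedEntity.setPosition(worldPosition, relativeTo: nil)
placedEntity.setOrientation(rotation, relativeTo: nil)
content.add(placedEntity)
```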
I want to track this location so it can be persisted via the WorldTrackingProvider (whenever it works???) with a world anchor. I make the world anchor like so:
let pose = Pose3D(position: worldPosition, rotation: rotation)
let worldAnchor = WorldAnchor(originFromAnchorTransform: simd_float4x4(pose))
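Note that the anchor has to be registered with a running WorldTrackingProvider before it will be tracked and persisted. A sketch of that wiring, assuming an ARKitSession you start elsewhere in your app:

```swift
import ARKit

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func persist(_ anchor: WorldAnchor) async {
    do {
        // The provider must already be running, e.g.:
        // try await session.run([worldTracking])
        try await worldTracking.addAnchor(anchor)
    } catch {
        print("Failed to add world anchor: \(error)")
    }
}
```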
When you get the anchor back from your world-tracking listener, you can apply its transform (which contains both position and orientation) to an entity like so:
entityForAnchor.transform = Transform(matrix: anchor.originFromAnchorTransform)
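The "world tracking listener" here is the provider's async update stream. A sketch of consuming it, where entityForAnchor(_:) is a hypothetical lookup from anchor IDs to your entities:

```swift
// Consume anchor updates from the running WorldTrackingProvider.
for await update in worldTracking.anchorUpdates {
    switch update.event {
    case .added, .updated:
        // Re-apply the anchor's world transform to the tracked entity.
        let entity = entityForAnchor(update.anchor)
        entity.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
    case .removed:
        // Remove the corresponding entity if needed.
        break
    default:
        break
    }
}
```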
If you've got another way, or one that uses the API more idiomatically, please share. This was cobbled together with a ton of trial and error.