SpatialTapGesture and collision surface's normal?

I've seen example code converting the result of a SpatialTapGesture to a SIMD3<Float> location. For example, from the WWDC session "Meet ARKit for spatial computing":

let location3D = value.convert(value.location3D, from: .global, to: .scene)

What I really want is a simd_float4x4 that includes the orientation of the surface that the tap gesture (or its underlying collision cast) hit.

My goal is to place an object with its Y-axis along the normal of the surface that was tapped.
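To illustrate that goal: given a world-space hit position and surface normal (however they are obtained), a transform whose Y-axis lies along the normal can be built with plain simd. This is a sketch; `orientationAlongNormal` is a hypothetical helper name, not Apple API:

```swift
import simd

// Sketch (not Apple API): build a 4x4 transform whose Y-axis points
// along a given surface normal, positioned at the hit point.
func orientationAlongNormal(position: SIMD3<Float>,
                            normal: SIMD3<Float>) -> simd_float4x4 {
    // Rotate the model's +Y axis onto the surface normal.
    // Note: the rotation is ill-defined when normal ≈ -Y (180° flip);
    // a production version should special-case that.
    let rotation = simd_quatf(from: SIMD3<Float>(0, 1, 0),
                              to: simd_normalize(normal))
    var transform = simd_float4x4(rotation)
    transform.columns.3 = SIMD4<Float>(position.x, position.y, position.z, 1)
    return transform
}
```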

For example, in the referenced WWDC session, they create a CollisionComponent from the MeshAnchor data. If that mesh data is covering a curved couch cushion, I would like the normal from that curved cushion (i.e., the closest triangle approximating it).

Is this possible?
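One avenue I'm considering (an assumption on my part, not something the session shows): RealityKit's Scene.raycast returns CollisionCastHit values that carry a normal as well as a position, so casting from the device toward the tap location might give the per-triangle normal directly:

```swift
import RealityKit

// Sketch, assuming a RealityKit Scene and a tap location already
// converted into scene space. Each CollisionCastHit includes the
// surface normal at the hit point on the collision shape.
func surfaceNormal(in scene: RealityKit.Scene,
                   from origin: SIMD3<Float>,
                   toward tapLocation: SIMD3<Float>) -> SIMD3<Float>? {
    let hits = scene.raycast(from: origin,
                             to: tapLocation,
                             query: .nearest,
                             mask: .all,
                             relativeTo: nil)
    return hits.first?.normal   // world-space normal of the nearest hit
}
```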

My planned fallback is to use only planes as collision surfaces for tap gestures, extract the entity from the tap gesture's value (which I hope is the plane), and grab its transform for the orientation information.
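A sketch of that fallback, assuming the gesture is targeted to entities so the value exposes the tapped entity:

```swift
import SwiftUI
import RealityKit

// Fallback sketch: read the tapped entity's world transform and treat
// its Y column as the plane's normal. Assumes the collision entity
// really is the plane, as hoped above.
let tapGesture = SpatialTapGesture()
    .targetedToAnyEntity()
    .onEnded { value in
        let entity = value.entity
        // World-space 4x4 transform of the tapped entity.
        let world = entity.transformMatrix(relativeTo: nil)
        // Column 1 is the entity's Y basis axis — the plane normal,
        // if the plane's local Y points away from its surface.
        let planeNormal = SIMD3<Float>(world.columns.1.x,
                                       world.columns.1.y,
                                       world.columns.1.z)
        _ = planeNormal  // use this to orient the placed object
    }
```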

I am hoping Apple has a simple function call that is more general than my fallback approach.

Replies

Yes, simd_float4x4 is the right type.

The AR-related functions Apple provides return a simd_float4x4 that encodes more than position: its first three columns are the x, y, and z basis axes, which together describe the orientation.

So you can extract the orientation you need from the columns of that simd_float4x4.
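For example, a small sketch of pulling the basis axes and translation out of the matrix (`decompose` is a made-up helper name):

```swift
import simd

// Sketch: extract the x/y/z basis axes (orientation) and the
// translation from a column-major simd_float4x4 transform.
func decompose(_ m: simd_float4x4)
    -> (x: SIMD3<Float>, y: SIMD3<Float>, z: SIMD3<Float>, position: SIMD3<Float>) {
    func xyz(_ c: SIMD4<Float>) -> SIMD3<Float> { SIMD3(c.x, c.y, c.z) }
    return (x: xyz(m.columns.0),
            y: xyz(m.columns.1),
            z: xyz(m.columns.2),
            position: xyz(m.columns.3))
}
```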

  • This answer isn't definitive, just my opinion; when I was developing an AR app, the above was the only method I could find.
  • Thanks. I've made some progress using the plane anchor's transform, but I still have a few issues: the object appears slightly offset from where I gaze and tap. Maybe I need to recalibrate eye tracking on the device.
