How to determine the point on the phone screen that a user is looking at, using data from an ARKit FaceAnchor

I'm working on an app with a feature that determines and marks the point a user is looking at (i.e., in screen coordinates).

I have successfully set up my ARKit configuration to retrieve the leftEyeTransform, rightEyeTransform and lookAtPoint. My challenge is correctly converting this data to CGPoint coordinates representing the point the user is looking at.
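
For reference, the session setup is essentially the standard face-tracking boilerplate, roughly like this (sketched; the helper name is mine):

    import ARKit

    // Run a face-tracking session on the ARSCNView (error handling omitted)
    func startFaceTracking(on sceneView: ARSCNView) {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        let configuration = ARFaceTrackingConfiguration()
        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }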

I have attempted using "renderer.projectPoint" to derive the screen coordinates from the leftEyeTransform, rightEyeTransform and lookAtPoint, but haven't been able to get it to work correctly.

I think I'm missing something; any suggestions or pointers will be greatly appreciated.

Thanks in advance.

Could you explain:
  • what you get

  • how far it is from what you expect

  • in which use case? (e.g., when the user is not looking straight ahead)

  • the code where you compute the point on screen from lookAtPoint

I would try to:
  • 1. get the device coordinates in world coordinates (using ARConfiguration.WorldAlignment.gravity)

  • 2. find the face coordinates in world coordinates (with the ARFaceAnchor.transform property)

  • 3. use lookAtPoint, which gives the direction in face coordinates

  • 4. transform with 1 and 2 to get the point looked at in device coordinates (see the sketch after this list)
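
Untested, but the chain I have in mind looks roughly like this in code (the function name and parameters are illustrative, not from your post):

    import ARKit

    // Sketch of steps 1–4: express lookAtPoint in device (camera) coordinates.
    func lookAtPointInDeviceSpace(frame: ARFrame, faceAnchor: ARFaceAnchor) -> simd_float3 {
        // Steps 2 + 3: lift lookAtPoint from face coordinates into world coordinates.
        let lookAtWorld = faceAnchor.transform * simd_float4(faceAnchor.lookAtPoint, 1)

        // Steps 1 + 4: re-express that world point in device (camera) coordinates.
        // frame.camera.transform maps camera space to world space, so invert it.
        let lookAtDevice = frame.camera.transform.inverse * lookAtWorld
        return simd_float3(lookAtDevice.x, lookAtDevice.y, lookAtDevice.z)
    }

Note that going from device coordinates (metres) to screen points still needs the physical screen geometry of the device, which as far as I know ARKit doesn't expose.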

This thread links to some useful resources:
https://developer.apple.com/forums/thread/118048
What I get isn't constant, but it's typically the position of the face/face mesh on the screen. This position only adjusts with movement of the face, not movement of the eyes themselves. I also noticed the position changes with the position of the device.

// Here is my code
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
            let faceGeometry = node.geometry as? ARSCNFaceGeometry else {
                return
        }
        faceGeometry.update(from: faceAnchor.geometry)

        // lookAtPoint is taken straight from the anchor and projected as-is
        let lookAtPointPosition = SCNVector3(faceAnchor.lookAtPoint) // also tried with leftEyeTransform & rightEyeTransform
        let projectionPoint = renderer.projectPoint(lookAtPointPosition)
        let pointOnScreen = CGPoint(x: Double(projectionPoint.x), y: Double(projectionPoint.y))
    }

I'm not experienced with ARKit, but I think you need a coordinate space transformation like the one I outlined in the previous post. Sorry if that's not helpful enough.
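
If it helps, here is an untested guess at what I mean, applied to your delegate: convert lookAtPoint out of the face node's local space before projecting (names follow your snippet; treat this as a sketch, not a verified fix):

    import ARKit

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // lookAtPoint is local to the face anchor's node; passing nil converts to world space.
        let lookAtWorld = node.simdConvertPosition(faceAnchor.lookAtPoint, to: nil)
        let projectionPoint = renderer.projectPoint(SCNVector3(lookAtWorld))
        let pointOnScreen = CGPoint(x: Double(projectionPoint.x), y: Double(projectionPoint.y))
        print(pointOnScreen) // if my understanding is right, this should now move with the eyes rather than track the face mesh
    }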
No worries, thanks anyway.