Determine the point on screen the user is looking at using lookAtPoint or leftEyeTransform/rightEyeTransform

iOS 12 brings us a new API for eye tracking, and I'm currently trying to figure out how to estimate the on-screen position the person is looking at.


The result I want to achieve is an (x, y) point in screen coordinates, which I can later use to determine a specific UIView via hitTest or something similar.
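For that last step, the hit test itself should be straightforward once the gaze point exists; a minimal sketch, assuming `point` is already expressed in the view's coordinate space:

```swift
// UIView.hitTest(_:with:) returns the deepest descendant
// (including the view itself) that contains the given point.
// `gazeTargetView(in:at:)` is just an illustrative helper name.
func gazeTargetView(in rootView: UIView, at point: CGPoint) -> UIView? {
    return rootView.hitTest(point, with: nil)
}
```

Note that `hitTest(_:with:)` ignores views that are hidden, have `isUserInteractionEnabled == false`, or have alpha below 0.01, so a plain frame-containment walk may be needed for purely visual targets.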


I was able to use the lines below to get the location of an eye in screen coordinates; however, I'm struggling to convert `lookAtPoint` into screen coordinates.


func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    // Convert the eye transform from face-anchor space to world space.
    let leftEyeTransform = node.simdConvertTransform(faceAnchor.leftEyeTransform, to: nil)

    // The translation component lives in the fourth column of the 4x4 matrix.
    let position = SCNVector3(leftEyeTransform.columns.3.x,
                              leftEyeTransform.columns.3.y,
                              leftEyeTransform.columns.3.z)

    // Project the world-space position into viewport coordinates.
    let projectedPosition = renderer.projectPoint(position)
    let point = CGPoint(x: CGFloat(projectedPosition.x), y: CGFloat(projectedPosition.y))
    // point is now in screen coordinates
}


It feels like I'm missing something obvious, but so far I wasn't able to project `lookAtPoint` into the screen plane.

Would appreciate any help or hints on how to attack this problem.
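In case it helps anyone, the direction I'm currently exploring (untested sketch): `lookAtPoint` is expressed in face-anchor coordinates, so it should need the same local-to-world conversion as the eye transforms before it can be projected. Whether the projected value still needs scaling from render-target pixels to UIKit points is an open question in this sketch.

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    // lookAtPoint is a simd_float3 in face-anchor space; convert it to
    // world space through the face node, like the eye position above.
    let worldLookAt = node.simdConvertPosition(faceAnchor.lookAtPoint, to: nil)

    // projectPoint returns viewport coordinates (origin top-left).
    let projected = renderer.projectPoint(SCNVector3(worldLookAt.x,
                                                     worldLookAt.y,
                                                     worldLookAt.z))
    let point = CGPoint(x: CGFloat(projected.x), y: CGFloat(projected.y))
    // point may still need mapping from viewport pixels to UIKit points.
}
```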

Hey,
did you find out anything yet?
Best,
Klemens

Hi,

I'm also struggling with projecting lookAtPoint onto the screen. Any update on your initial question?

Any suggestions appreciated.


Thank you



Best,

Pranav

Hi, did anyone find an answer to this question: determining the point on screen the user is looking at using lookAtPoint or leftEyeTransform/rightEyeTransform?