2 Replies · Latest reply on Feb 7, 2020 2:11 PM by pranavrajiv

      uson1x · Level 1 (0 points)

        iOS 12 brings us a new API for eye tracking, and I'm currently trying to figure out how to estimate the on-screen position of where the person is looking.


        The result I want to achieve is an (x, y) point in screen coordinates, which I'll later be able to use to determine a specific UIView using hitTest(_:with:) or something like that.
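To make that second step concrete, here is a minimal, framework-free sketch of the hit-testing idea. The `Rect` type and the view list are illustrative stand-ins of my own, not UIKit API:

```swift
// Stand-in for UIView.hitTest: given a screen point and a list of named
// view frames, return the name of the frontmost frame containing the point.
struct Rect {
    var x, y, w, h: Float
    func contains(_ px: Float, _ py: Float) -> Bool {
        px >= x && px < x + w && py >= y && py < y + h
    }
}

func frontmostView(at px: Float, _ py: Float,
                   in views: [(name: String, frame: Rect)]) -> String? {
    // Later entries are treated as "on top", mirroring subview z-order.
    return views.last(where: { $0.frame.contains(px, py) })?.name
}
```

In the real app this would just be `view.hitTest(point, with: nil)` on the root view; the sketch only shows the containment logic.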


        I was able to use the lines below to figure out the position of an eye in screen coordinates; however, I'm struggling to convert `lookAtPoint` into screen coordinates.


        func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
            guard let faceAnchor = anchor as? ARFaceAnchor else { return }
            // Convert the eye transform from the face anchor's space into world space.
            let leftEyeTransform = node.simdConvertTransform(faceAnchor.leftEyeTransform, to: nil)
            // The fourth column of the world transform holds the eye's position.
            let position = SCNVector3(leftEyeTransform.columns.3.x,
                                      leftEyeTransform.columns.3.y,
                                      leftEyeTransform.columns.3.z)
            // Project the 3D world position into the view's 2D coordinate space.
            let projectedPosition = renderer.projectPoint(position)
            let point = CGPoint(x: Double(projectedPosition.x), y: Double(projectedPosition.y))
            // `point` is the eye's location in screen coordinates.
        }


        It feels like I'm missing something obvious, but so far I haven't been able to project `lookAtPoint` onto the screen plane.
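For what it's worth, my understanding of the underlying geometry is a ray–plane intersection: follow the gaze direction from the eye until it crosses the screen plane. A minimal, framework-free sketch of that math, where the `Vec3` type and the convention that the screen lies in the z = 0 plane are my own assumptions, not ARKit API:

```swift
// Sketch: intersect a gaze ray with the z = 0 screen plane.
// Values are in metres, in a device-aligned space where the screen
// occupies the z = 0 plane (an assumed convention, not ARKit's).
struct Vec3 { var x, y, z: Float }

func gazeIntersection(eye origin: Vec3, direction: Vec3) -> (x: Float, y: Float)? {
    // Ray: p(t) = origin + t * direction. Solve p(t).z == 0.
    guard direction.z != 0 else { return nil }   // gaze parallel to the screen
    let t = -origin.z / direction.z
    guard t > 0 else { return nil }              // gaze points away from the screen
    return (origin.x + t * direction.x, origin.y + t * direction.y)
}
```

The resulting (x, y) would still be in metres, so it would need scaling from the physical screen size to points before it could feed into hitTest.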

        I'd appreciate any help or hints on how to attack this problem.