Eye Tracking with ARKit

Hi there,
I need some help with ARKit. I'm working on an app that tracks the point on the device screen that a user's eyes are looking at. I've been able to do this successfully by performing a hit test from the left and right eye transforms against a target node placed at the screen's position.
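For context, my hit-test setup looks roughly like this (a sketch, left eye only; `targetNode` is my own invisible plane node placed at the screen's position in world space, not an ARKit API):

```swift
import ARKit
import SceneKit

// Sketch of the hit-test approach described above. `targetNode` is an
// invisible SCNNode (plane) positioned where the device screen sits.
func gazeHit(faceAnchor: ARFaceAnchor, targetNode: SCNNode) -> SCNHitTestResult? {
    // World transform of the left eye = anchor transform * eye-local transform.
    let eye = simd_mul(faceAnchor.transform, faceAnchor.leftEyeTransform)

    // Ray from the eye position along its -Z axis (the eye's look direction).
    let origin = SCNVector3(eye.columns.3.x, eye.columns.3.y, eye.columns.3.z)
    let end = SCNVector3(origin.x - eye.columns.2.x * 2,   // 2 m is an arbitrary ray length
                         origin.y - eye.columns.2.y * 2,
                         origin.z - eye.columns.2.z * 2)

    // SCNNode.hitTestWithSegment expects points in the node's local space.
    return targetNode.hitTestWithSegment(
        from: targetNode.convertPosition(origin, from: nil),
        to: targetNode.convertPosition(end, from: nil),
        options: nil).first
}
```

I then map the hit result's `localCoordinates` to screen points using the plane's physical size.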

I'm able to get these points (i.e. x, y in screen coordinates); my problem is that they're not stable: even a slight movement of the head or the device significantly affects the coordinates.
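To illustrate, one generic workaround would be low-pass filtering the raw per-frame points, e.g. with an exponential moving average (a sketch, not something I've tuned; it trades jitter for lag, so I'm hoping there's a better fix at the tracking level):

```swift
import CoreGraphics

// Simple exponential moving average over raw gaze points.
// alpha is an assumed tuning value: smaller = smoother but laggier.
struct GazeSmoother {
    private(set) var current: CGPoint?
    let alpha: CGFloat

    init(alpha: CGFloat = 0.15) { self.alpha = alpha }

    // Feed each raw point in; returns the smoothed point to display.
    mutating func add(_ point: CGPoint) -> CGPoint {
        guard let prev = current else {
            current = point
            return point
        }
        let smoothed = CGPoint(x: prev.x + alpha * (point.x - prev.x),
                               y: prev.y + alpha * (point.y - prev.y))
        current = smoothed
        return smoothed
    }
}
```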

My question: is there a way to improve the accuracy of the eye-tracking data without requiring the user to keep their head still? Or is there a better way to track the eyes, e.g. via `lookAtPoint`, and then convert that to 2D screen coordinates?
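For the `lookAtPoint` route, what I have in mind is something like the following (a sketch I haven't validated; the orientation and viewport size are assumptions for a portrait-only app):

```swift
import ARKit
import SceneKit

// Project ARFaceAnchor.lookAtPoint into 2D view coordinates, letting
// ARKit's camera do the projection instead of ray-casting manually.
func screenPoint(for faceAnchor: ARFaceAnchor, in sceneView: ARSCNView) -> CGPoint? {
    guard let frame = sceneView.session.currentFrame else { return nil }

    // lookAtPoint is in the face anchor's coordinate space; move it to world space.
    let local = faceAnchor.lookAtPoint
    let world = simd_mul(faceAnchor.transform, simd_float4(local, 1))

    // Project the world-space point into the view.
    return frame.camera.projectPoint(simd_float3(world.x, world.y, world.z),
                                     orientation: .portrait,
                                     viewportSize: sceneView.bounds.size)
}
```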

It's not clear what I'm doing wrong. I can see the face itself is tracked accurately, because moving the device or my head doesn't affect the face mesh I set up.

Any helpful suggestions or ideas are welcome.

Thanks in anticipation.

Hello John,

I have the same problem here. Did you find out what's going on?

Thank you.
