Is it possible to get facial expression information, like Animoji, via Core ML or ARKit?

As Apple showed yesterday, Animoji impressively retargets human facial movement onto emoji, so is it possible for us to do something similar via Core ML or ARKit, or to collect user facial expression data for training?

Replies

yup

https://developer.apple.com/videos/play/fall2017/601/

Face Tracking with ARKit.
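The short version: run an ARFaceTrackingConfiguration and read the blendShapes dictionary off each ARFaceAnchor. A minimal sketch, assuming iOS 11+ and a TrueDepth device:

```swift
import ARKit

// Minimal sketch: run a face-tracking session and read the per-frame
// blend shape coefficients that drive Animoji-style retargeting.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking needs a TrueDepth camera (iPhone X)")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes maps named expressions to coefficients in 0...1,
            // e.g. .jawOpen, .eyeBlinkLeft, .mouthSmileLeft.
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            let blink = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            print("jawOpen: \(jawOpen), eyeBlinkLeft: \(blink)")
        }
    }
}
```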

I'd be interested to know if it can track other things, not just faces.

You could use your phone as a Leap Motion controller.

Thank you! In fact, we want to do research on human facial expression and gaze behavior, so we hope to collect some anonymous data for this.
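Something like the sketch below is what we have in mind: logging only the abstract coefficients per frame, never the camera images. The key list is just illustrative; gaze would come indirectly from the eyeLook* blend shapes.

```swift
import ARKit

// Sketch: collect anonymous expression/gaze-related coefficients per frame.
// Only named coefficients are stored; no images or identity data.
func record(_ faceAnchor: ARFaceAnchor, into log: inout [[Float]]) {
    // Illustrative subset of ARKit's blend shape keys.
    let keys: [ARFaceAnchor.BlendShapeLocation] = [
        .jawOpen, .mouthSmileLeft, .mouthSmileRight,
        .eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft
    ]
    log.append(keys.map { faceAnchor.blendShapes[$0]?.floatValue ?? 0 })
}
```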

As for security: the phone takes a mesh of your face, so why can't you just 3D print the same mesh?

I'm guessing it might use the infrared camera to see if you're dead or not 🙂

If you do figure out how to access this, I'd like to provide 3D models.

Do you mean you have some 3D models that could be used for this? Do you mind sharing them? I have been working on some iPhone X face tracking stuff, but I have no 3D modelling skills 😟!
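For context, here is roughly how the two would fit together: each morph target on the model can be driven directly from the blend shape dictionary. A rough sketch, assuming the model's morph targets are named after ARKit's blend shape keys (that naming convention is up to the modeller, not something ARKit enforces):

```swift
import ARKit
import SceneKit

// Sketch: drive a 3D model's morph targets from ARKit blend shapes.
// Assumes the model's morph targets are named after ARKit's keys
// (e.g. "jawOpen"); setWeight(_:forTargetNamed:) requires iOS 11+.
func apply(_ faceAnchor: ARFaceAnchor, to node: SCNNode) {
    guard let morpher = node.morpher else { return }
    for (location, weight) in faceAnchor.blendShapes {
        morpher.setWeight(CGFloat(weight.floatValue),
                          forTargetNamed: location.rawValue)
    }
}
```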