We need sample code showing how to create an ANN to recognize HAND digit positions and an API to facilitate creating and using such a recognizer.

I saw this title and was excited until I realized it referred to numerical digits:

Training a neural network to recognize digits https://developer.apple.com/documentation/accelerate/training_a_neural_network_to_recognize_digits

We need the equivalent to help us recognize hand positions (and an API, if it doesn't already exist, that exposes the entire array of joints to use as inputs for such a neural network recognizer).
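For concreteness, here is the kind of input such an API would need to provide. Vision's VNDetectHumanHandPoseRequest already reports 21 hand joints per hand; flattening those joints into a fixed-length vector is the obvious ANN input. This is a minimal Python sketch: the Joint type (a stand-in for Vision's VNRecognizedPoint) and the confidence threshold are illustrative assumptions, not Apple API.

```python
from dataclasses import dataclass

@dataclass
class Joint:
    """Stand-in for one detected hand joint (cf. Vision's
    VNRecognizedPoint): normalized 2-D position plus confidence."""
    x: float
    y: float
    confidence: float

def feature_vector(joints, minimum_confidence=0.3):
    """Flatten joints (e.g. the 21 hand landmarks Vision reports) into
    a fixed-length ANN input. Low-confidence joints are zeroed so the
    network can learn to ignore unreliable detections."""
    features = []
    for joint in joints:
        if joint.confidence >= minimum_confidence:
            features.extend([joint.x, joint.y, joint.confidence])
        else:
            features.extend([0.0, 0.0, 0.0])
    return features

# 21 joints -> 63 input features, one vector per video frame.
joints = [Joint(0.5, 0.5, 0.9) for _ in range(21)]
print(len(feature_vector(joints)))  # 63
```

One such vector per frame is all a static-pose recognizer needs; the temporal case below builds on it.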

The obvious use case is to recognize finger spelling. One issue is that some letters are spelt by a motion, not merely by a hand position (see 'J' and 'Z') —

— so ideally such a sample ANN recognizer should also handle temporal/animation issues.
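One plausible way to handle the temporal issue is to feed the recognizer a sliding window of per-frame feature vectors rather than a single pose, so the network sees the motion that traces 'J' or 'Z'. A sketch under that assumption (function name and padding scheme are mine, not any Apple API):

```python
def temporal_input(frames, window_size):
    """Stack the last `window_size` per-frame feature vectors into one
    flat input so the network sees motion, not just a static pose.
    Histories shorter than the window are front-padded with zeros."""
    if not frames:
        return []
    feature_count = len(frames[0])
    recent = frames[-window_size:]
    padding = [0.0] * ((window_size - len(recent)) * feature_count)
    return padding + [value for frame in recent for value in frame]

# 10 frames x 63 features -> a 630-value input for a motion recognizer.
frames = [[0.5] * 63 for _ in range(10)]
print(len(temporal_input(frames, window_size=10)))  # 630
```

A production design would more likely use a recurrent or temporal-convolution model than a flat window, but the window makes the input-shape problem concrete.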

That last point segues into the big brother of such a recognizer:

one that recognizes both ASL fingerspelling AND ASL gestures

Should Apple create a sample recognizer for both, then these could be used as templates for recognizing pretty much any conceivable set of finger, hand & arm positions/gestures. Training of such sets is also an issue. Apple should help developers devise shareable libraries of trained and trainable gesture-recognizer ANNs.
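To show the shape such a template recognizer would take: a single-layer softmax classifier mapping the 63 pose features above to 26 letter probabilities. This is only a sketch; the class name is hypothetical, the weights are untrained placeholders, and a real version would be trained with something like Create ML or Accelerate/BNNS.

```python
import math

class LetterClassifier:
    """Minimal single-layer softmax sketch: 63 pose features in,
    26 letter probabilities out. Weights here are placeholders;
    training is the part Apple's sample code would need to cover."""

    def __init__(self, input_count=63, class_count=26):
        self.weights = [[0.01] * input_count for _ in range(class_count)]
        self.biases = [0.0] * class_count

    def predict(self, features):
        scores = [
            bias + sum(w * f for w, f in zip(row, features))
            for row, bias in zip(self.weights, self.biases)
        ]
        peak = max(scores)  # subtract the max for numerical stability
        exps = [math.exp(s - peak) for s in scores]
        total = sum(exps)
        return [e / total for e in exps]

classifier = LetterClassifier()
probabilities = classifier.predict([0.5] * 63)
print(len(probabilities))  # 26
```

Swapping the 26-letter output layer for an arbitrary gesture vocabulary is exactly the "template" reuse argued for above.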

But from what I've seen, the current API for joints is NOT up to the task, or at least, it would be a major project just to create such a system, even though it has wide applications, both for ASL itself AND for the wider use case of generic position and gesture recognition.

It's already available on the web, so something far better than this should simply be built into the system:

https://coolmaterial.com/tech/fingerspelling-machine-learning-to-teach-abcs-american-sign-language/

[note that the website will not work in Safari]
