11 Replies
      Latest reply on Mar 14, 2019 11:44 PM by warkwark
      warkwark Level 1 (0 points)

        I am doing a feasibility study on building 3D vision.

        Question:

        1) How is the XYZ coordinate system defined on the video frame? (Is it a right-handed coordinate system? Where is the origin? Which direction does each of the X, Y, and Z axes point? Are the coordinates absolute or relative?)

        2) Which points define the Left Eye and the Right Eye? (For example, the center of the eyebrow or the center of the pupil?)

        And what are the variable names? (leftEyebrow? rightEyebrow? leftEye? rightEye?)

        3) Which point defines the nose (for example, the nose crest), and what is its variable name? (noseCrest? nose?)

        4) What is the purpose of the Rectangle? Why is the Rectangle needed?

        5) Could you please show me a concrete flow chart of ViewController.swift, so that I can understand it and tweak it to add more 3D functions?

        Sincerely

        • Re: About "Tracking the User's Face in Real Time" of Developer
          Claude31 Level 8 (5,735 points)

          You should move this thread to the ARKit section of the forum.

          Good luck.

          • Re: About "Tracking the User's Face in Real Time" of Developer
            gchiste Apple Staff (60 points)

            Hello,

             

            To clarify, is this a question about ARFaceAnchor, or is this about VNFaceObservation?

             

            If it is about VNFaceObservation as I suspect, then you should move this thread to the Vision section of the forum, where I will be happy to attempt to answer your questions!

              • Re: About "Tracking the User's Face in Real Time" of Developer
                warkwark Level 1 (0 points)

                Gchiste-san, thank you very much for giving me your precious time.

                Speaking directly, I think it would be much faster and more profitable for both Apple and me (a developer) to get the answers directly from the ARFaceAnchor team (the presenters of WWDC17 session 506).

                Can you arrange and coordinate that for us?

                  • Re: About "Tracking the User's Face in Real Time" of Developer
                    gchiste Apple Staff (60 points)

                    Hello, it is still unclear which API you want to use here. You mention ARFaceAnchor, but the WWDC video that you linked to is about VNFaceLandmarks.

                     

                    Please let me know which API you are using, and I will be happy to answer some of your questions.

                      • Re: About "Tracking the User's Face in Real Time" of Developer
                        warkwark Level 1 (0 points)

                        Gchiste-san, thank you very much.

                        I think the WWDC video "WWDC17 session 507" covers ARFaceLandmarks in any case.

                        Could you sincerely arrange and coordinate between Apple's experts and me so that we can make progress?

                        • Re: About "Tracking the User's Face in Real Time" of Developer
                          warkwark Level 1 (0 points)

                          Hello!

                          Are you very busy?

                          Could you share any progress inside Apple, if there is any?

                            • Re: About "Tracking the User's Face in Real Time" of Developer
                              gchiste Apple Staff (60 points)

                              It is still not clear to me whether you are asking about VNFaceLandmarks or ARFaceAnchor, but I am going to assume you are asking about ARFaceAnchor, since this post is in the ARKit section of the forum.

                               

                              In the order of your original questions:

                               

                              1.  As described in the figures on this page of the documentation, in a session running an ARFaceTrackingConfiguration, the origin is "centered behind the face".

                               

                              2.  Again, check the diagrams on this page of the documentation: leftEyeTransform (and the matching rightEyeTransform). A short sketch of reading these transforms follows at the end of this list.

                               

                              3.  ARFaceAnchors do not give you a point that directly corresponds to the nose.

                               

                              4.  Not sure what you are asking here.

                               

                              5.  This documentation talks about view controllers in a general sense: https://developer.apple.com/library/archive/featuredarticles/ViewControllerPGforiPhoneOS/
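
                              For reference, here is a minimal sketch of reading the eye transforms from an ARFaceAnchor in a face-tracking session. This is illustrative only: the FaceTracker class and its delegate wiring are scaffolding of my own, not part of any Apple sample.

                                  import ARKit

                                  class FaceTracker: NSObject, ARSessionDelegate {
                                      let session = ARSession()

                                      func start() {
                                          // Face tracking requires a device with a TrueDepth camera.
                                          guard ARFaceTrackingConfiguration.isSupported else { return }
                                          session.delegate = self
                                          // The face anchor's coordinate origin is centered behind the face.
                                          session.run(ARFaceTrackingConfiguration())
                                      }

                                      func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
                                          for case let faceAnchor as ARFaceAnchor in anchors {
                                              // leftEyeTransform / rightEyeTransform are 4x4 matrices relative
                                              // to the face anchor; column 3 holds the translation (position).
                                              let leftEye = faceAnchor.leftEyeTransform.columns.3
                                              let rightEye = faceAnchor.rightEyeTransform.columns.3
                                              print("Left eye:", leftEye, "Right eye:", rightEye)
                                              // There is no direct nose property; faceAnchor.geometry.vertices
                                              // could be searched for an approximation.
                                          }
                                      }
                                  }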

                                • Re: About "Tracking the User's Face in Real Time" of Developer
                                   warkwark Level 1 (0 points)

                                  Hello! Gchiste, thank you very much for the explanations.

                                  I am trying several tests of sample app "Tracking the User's Face in Real Time" with my iPhone.

                                   I added "print(results)" to ViewController.swift at line 545 for monitoring.

                                   

                                   (ViewController.swift, excerpt)

                                                   guard let landmarksRequest = request as? VNDetectFaceLandmarksRequest,
                                                       let results = landmarksRequest.results as? [VNFaceObservation] else {
                                                           return
                                                   }
                                                   print(results) // Added 2019.2.28 for monitoring (test).
                                                   // Perform all UI updates (drawing) on the main queue, not the background queue on which this handler is being called.
                                                   DispatchQueue.main.async {
                                                       self.drawFaceObservations(results)
                                                   }

                                   

                                   The real-time output in the debug area is the following.

                                                 [<VNFaceObservation: 0x131e11da0> 36E02117-F477-47AF-BD6A-FC94E2DEBC8D, revision 0, 1.000000 [0.204768 0.0890206 0.626646 0.352488] ID=0 landmarks 0.499072]

                                                 [<VNFaceObservation: 0x131d084f0> D0B5286F-7A77-45AE-94DA-1D8ABE0DFD94, revision 0, 1.000000 [0.212853 0.0844527 0.633526 0.356358] ID=0 landmarks 0.524259]

                                   

                                   Could you explain the meaning of each item? In particular, I am assuming that [0.204768 0.0890206 0.626646 0.352488] is the [x, y, width, height] of the rectangle. Is this correct? (I sketched my current reading below.)
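
                                   For context, here is a minimal sketch of how I am reading these fields, based on my understanding of the documented Vision API. The inspect function and the imageSize parameter are placeholders of my own, not from the sample app.

                                       import Vision

                                       // Inspect one VNFaceObservation from the results array above.
                                       func inspect(_ observation: VNFaceObservation, imageSize: CGSize) {
                                           // boundingBox is a normalized rectangle: [x, y, width, height]
                                           // in the range 0...1, with the origin at the image's lower-left.
                                           let box = observation.boundingBox
                                           print("normalized:", box.origin.x, box.origin.y, box.width, box.height)

                                           // Convert the normalized rectangle to pixel coordinates.
                                           let pixelRect = VNImageRectForNormalizedRect(box,
                                                                                        Int(imageSize.width),
                                                                                        Int(imageSize.height))
                                           print("pixels:", pixelRect)

                                           // In the log, 1.000000 appears to be observation.confidence, and
                                           // "landmarks 0.499072" the landmarks' own confidence value.
                                           print("confidence:", observation.confidence)
                                           if let landmarks = observation.landmarks {
                                               print("landmarks confidence:", landmarks.confidence)
                                           }
                                       }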

                                   

                                  Respectfully.