VNDetectTrajectoriesRequest no results (beta 6)

The video input is of a golf ball rolling across the camera's field of view (can't provide video links in this forum).

I have seen the similar post on this topic.

I've implemented it similarly to the implementation in the WWDC20 'Action and Vision' sample app.

Code Block swift
// Analyze every frame (.zero spacing); five points are needed to form a trajectory.
private lazy var detectTrajectoryRequest: VNDetectTrajectoriesRequest! =
    VNDetectTrajectoriesRequest(frameAnalysisSpacing: .zero, trajectoryLength: 5)


Code Block swift
do {
    try visionHandler.perform([self.detectTrajectoryRequest])
    if let results = self.detectTrajectoryRequest.results {
        DispatchQueue.main.async {
            /* do stuff here */
            self.processObservations(controller, results)
        }
    }
} catch {
    // log/handle the Vision error
}
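
For context, `visionHandler` is a fresh VNImageRequestHandler built for each frame, while the trajectory request itself is created once and reused (it is stateful). A rough sketch, with the capture-delegate plumbing omitted and the orientation assumed to be .up (the helper name is just for illustration):

Code Block swift
import AVFoundation
import Vision

/// Sketch: build a fresh handler per frame; the stateful trajectory request
/// is created once and reused across frames.
func makeVisionHandler(for sampleBuffer: CMSampleBuffer) -> VNImageRequestHandler {
    VNImageRequestHandler(cmSampleBuffer: sampleBuffer,
                          orientation: .up,
                          options: [:])
}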

I have not set a min or max object size (in order to get more results), and the region of interest is set to just inside the camera frame (slight padding).
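
For reference, the configuration is roughly this (a sketch; the padding value is illustrative):

Code Block swift
// Normalized coordinates, origin at the bottom-left; slight padding inside the frame.
detectTrajectoryRequest.regionOfInterest = CGRect(x: 0.02, y: 0.02,
                                                  width: 0.96, height: 0.96)
// The request's minimum/maximum object-size properties are left at their
// defaults so nothing gets filtered out by size.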

I am on Xcode 12.0 beta 6, using an iPhone XS on iOS 14 beta 6.

Is there something wrong with my implementation? What can I do to improve this?

thanks

Also, the 'Action and Vision' example app doesn't work for me, so I'm unable to compare, as there is no working example of VNDetectTrajectoriesRequest.

Accepted Answer
Regarding the Action and Vision example app: it will not work in the simulator. According to the README, it needs an A12 or later processor. (Maybe it works with the DTK? I don't know, as I don't have one.) So you will have to run it on your XS.

It also did not work prior to beta 5, but AFAIK it should work with beta 6 on your XS. The other problem I had was that the sample movie had too little time ahead of the first bean bag toss (at least on my XS Max), so scoring did not start until the second toss, and it had too little time after the last toss. To work around that, I edited the sample video to add more time at the start and at the end so it can score all 8 bag tosses and display the summary before the movie ends.
There is a mismatch in the keys defined by Apple
Code Block
VNRecognizedPointKey.handLandmarkKeyThumbTIP = "VNHLKTTIP"


while at runtime (printed from lldb) the actual coordinates are stored under these keys:

Code Block
[C.VNRecognizedPointKey(_rawValue: VNHLKTMP):  [0.770814; 0.408112],
 C.VNRecognizedPointKey(_rawValue: VNHLKTTip): [0.575923; 0.611163],
 C.VNRecognizedPointKey(_rawValue: VNHLKTCMC): [0.886506; 0.308010],
 C.VNRecognizedPointKey(_rawValue: VNHLKTIP):  [0.665807; 0.469130]]

As you can see, inside the Vision framework the raw value for the thumb tip is "VNHLKTTip", while the key exposed by the SDK is "VNHLKTTIP".
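A possible workaround until the constants line up is to match on raw values case-insensitively. A sketch, assuming you already have the [VNRecognizedPointKey: VNRecognizedPoint] dictionary from the hand pose observation (the helper name is illustrative):

Code Block swift
import Vision

/// Workaround sketch: look up a recognized point by key, falling back to a
/// case-insensitive match on the raw value so "VNHLKTTIP" still finds "VNHLKTTip".
func point(forKey key: VNRecognizedPointKey,
           in points: [VNRecognizedPointKey: VNRecognizedPoint]) -> VNRecognizedPoint? {
    if let exact = points[key] { return exact }
    return points.first {
        $0.key.rawValue.caseInsensitiveCompare(key.rawValue) == .orderedSame
    }?.value
}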
Marking @Doug_H's answer as correct as it was helpful.

Extra info for those that may also have this issue.

The Action and Vision example app worked with beta 6 after a fresh download (I don't know if they had updated it) and some code tweaks (namely fixing the epsilon error).

After getting the Action and Vision app to work, plotting its ROI shows that it is only a window, so setting the trajectory ROI to the entire screen in my app was likely causing issues.

Making my ROI a window similar to the one in the A&V app gets results.
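
Concretely, something along these lines (a sketch; the window values here are illustrative, not taken from the A&V app):

Code Block swift
// Constrain the trajectory search to a horizontal band rather than the whole
// frame. Normalized coordinates, origin at the bottom-left; values are illustrative.
detectTrajectoryRequest.regionOfInterest = CGRect(x: 0.1, y: 0.3,
                                                  width: 0.8, height: 0.4)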

If you are looking for the trajectory of something rolling/moving close to the ground, I found that having the camera set up higher than expected improves detection (chest to eye level seems best).

Understanding UIView coordinates vs. the camera/Vision coordinate space was also an issue for me.
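
The short version is that Vision points are normalized with the origin at the bottom-left, while UIKit's origin is at the top-left. A minimal conversion sketch, assuming the view shows the full, uncropped frame (otherwise the preview layer's scaling/cropping has to be accounted for):

Code Block swift
import UIKit

/// Sketch: convert a normalized Vision point (origin bottom-left) into a
/// UIView point (origin top-left). Assumes the view displays the full frame.
func viewPoint(from visionPoint: CGPoint, in view: UIView) -> CGPoint {
    CGPoint(x: visionPoint.x * view.bounds.width,
            y: (1 - visionPoint.y) * view.bounds.height)
}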