The video input is of a golf ball rolling through the camera's field of view (I can't provide video links in this forum).
I have seen a similar post.
I've implemented it similarly to the implementation in the WWDC20 'Action and Vision' app:
Code Block swift
private lazy var detectTrajectoryRequest: VNDetectTrajectoriesRequest! =
    VNDetectTrajectoriesRequest(frameAnalysisSpacing: .zero, trajectoryLength: 5)
Code Block swift
try visionHandler.perform([self.detectTrajectoryRequest])
if let results = self.detectTrajectoryRequest.results {
    DispatchQueue.main.async {
        /* do stuff here */
        self.processObservations(controller, results)
    }
}
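For context, here is a minimal sketch of what processObservations does with the results (simplified; CameraViewController is a stand-in for my actual controller type, and the observation properties are standard VNTrajectoryObservation API):

Code Block swift
func processObservations(_ controller: CameraViewController,
                         _ results: [VNTrajectoryObservation]) {
    for observation in results {
        // Points Vision actually detected, in normalized image coordinates.
        let detected = observation.detectedPoints
        // Points on the parabola Vision fitted to those detections.
        let projected = observation.projectedPoints
        print("confidence \(observation.confidence): \(detected.count) detected, \(projected.count) projected")
    }
}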
I have not set a minimum or maximum object size, in order to get more results; the region of interest is set to just inside the frame (slight padding).
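For reference, if I were to constrain it, my understanding from the docs is that the configuration would look roughly like this (the values are placeholders, and on the iOS 14 betas the radius properties may still be named minimumObjectSize/maximumObjectSize):

Code Block swift
// Bound the expected object radius as a fraction of the frame dimensions.
detectTrajectoryRequest.objectMinimumNormalizedRadius = 0.005
detectTrajectoryRequest.objectMaximumNormalizedRadius = 0.05
// Normalized coordinates with origin at the bottom-left; slight padding inside the frame.
detectTrajectoryRequest.regionOfInterest = CGRect(x: 0.02, y: 0.02, width: 0.96, height: 0.96)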
I am on Xcode 12.0 beta 6, using an iPhone XS on iOS 14 beta 6.
Is there something wrong with my implementation? What can I do to improve this?
Thanks.
Also, the 'Action and Vision' example app doesn't work for me, so I'm unable to compare, as there is no working example of VNDetectTrajectoriesRequest.