Hello,
I am reaching out for help integrating a CoreML action classifier into a SwiftUI app. Specifically, I am trying to run the classifier against the device's live camera feed. I have done some research, but unfortunately I have not been able to find any relevant information on this topic.
Could you point me to any examples, resources, or documentation that would help with this integration? Any guidance would be greatly appreciated.
Thank you in advance for your help and support.
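In case it helps others with the same question, here is a minimal sketch of one common pipeline: capture frames with AVFoundation, extract body poses per frame with Vision, accumulate a window of poses, and feed that window to the classifier. The model class name (`MyActionClassifier`) and the window size are placeholders; both must match your actual Create ML model.

```swift
import AVFoundation
import Vision
import CoreML

// Sketch, not a complete app: live camera frames -> Vision body pose ->
// windowed MLMultiArray -> action classifier prediction.
final class ActionClassifierController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private var poseWindow: [MLMultiArray] = []
    private let windowSize = 60   // frames per prediction; must match the model's training window

    func start() {
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let request = VNDetectHumanBodyPoseRequest()
        try? VNImageRequestHandler(cvPixelBuffer: pixelBuffer).perform([request])
        guard let observation = request.results?.first,
              let poses = try? observation.keypointsMultiArray() else { return }
        poseWindow.append(poses)
        if poseWindow.count == windowSize {
            // Stack the window into one multi-array and run the classifier.
            let input = MLMultiArray(concatenating: poseWindow, axis: 0, dataType: .float32)
            // `MyActionClassifier` is a placeholder for your generated model class:
            // let prediction = try? MyActionClassifier().prediction(poses: input)
            poseWindow.removeAll()
        }
    }
}
```

For SwiftUI, the capture preview layer would typically be wrapped in a `UIViewRepresentable`, and the predicted label published to the view via an `ObservableObject`.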
Hello, I am looking for a way to run an ARBodyTrackingConfiguration on a prerecorded video instead of the live back camera. Is there any way I can do this? Thank you
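As far as I know, ARBodyTrackingConfiguration only runs against the live rear camera. One workaround, if 2D body pose per frame is enough, is to decode the video with AVAssetReader and run Vision's body-pose request on each frame. This is a sketch under that assumption; it does not produce ARKit's full 3D skeleton, and the file URL is a placeholder.

```swift
import AVFoundation
import Vision

// Sketch: read frames from a video file and run Vision body-pose detection
// on each one, as a stand-in for ARKit body tracking on recorded footage.
func detectPoses(in videoURL: URL) throws {
    let asset = AVAsset(url: videoURL)
    guard let track = asset.tracks(withMediaType: .video).first else { return }
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(output)
    reader.startReading()
    while let sample = output.copyNextSampleBuffer() {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sample) else { continue }
        let request = VNDetectHumanBodyPoseRequest()
        try VNImageRequestHandler(cvPixelBuffer: pixelBuffer).perform([request])
        if let pose = request.results?.first {
            // Named joints with normalized image coordinates, one set per frame.
            let joints = try pose.recognizedPoints(.all)
            print("joints detected in frame:", joints.count)
        }
    }
}
```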
Hello, I have a USDZ file placed in a SceneView, and I am searching for a way to interact with parts of that USDZ inside my app.
For example, I have a person made up of 6 parts (arm_right, arm_left, leg_left, leg_right, body and head) and I would like to select arm_left and see its width.
How can I achieve this inside my app?
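Assuming the USDZ is loaded into SceneKit, named parts like `arm_left` usually appear as child nodes in the scene graph, so one approach is to look the node up by name and read its bounding box. A sketch, where the node and file names are placeholders that must match your USDZ hierarchy:

```swift
import SceneKit

// Sketch: find a named part of a loaded USDZ and measure its width
// from the node's local bounding box, adjusted by the node's scale.
func widthOfPart(named name: String, in scene: SCNScene) -> Float? {
    guard let node = scene.rootNode.childNode(withName: name, recursively: true) else {
        return nil
    }
    let (min, max) = node.boundingBox   // local coordinates, in scene units
    return (max.x - min.x) * node.simdScale.x
}

// Usage (file and node names are placeholders):
// if let scene = SCNScene(named: "person.usdz"),
//    let width = widthOfPart(named: "arm_left", in: scene) {
//     print("arm_left width:", width)
// }
```

For tap selection, `SCNView`'s `hitTest(_:options:)` returns `SCNHitTestResult`s whose `.node.name` can be matched against the part names before measuring.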
Hello everyone,
I have a button inside a stack view. When the button is tapped it animates, but the animation is cut off as soon as it moves outside the stack view's bounds.
Do you have any suggestions for letting the animation extend outside the view? I am also not able to get rid of the stack view.
Thank you!!
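One common cause is that an ancestor view clips its subviews, so the animated button is cut off at that ancestor's bounds. A sketch of one workaround, assuming no ancestor actually needs clipping (if one does, animating a snapshot of the button in a higher view is the usual alternative):

```swift
import UIKit

// Sketch: disable clipping on the button's ancestors so the tap animation
// can draw outside the stack view's frame.
@objc func buttonTapped(_ button: UIButton) {
    var ancestor: UIView? = button.superview
    while let view = ancestor {
        view.clipsToBounds = false   // assumes nothing relies on this clipping
        ancestor = view.superview
    }
    UIView.animate(withDuration: 0.3) {
        button.transform = CGAffineTransform(scaleX: 1.5, y: 1.5)
    }
}
```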