I have an iOS ARKit app I'm trying to convert to visionOS, but I'm not having any luck detecting gestures on my Reality Composer model entities the way I could with ARView. All the documentation I could find on gestures in visionOS seems focused on RealityView.
How do I get gestures to work on model entities in visionOS?
Did you check out the Happy Beam sample project? You can pretty much use that as a reference for what you want to do.
That builds an involved custom heart gesture with hand skeletons, etc. What about something simple like a long press or tap?
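For a plain tap or long press you don't need any of the hand-tracking machinery from Happy Beam. The pattern is: give the entity an InputTargetComponent and collision shapes, then attach a targeted gesture to the RealityView. Here's a minimal sketch of how I'd try it — the entity name "MyModel" and the RealityKitContent bundle are just placeholders for whatever your Reality Composer Pro package exports:

```swift
import SwiftUI
import RealityKit
import RealityKitContent  // assumed name of your Reality Composer Pro package

struct ModelGestureView: View {
    var body: some View {
        RealityView { content in
            // "MyModel" is a placeholder entity name; use your own.
            if let model = try? await Entity(named: "MyModel", in: realityKitContentBundle) {
                // Entities only receive SwiftUI gestures if they are input targets
                // and have collision shapes to hit-test against.
                model.components.set(InputTargetComponent())
                model.generateCollisionShapes(recursive: true)
                content.add(model)
            }
        }
        // Simple tap, targeted to whichever entity was hit.
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    print("Tapped \(value.entity.name)")
                }
        )
        // Long press works the same way.
        .gesture(
            LongPressGesture(minimumDuration: 0.5)
                .targetedToAnyEntity()
                .onEnded { value in
                    print("Long-pressed \(value.entity.name)")
                }
        )
    }
}
```

If the gesture still doesn't fire, check that the collision shape actually covers the visible geometry — hit-testing is done against the collision shapes, not the rendered mesh.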