HandTracking

Hi guys, I'm currently developing a game for the Vision Pro and I'm trying to figure out how hand tracking works so I can make a superpower appear when the user looks at their hand and opens it. But I'm really struggling to wrap my head around the whole concept and how to implement it in my code.

Is there anything out there (other than the Apple docs), or anyone who could help, to shed some light on the whole idea and how I could actually implement it?

Any help would be much appreciated. Thanks!

Hi @HKone,

Do you have a specific idea of what effect you want to make? I'll attempt to walk you through how I'd actually create what you are describing (a superpower effect in the user's hand that they can control), but if there's something specific you have in mind please let me know and I can help you with that.

I'll link to a lot of existing Apple documentation below, but if there's any form of tutorial or help you'd like to receive from Apple regarding how to develop for the Vision Pro, please file a request using Feedback Assistant.

In general, hand tracking on visionOS is accomplished with ARKit. At a very high level, to create a superpower effect that appears in the user's hand, first you need to begin a SpatialTrackingSession, which requires permission from the user before you can receive transform data about their hands. Next, I recommend creating an AnchorEntity tracked to your user's palm. For the superpower visuals, you can create a ShaderGraphMaterial in Reality Composer Pro (RCP); check out this forum answer for a walkthrough on how to create an effect similar to one shown at WWDC. Then, apply that material to a ModelEntity parented to the AnchorEntity you created.
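To make those steps concrete, here's a minimal sketch of that setup, assuming the default visionOS app template (which includes a RealityKitContent package and its realityKitContentBundle). The names "PowerMaterial" and "Materials.usda" are placeholders for a material and scene file you'd author in RCP, and the sphere is just a stand-in for your effect mesh:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct SuperpowerView: View {
    // Hand tracking requires a running SpatialTrackingSession;
    // the user is prompted for permission the first time it runs.
    @State private var session = SpatialTrackingSession()

    var body: some View {
        RealityView { content in
            // Request hand tracking so palm-anchored entities receive transforms.
            let configuration = SpatialTrackingSession.Configuration(tracking: [.hand])
            _ = await session.run(configuration)

            // Anchor an entity to the left palm; RealityKit keeps it
            // attached to the hand automatically.
            let palmAnchor = AnchorEntity(.hand(.left, location: .palm))

            // Placeholder names: a ShaderGraphMaterial authored in
            // Reality Composer Pro, loaded from the RCP package bundle.
            if let material = try? await ShaderGraphMaterial(
                named: "/Root/PowerMaterial",
                from: "Materials.usda",
                in: realityKitContentBundle
            ) {
                let orb = ModelEntity(
                    mesh: .generateSphere(radius: 0.05),
                    materials: [material]
                )
                palmAnchor.addChild(orb)
            }
            content.add(palmAnchor)
        }
    }
}
```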

To enable/disable the effect when the user opens their hand, you will need to track the user's hand gestures. The exact gesture will be specific to your app, but I recommend looking at the source code for the Happy Beam sample, since it contains an implementation of custom hand gestures. You mentioned you want the user to look at their hand to activate the superpower, but you might not be able to get information about the user's gaze in the way that you expect. You can get the selected entity and position data when a user performs a SpatialTapGesture, but per-frame gaze ray info is not provided for privacy reasons. Tracking the orientation of the device in relation to the user's palm might be a sufficient workaround in this case.
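To make the gesture part concrete, here's a rough sketch (not Happy Beam's actual code) of how you might detect an "open hand" with ARKit's HandTrackingProvider. The openness heuristic and the 10 cm threshold are my own illustrative choices, so tune the joints and threshold for your gesture; also note that hand tracking requires an NSHandsTrackingUsageDescription entry in your Info.plist:

```swift
import ARKit
import simd

/// Watches the left hand and reports whether it currently looks "open".
/// The heuristic is illustrative, not Apple's algorithm: the hand counts
/// as open when every fingertip is far enough from the wrist.
@MainActor
final class OpenHandDetector {
    private let arSession = ARKitSession()
    private let handTracking = HandTrackingProvider()

    private(set) var isHandOpen = false

    func start() async throws {
        try await arSession.run([handTracking])

        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            guard anchor.chirality == .left,
                  anchor.isTracked,
                  let skeleton = anchor.handSkeleton else { continue }

            let wrist = position(of: skeleton.joint(.wrist))
            let fingertips: [HandSkeleton.JointName] = [
                .thumbTip, .indexFingerTip, .middleFingerTip,
                .ringFingerTip, .littleFingerTip
            ]

            // Arbitrary threshold: every fingertip at least 10 cm from the wrist.
            isHandOpen = fingertips.allSatisfy { name in
                distance(position(of: skeleton.joint(name)), wrist) > 0.1
            }
        }
    }

    /// Extracts a joint's translation (relative to the hand anchor)
    /// from its 4x4 transform.
    private func position(of joint: HandSkeleton.Joint) -> SIMD3<Float> {
        let column = joint.anchorFromJointTransform.columns.3
        return SIMD3(column.x, column.y, column.z)
    }
}
```

Whenever isHandOpen flips, you could toggle the palm-anchored effect entity (for example, via its isEnabled property). For the gaze workaround mentioned above, you could additionally run a WorldTrackingProvider in the same ARKitSession and use queryDeviceAnchor(atTimestamp:) to compare the device's position against the palm's facing direction.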

Let me know if that helps! That Happy Beam sample might be close to what you're trying to create: it's a game where the user shoots a beam from their hands when they perform a certain gesture, but you'll probably want to modify the type of gesture and effect for your app.
