Request for gaze data in fully immersive Metal apps

Hi,

We are porting our Unity app from other XR devices to Vision Pro, so it's much easier for us to use the fully immersive Metal rendering path. And to stay true to the platform, we want to keep the gaze/pinch interaction system.

But we just noticed that, unlike PolySpatial XR apps, visionOS XR with Metal does not provide gaze info unless the user is actively pinching, which rules out any visual feedback on what they are looking at (buttons, etc.).

Is this on Apple's roadmap?

Thanks

Hi, you are right. Fortunately, visionOS offers alternatives: device (head) tracking and hand tracking. Since other people can already see where your head and hands are pointing, exposing that data raises no privacy concerns. One tip: aiming at the exact center of where the device points is uncomfortable for the user, so it works better to angle the ray 10-15 degrees downward from the device's forward direction.
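As an illustration of that approach, here is a minimal sketch (not from this thread) that builds a head-based pointing ray from the device pose via ARKit's WorldTrackingProvider and tilts it downward; the Ray struct and makeHeadGazeRay function are illustrative names, not Apple API:

```swift
import ARKit
import QuartzCore
import simd

// Sketch: a head-based pointing ray derived from the device pose,
// tilted 10-15 degrees downward as suggested above.
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

func startDeviceTracking() async throws {
    try await session.run([worldTracking])
}

struct Ray {
    var origin: SIMD3<Float>
    var direction: SIMD3<Float>
}

func makeHeadGazeRay(downwardTiltDegrees: Float = 12) -> Ray? {
    guard let device = worldTracking.queryDeviceAnchor(
        atTimestamp: CACurrentMediaTime()) else { return nil }

    let m = device.originFromAnchorTransform
    let origin = SIMD3<Float>(m.columns.3.x, m.columns.3.y, m.columns.3.z)
    // The device looks down its local -Z axis; +X is its right axis.
    let forward = -SIMD3<Float>(m.columns.2.x, m.columns.2.y, m.columns.2.z)
    let right = SIMD3<Float>(m.columns.0.x, m.columns.0.y, m.columns.0.z)
    // Rotate the forward vector downward around the right axis.
    let tilt = simd_quatf(angle: -downwardTiltDegrees * .pi / 180, axis: right)
    return Ray(origin: origin, direction: simd_normalize(tilt.act(forward)))
}
```

You would hit-test that ray against your UI each frame to drive hover feedback, and combine it with hand tracking for the actual selection.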

Hello @Sashell:

For Metal & Compositor Services, take a look at onSpatialEvent for access to rudimentary event types, enumerated here. An example of its usage can be found in Drawing fully immersive content using Metal.
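As a rough sketch of wiring that up (adapted from the general pattern, not copied from the sample), you could set a handler on the LayerRenderer you receive from the CompositorLayer closure; the phase and selectionRay fields used below are documented SpatialEventCollection event properties, but treat the details as assumptions:

```swift
import SwiftUI             // SpatialEventCollection
import CompositorServices  // LayerRenderer

// Sketch: register for spatial events on the LayerRenderer handed to
// you by the CompositorLayer closure in an ImmersiveSpace scene.
// Events only arrive while the user is actively pinching; at that
// moment the selection ray points at whatever they are looking at.
func configureSpatialInput(for layerRenderer: LayerRenderer) {
    layerRenderer.onSpatialEvent = { events in
        for event in events {
            switch event.phase {
            case .active:
                if let ray = event.selectionRay {
                    // Hit-test the ray against your scene here to
                    // highlight the targeted element while the pinch
                    // is held.
                    print("pinch ray:", ray)
                }
            case .ended, .cancelled:
                // Clear any highlight / commit the selection.
                break
            @unknown default:
                break
            }
        }
    }
}
```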

If you have something else in mind, please let us know — either here in this thread, or by filing an enhancement request using Feedback Assistant.

Thanks, Steve
