@Graphics and Games Engineer Thanks for that. The template, however, uses exclusively Swift code, not the C API shown in the talk. Could we also have a C/Objective-C template that uses only a minimal amount of SwiftUI? The function names in the Swift API example seem very different too, so the correspondence to the C API isn't obvious.
@GeoffAtApple Might it eventually be re-enabled?
Also, this isn't just for one texture; it could be for hundreds. So it's not meant to be a single drawable thing.
@Graphics and Games Engineer Respectfully, I knew this would be the first suggestion, and it won't work. My MTLTexture is generated by a separate rendering subsystem, not something I am requesting. I do not want a new drawable unless there is a zero-cost way to substitute that drawable with another MTLTexture. Can you advise further?
(I reply so quickly because I have notifications. :) ) Also, if I am misunderstanding, that's partly because I couldn't find examples of using DrawableQueue.
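For anyone else landing here: below is roughly what I pieced together from the RealityKit headers for `TextureResource.DrawableQueue`. It's an untested sketch, and the sizes, pixel format, and function names other than the RealityKit/Metal API calls are my own placeholders; the idea is to blit the externally rendered MTLTexture into each drawable rather than re-rendering.

```swift
import Metal
import RealityKit

// One-time setup: attach a DrawableQueue to an existing TextureResource
// (e.g. one already bound to a material). Dimensions/format are placeholders.
func attachQueue(to textureResource: TextureResource) throws -> TextureResource.DrawableQueue {
    let descriptor = TextureResource.DrawableQueue.Descriptor(
        pixelFormat: .bgra8Unorm,
        width: 1024,
        height: 1024,
        usage: [.shaderRead, .renderTarget],
        mipmapsMode: .none)
    let queue = try TextureResource.DrawableQueue(descriptor)
    textureResource.replace(withDrawables: queue)
    return queue
}

// Per frame: copy the externally produced MTLTexture into the next drawable.
// Assumes `source` matches the descriptor's size and pixel format.
func present(_ source: MTLTexture,
             via queue: TextureResource.DrawableQueue,
             using commandQueue: MTLCommandQueue) {
    guard let drawable = try? queue.nextDrawable(),
          let commandBuffer = commandQueue.makeCommandBuffer(),
          let blit = commandBuffer.makeBlitCommandEncoder() else { return }
    blit.copy(from: source, to: drawable.texture) // whole-texture blit
    blit.endEncoding()
    commandBuffer.commit()
    drawable.present()
}
```

The blit isn't free, but it avoids rerouting the separate rendering subsystem to target RealityKit's drawables directly. If someone from Apple can confirm whether `nextDrawable()` can ever hand back a texture you can render into directly with zero copies, that would answer the original question.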
Thank you for clarifying!
I think that’s why Apple partnered with Unity.
@Developer Tools Engineer The terminology for the different spaces is still a little unclear. Is full-hand tracking available with passthrough on when it’s just a single app, too? Not just VR.
I’m not sure you’re the person to answer this if you do not know the difference between the various immersive modes. No, passthrough (video of the real world) is not always on. No, not every mode is the same; each has different limits. @mike_at_helium I am hoping someone from the team can speak to the tracking limits in the mode that doesn’t have the ~1.5m limit.
@eskimo Why not push the over-the-limit string to stdio somehow so you don’t have to choose?
@renancoelho It seems to work here: https://ebidel.github.io/demos/offscreencanvas.html
@mike_at_helium The title and first post refer to passthrough mode.
It doesn’t stop people from pressing the issue for future versions of the OS. It’s worth discussing. Balancing privacy and functionality is an ongoing engineering and research problem, and it makes sense for v1 of the OS to be the most restricted.
I would argue that it’s what you do with the data that counts. On the one hand, one can create an advertiser-controlled nightmare. On the other, tons of research labs and medical practitioners are adopting XR technologies and eye tracking to help people. The core problem is making sure the data stay in the person’s control and can’t easily be abused for surveillance, even if the system is compromised. Maybe limited network access and sandboxing?
@Graphics and Games Engineer I’m seeing (pun intended) a lot of medical and research labs needing accurate eye tracking data to benefit researchers, medical practitioners, and patients, e.g. AR-supported surgery and patient analytics. That said, always-on unlimited eye tracking is scary. One thought might be to add a plist entry for eye tracking AND an additional rigorous review for App Store submissions. The key is keeping the user in control. I hope Apple is thinking about these things.
@Claude31 I am not sure what you mean. Apple has said clearly that they're not exposing eye tracking information. (It's a bit of a let-down precisely because it makes @aberens 's use case impossible.) @aberens I recommend filing a feature request in the feedback system with a clearly worded description of your use case and its overall benefits. If Apple ever plans to open up access to the eye tracking data, they're going to want to see strong feedback and do the work to ensure user consent.